Talk:Sum of normally distributed random variables


Hi! Recently, articles on Wikipedia have become good enough to base a literature review on. However, very often (as on this page) "real" literature references are missing. I assume this proof was done by someone other than the author. I would like to see references. Kind regards, Steven

In this case, the problem is which of the many references to use? One could just say "See any standard textbook on the subject", and it would be essentially correct, but probably doesn't count as a "reference". Michael Hardy 18:39, 3 August 2006 (UTC)

Product?

What about the product of normally distributed random variables? I found a document that discusses it, which says that if   then

 .

But I'm having trouble finding the mean and variance of this distribution. (I suppose it might not even be normally distributed.) —Ben FrantzDale 04:57, 31 January 2007 (UTC)

This may have the answer: http://mathworld.wolfram.com/NormalProductDistribution.html —Ben FrantzDale 05:27, 31 January 2007 (UTC)


The case where   is derived in section 1.4 of Glen Cowan's book "Statistical Data Analysis" (Oxford Science Publications, p. 15) as an example of transformations of random variables. The integral transformation is a Mellin convolution. — Preceding unsigned comment added by 136.152.142.21 (talk) 18:35, 24 March 2015 (UTC)
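For what it's worth, the product of two independent normals is generally not normal, but its mean and variance have closed forms: for independent X ~ N(μx, σx²) and Y ~ N(μy, σy²), E[XY] = μxμy and Var(XY) = μx²σy² + μy²σx² + σx²σy². A quick simulation sketch (with illustrative parameter values of my own choosing, not from the sources above) checks both:

```python
import random
import statistics

random.seed(0)

# Illustrative parameters (assumptions for this sketch)
mu_x, sigma_x = 1.0, 2.0
mu_y, sigma_y = 3.0, 0.5

n = 200_000
products = [random.gauss(mu_x, sigma_x) * random.gauss(mu_y, sigma_y)
            for _ in range(n)]

sample_mean = statistics.fmean(products)
sample_var = statistics.variance(products)

# Closed-form moments for the product of independent normals:
# E[XY] = mu_x * mu_y, Var(XY) = mu_x^2 sig_y^2 + mu_y^2 sig_x^2 + sig_x^2 sig_y^2
exact_mean = mu_x * mu_y                                    # 3.0
exact_var = (mu_x**2 * sigma_y**2 + mu_y**2 * sigma_x**2
             + sigma_x**2 * sigma_y**2)                     # 37.25

print(sample_mean, exact_mean)
print(sample_var, exact_var)
```

A histogram of `products` also makes the non-normality visible: the distribution is skewed whenever both means are nonzero.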

Case where the variables are correlated

For the case where the variables are correlated, I have given an outline of how to proceed with the derivation. Velocidex (talk) 02:03, 1 July 2008 (UTC)


You should also provide the covariance matrix, because the correlation coefficients are not clear. How do you get the term 2ρσxσy? You should get 2ρ, unless the cross-correlation is ρσxσy. Energon (talk) 13:27, 16 June 2009 (UTC)

Also the article says 'whenever ρ < 1, then the standard deviation is less than the sum of the standard deviations of X and Y' - but the formula implies it should be greater (unless ρ < -1). Something is wrong here. Ben Finn (talk) 20:09, 19 August 2011 (UTC)
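The article's statement does check out numerically: since Var(X+Y) = σx² + σy² + 2ρσxσy and (σx + σy)² = σx² + σy² + 2σxσy, the standard deviation of the sum is strictly less than σx + σy exactly when ρ < 1. A simulation sketch (with illustrative values of my own choosing) confirms it:

```python
import math
import random
import statistics

random.seed(1)

sigma_x, sigma_y, rho = 2.0, 3.0, 0.5   # illustrative values (assumptions)

# Build correlated zero-mean normals from two independent standard normals:
# Y = sigma_y * (rho*Z1 + sqrt(1 - rho^2)*Z2) has Corr(X, Y) = rho.
sums = []
for _ in range(200_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = sigma_x * z1
    y = sigma_y * (rho * z1 + math.sqrt(1 - rho**2) * z2)
    sums.append(x + y)

sd_sum = statistics.stdev(sums)
exact_sd = math.sqrt(sigma_x**2 + sigma_y**2 + 2 * rho * sigma_x * sigma_y)

# exact_sd = sqrt(19) ~ 4.36, strictly below sigma_x + sigma_y = 5 since rho < 1
print(sd_sum, exact_sd, sigma_x + sigma_y)
```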

Integrating the Dirac delta function

In the section Proof using convolutions, we might want to include a note that the Dirac function is constrained to satisfy the identity

∫_{−∞}^{∞} δ(x) dx = 1

and can thus be dropped. 192.91.171.42 (talk) 21:00, 14 April 2009 (UTC)
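Once the delta has been integrated out, what remains is a plain one-dimensional convolution integral, which is easy to check numerically. A sketch (with arbitrary parameters of my own choosing): the convolution of the N(0,1) and N(1,4) densities should match the N(1,5) density, since means and variances add:

```python
import math

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu)**2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Numerically convolve the N(0,1) and N(1,4) densities and compare
# against the closed-form N(1,5) density at a few test points.
h = 0.01
grid = [-10.0 + i * h for i in range(2001)]   # covers the effective support

def convolved(z):
    # (f * g)(z) = integral of f(x) * g(z - x) dx, as a Riemann sum
    return h * sum(normal_pdf(x, 0.0, 1.0) * normal_pdf(z - x, 1.0, 4.0)
                   for x in grid)

errors = [abs(convolved(z) - normal_pdf(z, 1.0, 5.0)) for z in (0.0, 1.0, 2.5)]
print(max(errors))
```

The agreement is essentially to machine precision because the trapezoid-like sum converges extremely fast on smooth, rapidly decaying integrands like Gaussians.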

Geometric proof

Pretty sure there's an error here.   should read   and   should read  . —Preceding unsigned comment added by 128.237.245.76 (talk) 13:15, 4 October 2010 (UTC)

Variance of mean

This seems like a good page to discuss the variance of the mean. In particular, the variance of the sum is the sum of the variances, and so the variance of the mean is the sum of the variances divided by n². Equivalently, the standard deviation of the mean is the standard deviation of the sum divided by n. —Ben FrantzDale (talk) 17:56, 13 June 2011 (UTC)

I don't think it belongs here -- I'll put it into Sample mean and sample covariance. I'm surprised it's not already there. Duoduoduo (talk) 18:58, 13 June 2011 (UTC)
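Wherever it ends up, the relationship is easy to sanity-check by simulation (a sketch with arbitrary parameters of my own choosing): for the mean of n i.i.d. draws with standard deviation σ, the standard deviation of the sample mean is σ/√n:

```python
import math
import random
import statistics

random.seed(2)

sigma, n_per_sample, reps = 2.0, 25, 20_000   # illustrative values

# Draw many independent samples of size n and record each sample mean
means = [statistics.fmean(random.gauss(0.0, sigma) for _ in range(n_per_sample))
         for _ in range(reps)]

sd_of_mean = statistics.stdev(means)
print(sd_of_mean, sigma / math.sqrt(n_per_sample))   # both near 0.4
```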

Correlated random variables

The text says that when rho < 1, the standard deviation is less than the sum of the standard deviations of X and Y. This should instead read rho < 0. — Preceding unsigned comment added by 210.11.58.37 (talk) 05:47, 14 November 2011 (UTC)

Agreed. Also, the statement "This is perhaps the simplest demonstration of the principle of diversification" is misleading, since diversification comes not from the 2ρs1s2 term but from the concept of sample means, where the variance of the sample mean is lowered as long as rho < 1. Larger samples imply lower variance (by 1/n for rho = 0). Zetapack (talk) 20:23, 6 February 2015 (UTC)

No necessity for proofs here

The proofs of the statements in the article should not be included in the article itself; rather, they should be in the sources used to verify its validity. At least they should be folded (so that a click is needed to unfold them). — Preceding unsigned comment added by 130.75.31.228 (talk) 17:51, 24 January 2020 (UTC)

possible to use std dev?

I find that the use of variance papers over what is actually going on...

 
 
 
 

This kind of logic makes the interaction clearer (and ties in with the correlation example at the bottom of the page). — Preceding unsigned comment added by 2A02:1206:45A8:590:1432:5643:CEA0:DB98 (talk) 18:39, 12 March 2021 (UTC)
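One way to phrase the independent case directly in terms of standard deviations is addition in quadrature: σ_{X+Y} = √(σ_X² + σ_Y²), which generalizes to √(σ_X² + σ_Y² + 2ρσ_Xσ_Y) when the variables are correlated. A quick check of the independent case (my own sketch with illustrative values, not necessarily the formulation this comment had in mind):

```python
import random
import statistics

random.seed(3)

sigma_x, sigma_y = 3.0, 4.0   # illustrative values (assumptions)

sums = [random.gauss(0, sigma_x) + random.gauss(0, sigma_y)
        for _ in range(200_000)]

# Independent normals add "in quadrature": the sd of the sum is
# sqrt(sigma_x**2 + sigma_y**2) = sqrt(9 + 16) = 5, not 3 + 4 = 7.
sd_q = statistics.stdev(sums)
print(sd_q)
```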

Misleading explanation

The article currently says "Let X and Y be independent random variables that are normally distributed (...) then their sum is also normally distributed (...) the assumption that X and Y are independent cannot be dropped (...) The result about the mean holds in all cases, while the result for the variance (...)." This is badly explained. In particular, X and -X (which are not independent) have a sum which, though it has mean 0 (the sum of the means), is not normally distributed. The sum of dependent random variables may fail to be normally distributed, while the text could lead one to suppose that it is, and with mean equal to the sum of the means. — Preceding unsigned comment added by Alexor65 (talkcontribs) 22:18, 11 December 2021 (UTC)
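The counterexample above can be made concrete with a short sketch: take Y = -X, so both X and Y are N(0,1) and the means add, but X + Y is identically zero, a point mass rather than a normal distribution:

```python
import random

random.seed(4)

xs = [random.gauss(0, 1) for _ in range(10_000)]
ys = [-x for x in xs]          # Y = -X: normal, but perfectly dependent on X

sums = [x + y for x, y in zip(xs, ys)]

# The mean of the sum is 0 = 0 + 0, as the article promises, but the sum
# has no spread at all: every sample is exactly 0, so it is not normal.
print(min(sums), max(sums))   # 0.0 0.0
```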