Asked • 05/29/19

sum of squares of dependent gaussian random variables?

OK, so the chi-squared distribution with n degrees of freedom is the sum of the squares of n independent Gaussian random variables. The trouble is, my Gaussian random variables are not independent. They do, however, all have zero mean and the same variance. Suppose I have their covariance matrix. It isn't diagonal, because the variables aren't independent, but all of its diagonal elements are equal to each other because they share the same variance; in fact, it is a symmetric Toeplitz matrix (I'm not claiming that's important to the solution, but if that property is needed to get anywhere, by all means use it). Is there some way to decompose the sum of squares of these Gaussian random variables into, say, a sum of chi-squared random variables and possibly Gaussian random variables? In other words, I can't just square them all, add them together, and call the result chi-squared, because a chi-squared distribution is a sum of squares of *independent* Gaussians, and mine aren't independent. I do know how to find a linear transformation of my Gaussians that yields n independent Gaussians, but that alone is no help, because those transformed variables aren't the ones being squared, you see.
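One standard way to attack this (a sketch, not the original asker's or answerer's method): if x ~ N(0, Σ) and Σ = Q diag(λ) Qᵀ is the eigendecomposition, then writing x = Q diag(√λ) z with z ~ N(0, I) gives xᵀx = Σᵢ λᵢ zᵢ², i.e. a weighted sum of independent chi-squared(1) variables (a plain chi-squared only when all eigenvalues are equal). The small NumPy check below uses a hypothetical 4×4 symmetric Toeplitz covariance chosen for illustration:

```python
import numpy as np

# Hypothetical example covariance: symmetric Toeplitz with a constant
# (unit) diagonal, as described in the question.
c = np.array([1.0, 0.5, 0.25, 0.125])
idx = np.abs(np.subtract.outer(np.arange(4), np.arange(4)))
Sigma = c[idx]  # Sigma[i, j] = c[|i - j|]

# Eigendecomposition: Sigma = Q @ diag(lam) @ Q.T
lam, Q = np.linalg.eigh(Sigma)

# Draw the correlated vector x by transforming independent standard
# normals z; then x ~ N(0, Sigma).
rng = np.random.default_rng(0)
z = rng.standard_normal(4)
x = Q @ (np.sqrt(lam) * z)

# The quadratic form x.x equals the eigenvalue-weighted sum of the
# squares of the INDEPENDENT z's -- a weighted sum of chi-squared(1)s.
print(np.allclose(x @ x, np.sum(lam * z**2)))  # True
```

The identity holds deterministically (xᵀx = zᵀ diag(λ) z), so the check passes for any draw of z, assuming Σ is positive semidefinite.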

1 Expert Answer
