Michael H. answered 06/04/19
Ph.D. in statistics and teaching experience at several universities
A real symmetric matrix can be diagonalized by an orthogonal matrix. Suppose G is an orthogonal matrix, i.e. a square matrix satisfying GᵀG = I = the identity matrix.
Suppose var(X) = S is the n × n matrix of covariances between the entries of the n × 1 column vector X.
Then V = var(GX) = G S Gᵀ = the covariance matrix of the column vector GX.
See if you can work out for which orthogonal matrix G this matrix V is a diagonal matrix. (Hint: the spectral theorem says to take the rows of G to be the eigenvectors of S.) If X is multivariate normal with mean 0, the entries of GX will then be independent normally distributed random variables with expectation 0.
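If you want to check your answer numerically, here is a minimal numpy/scipy sketch; the 3 × 3 Toeplitz matrix S below is a made-up example, not necessarily the one from your problem:

```python
import numpy as np
from scipy.linalg import toeplitz

# A small symmetric Toeplitz covariance matrix, chosen only for illustration.
S = toeplitz([1.0, 0.5, 0.25])

# eigh returns orthonormal eigenvectors (as columns) for a symmetric matrix.
eigvals, Q = np.linalg.eigh(S)
G = Q.T  # take the rows of G to be the eigenvectors of S

print(np.allclose(G.T @ G, np.eye(3)))   # GᵀG = I: G is orthogonal
V = G @ S @ G.T                          # covariance matrix of GX
print(np.allclose(V, np.diag(eigvals)))  # V is diagonal; its entries are the eigenvalues of S
```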
Note that the sum of squares of the entries of GX is the same as the sum of squares of the entries of X: (GX)ᵀ(GX) = XᵀGᵀGX = XᵀX, which follows from the equality GᵀG = I.
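A quick numerical sanity check of that invariance (the random orthogonal G here is just an arbitrary example, built from a QR decomposition):

```python
import numpy as np

rng = np.random.default_rng(0)
# Any orthogonal G works; here, the Q factor of a QR decomposition of a random matrix.
G, _ = np.linalg.qr(rng.standard_normal((3, 3)))
X = rng.standard_normal(3)

# (GX)ᵀ(GX) = Xᵀ GᵀG X = XᵀX, so the two sums of squares agree.
print(np.allclose(np.sum((G @ X) ** 2), np.sum(X ** 2)))  # True
```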
This reduces the problem to one involving independent normals with expectation 0. Whether they have equal variances depends on the eigenvalues of the Toeplitz matrix you started with (those eigenvalues are the diagonal entries of V). If they are all equal, then the sum of squares is that common variance times a chi-square-distributed random variable with n degrees of freedom.
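Here is a simulation sketch of the equal-variance case (n = 4 and common variance 2 are arbitrary choices), showing that the scaled sum of squares behaves like a chi-square with n degrees of freedom:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, sigma2, reps = 4, 2.0, 100_000  # made-up dimension and common variance

# Equal-variance case: n independent normals, each with variance sigma2,
# so (sum of squares) / sigma2 should follow a chi-square with n degrees of freedom.
X = rng.normal(scale=np.sqrt(sigma2), size=(reps, n))
q = (X ** 2).sum(axis=1) / sigma2

print(q.mean(), q.var())                          # should be close to n = 4 and 2n = 8
print(stats.kstest(q, "chi2", args=(n,)).pvalue)  # goodness-of-fit p-value (should not be tiny)
```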