Assume we have two random variables X and Y, where X follows a standard normal distribution (mean 0, standard deviation 1) and Y = X^2. Clearly, if we know the value of X, then we can compute the value of Y, so X and Y are not independent. HOWEVER, if we compute the correlation coefficient Cor(X,Y) = Cov(X,Y) / [Var(X) * Var(Y)]^0.5, we get Cor(X,Y) = 0, because Cov(X,Y) = E[X^3] - E[X]*E[X^2] = 0 - 0 = 0 (every odd moment of a standard normal is zero; many proofs of this are available online).
Why does zero correlation not imply independence?
Although independence implies zero correlation, zero correlation does not necessarily imply independence.
While I understand the concept, I can't picture a real-world situation where two variables have zero correlation but are not independent.
Can someone please give me an example so I can better understand this phenomenon?
1 Expert Answer