Tom K. answered 06/07/20
Knowledgeable and Friendly Math and Statistics Tutor
I wasn't going to answer this, but it has remained unsolved for a while, so here goes!
Rules to keep in mind: the variance of the sum of independent random variables equals the sum of the variances.
var(kX) = k^2 var(X)
Thus, var(1/2 (Y1 + Y2)) = var(1/2 Y1) + var(1/2 Y2) = 1/4 var(Y1) + 1/4 var(Y2) = 1/4 σ^2 + 1/4 σ^2 = 1/2 σ^2
var(1/4 Y1 + (Y2 + Y3 + ... + Yn-1)/(2(n-2)) + 1/4 Yn)
= var(1/4 Y1) + var((Y2 + Y3 + ... + Yn-1)/(2(n-2))) + var(1/4 Yn)
= 1/16 var(Y1) + 1/(4(n-2)^2) (var(Y2) + var(Y3) + ... + var(Yn-1)) + 1/16 var(Yn)
= 1/16 σ^2 + (n-2)/(4(n-2)^2) σ^2 + 1/16 σ^2
= (1/8 + 1/(4(n-2))) σ^2
= n/(8(n-2)) σ^2
var(Y-bar) = var((Y1 + Y2 + ... + Yn)/n) = 1/n^2 (var(Y1) + var(Y2) + ... + var(Yn)) = 1/n^2 * nσ^2 = σ^2/n
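As a quick sanity check (not part of the derivation), a short Monte Carlo simulation should reproduce all three variance formulas. Here I assume, purely for illustration, µ = 0, σ = 1, and n = 10, so the theoretical values are 1/2, n/(8(n-2)) = 10/64, and 1/n = 1/10:

```python
import random

# Simulate the three estimators many times and compare their empirical
# variances to the formulas derived above.
# Assumed setup (illustrative only): Y1..Yn i.i.d. with mu = 0, sigma = 1, n = 10.
random.seed(42)
n, reps = 10, 200_000
est1, est2, est3 = [], [], []
for _ in range(reps):
    y = [random.gauss(0, 1) for _ in range(n)]
    est1.append((y[0] + y[1]) / 2)                                  # 1/2 (Y1 + Y2)
    est2.append(y[0] / 4 + sum(y[1:n-1]) / (2 * (n - 2)) + y[n-1] / 4)
    est3.append(sum(y) / n)                                         # Y-bar

def var(xs):
    """Population variance of a sample."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(round(var(est1), 3))  # theory: 1/2 = 0.5
print(round(var(est2), 3))  # theory: 10/64 ~ 0.156
print(round(var(est3), 3))  # theory: 1/10 = 0.1
```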
We can easily show that all three estimators are unbiased, i.e., each has expected value µ.
E(1/2 (Y1 + Y2)) = E(1/2 Y1) + E(1/2 Y2) = 1/2 E(Y1) + 1/2 E(Y2) = 1/2 µ + 1/2 µ = µ
E(1/4 Y1 + (Y2 + Y3 + ... + Yn-1)/(2(n-2)) + 1/4 Yn)
= E(1/4 Y1) + E((Y2 + Y3 + ... + Yn-1)/(2(n-2))) + E(1/4 Yn)
= 1/4 E(Y1) + 1/(2(n-2)) (E(Y2) + E(Y3) + ... + E(Yn-1)) + 1/4 E(Yn)
= 1/4 µ + (n-2)/(2(n-2)) µ + 1/4 µ
= 1/4 µ + 1/2 µ + 1/4 µ = µ
E(Y-bar) = E((Y1 + Y2 + ... + Yn)/n) = 1/n (E(Y1) + E(Y2) + ... + E(Yn)) = 1/n * nµ = µ
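Unbiasedness can also be checked empirically. Assuming, again just for illustration, µ = 3, σ = 1, and n = 10, the average of each estimator over many samples should land very close to µ = 3:

```python
import random

# Average each estimator over many simulated samples; all three averages
# should be close to mu, confirming unbiasedness.
# Assumed setup (illustrative only): mu = 3, sigma = 1, n = 10.
random.seed(0)
n, reps = 10, 100_000
s1 = s2 = s3 = 0.0
for _ in range(reps):
    y = [random.gauss(3, 1) for _ in range(n)]
    s1 += (y[0] + y[1]) / 2                                   # 1/2 (Y1 + Y2)
    s2 += y[0] / 4 + sum(y[1:n-1]) / (2 * (n - 2)) + y[n-1] / 4
    s3 += sum(y) / n                                          # Y-bar

print(round(s1 / reps, 2), round(s2 / reps, 2), round(s3 / reps, 2))  # each near mu = 3
```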
To compare estimators:
Let's look only at the constant multiplying σ^2. Comparing the first and third estimators: 1/2 > 1/n for n >= 3. Comparing the second and third estimators: n/(8(n-2)) is greater than or equal to 1/n, since n/(8(n-2)) - 1/n = (n^2 - 8n + 16)/(8n(n-2)) = (n-4)^2/(8n(n-2)) > 0 for n not equal to 4 and equal to 0 for n = 4.
(The second estimator only makes sense for n >= 3; for n = 2, the first and third estimators are the same; the first estimator only makes sense for n >= 2.)
As all three are unbiased, the third estimator (Y-bar) is preferred: it has smaller variance than the other two in every case except those where they coincide with it (the first at n = 2 and the second at n = 4).
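The comparison above can also be verified with exact arithmetic. This sketch checks the variance coefficients for n = 3 through 20 (an arbitrary illustrative range) using exact fractions:

```python
from fractions import Fraction

# Exact check of the variance comparison: the coefficient of sigma^2 for
# Y-bar is never larger than the other two, with equality only at n = 4
# for the second estimator.
for n in range(3, 21):
    c1 = Fraction(1, 2)             # variance coefficient of the first estimator
    c2 = Fraction(n, 8 * (n - 2))   # variance coefficient of the second estimator
    c3 = Fraction(1, n)             # variance coefficient of Y-bar
    assert c1 > c3                  # 1/2 > 1/n for n >= 3
    assert c2 >= c3                 # n/(8(n-2)) >= 1/n
    assert (c2 == c3) == (n == 4)   # equality exactly at n = 4

print("comparison verified for n = 3..20")
```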