Edward C. answered 04/04/15
The margin of error is M = (z*)*σ/sqrt(n), where z* is the critical z-value for the given confidence level, σ is the population standard deviation, and n is the sample size. Solving for n:
M = (z*)*σ/sqrt(n)
sqrt(n) = (z*)*σ / M
n = [(z*)*σ / M]^2
If M is to be decreased by a factor of 6, the new margin is M/6, and the required sample size n2 becomes
n2 = [(z*)*σ / (M/6)]^2 = [(z*)*σ*(6/M)]^2 = 36*[(z*)*σ/M]^2 = 36*n
So the sample size must be increased by a factor of 36.
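A quick numerical check of this result, assuming (for illustration) σ = 15, M = 2, and a 95% confidence level; these particular values are not from the problem, just a sketch:

```python
from statistics import NormalDist
import math

def required_n(sigma, margin, confidence=0.95):
    # z* critical value for a two-sided confidence interval
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    # n = [(z*)*sigma / M]^2, rounded up to the next whole sample
    return math.ceil((z * sigma / margin) ** 2)

sigma, M = 15, 2
n1 = required_n(sigma, M)        # original margin
n2 = required_n(sigma, M / 6)    # margin cut by a factor of 6
print(n1, n2, n2 / n1)           # ratio is ~36, up to rounding
```

The ratio n2/n1 comes out at approximately 36, matching the derivation (it is not exactly 36 only because each n is rounded up to a whole number).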
This was a lengthy derivation that you don't need to go through every time, but it's instructive to see it once. The main thing to remember is that the margin of error shrinks with the square root of n, so to cut the margin of error by a factor of k, you must increase the sample size by a factor of k^2.