Edward C. answered 04/01/15
Tutor, 5.0 (438)
Caltech Grad for math tutoring: Algebra through Calculus
The answer is that you cannot, unless you have more information about the shape of the distribution (more on that below). Consider the following two distributions:
X = {11 values equal to 50, 40 values equal to 100, and 49 values equal to 160}
Y = {49 values equal to 50, 40 values equal to 100, and 11 values equal to 160}
Both X and Y have the 10th percentile equal to 50, the 50th percentile equal to 100, and the 90th percentile equal to 160. But the mean of X is 123.9 and the mean of Y is 82.1. So you cannot determine the mean of an arbitrary distribution from 3 percentile values.
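A quick sketch of that counterexample in Python (the variable names and the nearest-rank percentile helper are my own, not part of the original problem):

```python
import math
from statistics import mean

# The two 100-value datasets from the example above.
X = [50] * 11 + [100] * 40 + [160] * 49
Y = [50] * 49 + [100] * 40 + [160] * 11

def pct(data, p):
    """Nearest-rank percentile: the smallest value with at least p% of the data at or below it."""
    s = sorted(data)
    return s[math.ceil(p / 100 * len(s)) - 1]

for name, d in (("X", X), ("Y", Y)):
    print(name, pct(d, 10), pct(d, 50), pct(d, 90), mean(d))
```

Both datasets report percentiles 50, 100, 160, yet the means come out 123.9 and 82.1, matching the figures above.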
Now if you happen to know that the data come from a normal distribution, then it's a different story. You can use a Z table to find what values of z correspond to the 10th, 50th, and 90th percentiles (they are z = -1.28, 0.00, and 1.28). These z values tell you how many standard deviations each percentile lies from the mean.

A normal distribution is symmetric about the 50th percentile, so the mean equals the 50th percentile value, which is 100 in your example. The 10th percentile is 1.28 standard deviations below the mean, so in your example (100 - 50) = 50 is 1.28 times the standard deviation, which implies the standard deviation is 50/1.28 = 39.06.

Now here's the problem with the numbers in your example: they are not symmetric about the 50th percentile. If we calculate the standard deviation from the 90th percentile instead, we get that (160 - 100) = 60 is 1.28 times the standard deviation, which would imply a standard deviation of 60/1.28 = 46.875. You could average these two calculations to get an estimate of 42.97, or you could simply say that since the distribution is not symmetric it cannot be normal, so there is no way to calculate the mean or standard deviation.
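The normal-fit calculation above can be sketched in a few lines (1.28 is the 90th-percentile z score rounded to two places, as in the text):

```python
# Percentile values from the example, and the rounded z score for p90.
p10, p50, p90 = 50, 100, 160
z = 1.28

mu = p50                              # the normal is symmetric, so mean = median = 100
sigma_lo = (p50 - p10) / z            # from the 10th percentile: about 39.06
sigma_hi = (p90 - p50) / z            # from the 90th percentile: 46.875
sigma_avg = (sigma_lo + sigma_hi) / 2 # rough compromise: about 42.97

print(mu, sigma_lo, sigma_hi, sigma_avg)
```

The gap between the two sigma estimates is exactly the asymmetry problem described above: a true normal would give the same answer from either tail.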

Edward C.
tutor
Not that I know of. The nice thing about the normal distribution is that it retains its basic bell curve shape regardless of the values of the mean and the standard deviation - the curve just gets broader or narrower as the standard deviation changes.
But the lognormal distribution changes its shape pretty drastically as the standard deviation changes. When the standard deviation is small the lognormal curve is pretty similar to a normal distribution. But as the standard deviation increases the lognormal distribution becomes more and more skewed to the right. So I don't think there's any single table that would scale properly for all lognormal distributions the way one does for normal distributions.
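One way to see that shape change numerically: the lognormal's skewness has the closed form (w + 2)·sqrt(w - 1) with w = exp(sigma²), where sigma is the standard deviation of the underlying normal. A small sketch (the function name is mine):

```python
import math

def lognormal_skewness(sigma):
    """Closed-form skewness of a lognormal with log-scale std dev sigma."""
    w = math.exp(sigma ** 2)
    return (w + 2) * math.sqrt(w - 1)

# Skewness grows quickly as sigma increases, so no single standardized
# table can cover all lognormal shapes the way a Z table covers normals.
for sigma in (0.1, 0.5, 1.0):
    print(sigma, round(lognormal_skewness(sigma), 2))
```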
04/01/15