Fernando R. answered 03/20/17
Tutor
New to Wyzant
Software Developer at Microsoft with a Tutoring Hobby
Standard deviation is one of the measures used to show how spread out data is around the mean. If you look at the equation, standard deviation is the square root of the variance, which itself is the average of the squared deviations (data - mean)^2 across all the data values. Both variance and standard deviation measure this "spread," but standard deviation has the benefit of being in the same units as the data. It is an important parameter of a distribution that determines its shape, such as telling you how flat or narrow a bell curve is.
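As a quick sketch of those definitions in Python (the data values here are made up for illustration):

```python
import math

# Hypothetical data set, just for illustration
data = [4, 8, 6, 5, 3, 7]

mean = sum(data) / len(data)

# Variance: the average of the squared deviations from the mean.
# (This is the population variance; sample variance divides by n - 1 instead.)
variance = sum((x - mean) ** 2 for x in data) / len(data)

# Standard deviation: the square root of the variance,
# which puts it back in the same units as the data.
std_dev = math.sqrt(variance)

print(mean, variance, std_dev)
```

Python's built-in `statistics` module provides the same calculations as `pvariance`/`pstdev` (population) and `variance`/`stdev` (sample).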
Another common measure is the "average absolute deviation," which is the average distance of all values from the mean, without squaring. One benefit of standard deviation is that it tells you how likely a data point is to fall within a certain distance of the mean; there are well-known results that express the probability of data points lying within a certain number of standard deviations of the mean.
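To compare the two measures side by side on the same made-up data set:

```python
import math

# Same hypothetical data as above
data = [4, 8, 6, 5, 3, 7]
mean = sum(data) / len(data)

# Average absolute deviation: average distance from the mean, no squaring
avg_abs_dev = sum(abs(x - mean) for x in data) / len(data)

# Standard deviation: square root of the average squared distance
std_dev = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

print(avg_abs_dev, std_dev)
```

The standard deviation comes out slightly larger here; squaring gives extra weight to points far from the mean, so the standard deviation is never smaller than the average absolute deviation.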
To give a common example, modern IQ tests are constructed so that the mean of a population is 100, with a standard deviation of 15. If we assume the scores follow a bell curve, an IQ score of 130 is two standard deviations above the mean, which corresponds to a percentile of about 97.7%, meaning you scored higher than about 97.7 percent of other people.
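The 97.7% figure comes from the cumulative distribution function of the normal distribution evaluated at two standard deviations above the mean, which you can check with Python's standard library:

```python
from statistics import NormalDist

# Fraction of a normal distribution lying below z = 2,
# i.e. two standard deviations above the mean
percentile = NormalDist().cdf(2)  # approximately 0.977

print(percentile)
```

The same call works for any score: for a distribution with mean `mu` and standard deviation `sigma`, `NormalDist(mu, sigma).cdf(score)` gives the fraction of the population below that score.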
Standard deviation is a quick way to gauge just how volatile or uneven a set of data is: a larger standard deviation shows that the data is more spread out, while a smaller one points to a tighter grouping. Several mathematical properties of standard deviation concern the probability of data falling outside a certain number of standard deviations from the mean.
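One such property is Chebyshev's inequality, which bounds the fraction of any distribution lying more than k standard deviations from the mean. A short comparison against the normal distribution, as a sketch:

```python
from statistics import NormalDist

# Chebyshev's inequality: for ANY distribution,
#   P(|X - mean| >= k * sigma) <= 1 / k**2.
# For a normal distribution, the actual tail mass is much smaller.
for k in (1, 2, 3):
    chebyshev_bound = 1 / k**2
    # Two-sided normal tail: mass more than k standard deviations out
    normal_tail = 2 * (1 - NormalDist().cdf(k))
    print(k, chebyshev_bound, round(normal_tail, 4))
```

Chebyshev's bound is loose but universal; the tighter normal-only figures (about 32%, 4.6%, and 0.3% outside 1, 2, and 3 standard deviations) are the familiar 68-95-99.7 rule.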