
in a population with a mean of 50, a score of 55 would have a deviation of ?


1 Answer

The article above defines deviation as follows:
In mathematics and statistics, deviation is a measure of the difference between the observed value of a variable and some other value, often that variable's mean. The sign of the deviation (positive or negative) indicates the direction of the difference: the deviation is positive when the observed value exceeds the reference value.
By that definition, the deviation of a score of 55 in a population with mean 50 is simply 55 - 50 = 5.
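The calculation can be sketched in a few lines of Python; the `deviation` helper below is hypothetical, introduced only to illustrate the definition:

```python
def deviation(observed, reference):
    """Return observed - reference.

    Positive when the observed value exceeds the reference value,
    negative when it falls below it.
    """
    return observed - reference

# A score of 55 in a population with mean 50:
print(deviation(55, 50))  # → 5

# A score below the mean gives a negative deviation:
print(deviation(45, 50))  # → -5
```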