Discuss deviation and dispersion

1 Answer

Not sure what the context here is, but in plain terms, deviation is a measure of how far an observed value is from the known or true value. Dispersion is a measure of how far your observed values are from each other.
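
If it helps to see the two ideas side by side, here is a minimal Python sketch; the true value and observations are made up purely for illustration:

true_value = 10.0
observed = [9.6, 10.3, 9.9]

# Deviation: how far each observed value is from the known/true value.
deviations = [x - true_value for x in observed]   # approximately [-0.4, 0.3, -0.1]

# Dispersion: how far the observed values are from their own mean,
# summarized here as the standard deviation.
mean = sum(observed) / len(observed)
dispersion = (sum((x - mean) ** 2 for x in observed) / len(observed)) ** 0.5   # ~0.29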

You can think of it like a dartboard. If you throw three darts, aiming for the bullseye every time, then the distance of each dart from the bullseye is its deviation. The dispersion is the distance of the darts from each other.

If you missed the bullseye every time but landed all of your darts close together near one edge of the board, then the dispersion would be small but the deviation from the bullseye would be large.

If you missed the bullseye and hit the board on opposite sides, as far apart as possible, you would have the largest dispersion of your darts as well as the largest deviation from the bullseye.

And if you hit the bullseye all three times then you would have the smallest amount of deviation and dispersion.
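
If you want to make these three scenarios concrete, here is a rough Python sketch. The dart coordinates are invented just for illustration, with the bullseye at the origin; deviation is taken as the average distance from the bullseye and dispersion as the average pairwise distance between darts:

import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def deviation(darts, bullseye=(0.0, 0.0)):
    # Average distance of the darts from the bullseye.
    return sum(dist(d, bullseye) for d in darts) / len(darts)

def dispersion(darts):
    # Average pairwise distance between the darts themselves.
    pairs = [(a, b) for i, a in enumerate(darts) for b in darts[i + 1:]]
    return sum(dist(a, b) for a, b in pairs) / len(pairs)

clustered = [(8.0, 0.1), (8.2, -0.1), (7.9, 0.0)]    # tight group near one edge
spread = [(-8.0, 0.0), (8.0, 0.0), (0.0, 8.0)]       # opposite sides of the board
bullseyes = [(0.0, 0.0), (0.0, 0.0), (0.0, 0.0)]     # three perfect throws

for name, darts in [("clustered", clustered), ("spread", spread), ("bullseyes", bullseyes)]:
    print(name, round(deviation(darts), 2), round(dispersion(darts), 2))

# clustered: large deviation (~8.03), small dispersion (~0.25)
# spread:    large deviation (~8.0),  large dispersion (~12.88)
# bullseyes: zero deviation, zero dispersion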


This analogy is fine if you were looking for simplified definitions, but it is not suitable for a rigorous discussion of probability distributions. Hope it helps you out.

