Asked • 06/01/19

How to determine if Standard Deviation is high/low?

I have derived the following response-time data from a performance test I am running:

Min - 8 sec
Max - 284 sec
Average - 28 sec
Standard Deviation - 27 sec

What does the standard deviation say about the distribution of the response times? When a standard deviation is described as low or high, what does that actually mean? Is it judged in comparison to the Average/Min/Max? I know what standard deviation is and how it is computed; I'm just not sure how to tell whether it is high or low.
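For context, one common way to put a standard deviation in perspective is to express it as a fraction of the mean (the coefficient of variation). The short Python sketch below applies that idea to the figures quoted above; the cutoff it uses for calling the spread "large" is only an illustrative assumption, not a standard rule.

```python
# Compare the standard deviation to the mean (coefficient of variation).
# The numbers are the ones quoted in the question; the 1.0 cutoff for
# "large" spread is an illustrative assumption, not a fixed convention.

mean_sec = 28.0   # average response time
std_sec  = 27.0   # standard deviation of response times
min_sec  = 8.0    # fastest response
max_sec  = 284.0  # slowest response

cv = std_sec / mean_sec        # coefficient of variation (dimensionless)
data_range = max_sec - min_sec # overall range, a second reference point

print(f"coefficient of variation: {cv:.2f}")  # ~0.96: std dev is nearly as large as the mean
print(f"range: {data_range:.0f} sec")

if cv >= 1.0:  # assumed cutoff for this example only
    print("spread is large relative to the mean")
else:
    print("spread is moderate relative to the mean")
```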

1 Expert Answer

