Ryan M. answered 02/20/25
Statistics Pro Specializing in Probability, Data Analysis, & Inference
Hey Francisco,
Great question! Your tutor is right—since the standard error is the standard deviation divided by the square root of the sample size (SE = SD / √n), you can recover the standard deviation by multiplying the standard error by √n. With a sample size of 138, √138 ≈ 11.75, so you'd multiply each SE value by about 11.75 to get the SD.
For example, your first SE value is 3.05, so the SD for that column would be 3.05 × 11.75 ≈ 35.84 (using the unrounded √138, about 35.83). You'd do the same for the rest of the SE values.
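If you'd rather let the computer handle the rounding, here's a quick Python sketch of the same calculation. Only the first SE value (3.05) and the sample size (138) come from your question—the list is just a placeholder for you to fill in with the rest of your SE values.

```python
import math

n = 138            # sample size from your question
se_values = [3.05]  # 3.05 is from your post; add your remaining SE values here

# SD = SE * sqrt(n), since SE = SD / sqrt(n)
for se in se_values:
    sd = se * math.sqrt(n)
    print(f"SE = {se:.2f}  ->  SD = {sd:.2f}")
# Output for the first column: SE = 3.05  ->  SD = 35.83
```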
Hope that helps! Let me know if you need a hand with anything else—happy to help.