Hi,
I've asked the same question on two forums with no answer that I could understand. Googling for "standard deviation" really confused me, so I'm hoping for a clear, understandable (to me) answer somewhere.
I have looked at the WXQC mailing list but did not register or post the question there; I think that may be a good place to try, though.
Good luck with this question. If you find something, please post it here.
Regards, Bob
In its most simplistic meaning, standard deviation measures the spread (the difference between the high and low) of values within a data set. The greater the standard deviation, the wider the range of values in the data set.
I say simplistic because the implication is that you figure it out by subtracting the lowest value from the highest, but it doesn't quite work that way. Because the values in a data set may not be uniformly dispersed (i.e., most may be bunched up near the low while only a few are near the high, for example), figuring it out that way wouldn't be truly representative of the spread of the values. So to compensate and give a better representation of the values, a slightly more involved (but still very simple) calculation is made.
In its more formal mathematical definition, standard deviation is the root mean square deviation of a set of values from their mean value. In other words, you take each value's deviation from the mean and compute the root mean square of those deviations.
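Written out as a formula, with mu standing for the mean and N for the number of values in the set:

    standard deviation = sqrt( ( (x1 - mu)^2 + (x2 - mu)^2 + ... + (xN - mu)^2 ) / N )

That's exactly the "root mean square" in the definition above: square the deviations, take their mean, then take the square root.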
To figure it out (there's a short Python sketch after these steps):
1) First, find the mean (average) of the values in the set.
2) Then calculate the deviation of each value from that mean. In other words, subtract the mean you figured out in the first step from each raw value.
3) Square each of the deviations from step 2.
4) Get the mean (average) of the values from step 3.
5) Take the square root of that mean and you have your standard deviation.
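To make those steps concrete, here's a minimal Python sketch that follows them one for one. The function name and the sample readings are just made up for illustration:

    import math

    def standard_deviation(values):
        # 1) Find the mean (average) of the values.
        mean = sum(values) / len(values)
        # 2) Calculate each value's deviation from that mean.
        deviations = [v - mean for v in values]
        # 3) Square each of the deviations.
        squared = [d ** 2 for d in deviations]
        # 4) Take the mean (average) of the squared deviations.
        mean_squared = sum(squared) / len(squared)
        # 5) The square root of that mean is the standard deviation.
        return math.sqrt(mean_squared)

    # Example with some made-up temperature readings:
    readings = [70.1, 70.4, 69.8, 70.2, 70.0]
    print(standard_deviation(readings))

One thing worth knowing: dividing by the number of values like this gives what's called the "population" standard deviation. Many calculators and spreadsheets divide by N - 1 instead (the "sample" standard deviation), so their results can differ slightly from this one.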
At least I think that's right; it's been a long, long time since I took statistical analysis, and I didn't like it even back then.
