# Of the range and the standard deviation, which is more widely used in statistical analysis, and why?

Feb 9, 2015

The standard deviation is the more widely used of the two.

The range simply gives the difference between the lowest and highest values, and even a few extreme values can alter it excessively.

The standard deviation $\sigma$ tells you where most of the values lie: in a normal distribution, 68% of all values fall within one standard deviation of the mean $\mu$, and 95% fall within two standard deviations of the mean.

Example:
You have a filling machine that fills kilogram bags of sugar. It will not fill exactly $1000 g$ every time; suppose the standard deviation is $10 g$.
Then you know that 68% of the bags weigh between $990 g$ and $1010 g$, and 95% between $980 g$ and $1020 g$, total spans of $20 g$ and $40 g$ respectively.
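You can check the 68%/95% rule empirically. The sketch below simulates the sugar-bag machine (the mean of $1000 g$ and standard deviation of $10 g$ are taken from the example above; the sample size and seed are arbitrary choices for illustration):

```python
import random

# Hypothetical filling machine: normally distributed fill weights
# with mean 1000 g and standard deviation 10 g.
random.seed(42)
mu, sigma, n = 1000.0, 10.0, 100_000
weights = [random.gauss(mu, sigma) for _ in range(n)]

# Fraction of bags within one and within two standard deviations of the mean.
within_1sd = sum(1 for w in weights if abs(w - mu) <= sigma) / n
within_2sd = sum(1 for w in weights if abs(w - mu) <= 2 * sigma) / n

print(f"within 1 sd: {within_1sd:.1%}")  # close to 68%
print(f"within 2 sd: {within_2sd:.1%}")  # close to 95%
```

With a sample this large, the two fractions land close to the theoretical 68% and 95%.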

Every now and again a bag will be far over-filled (say $1100 g$), and sometimes a bag will end up empty ($0 g$), so the range will be $1100 g$.
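The effect of those two rare bags can be sketched as follows: with many ordinary bags in the sample, the two extremes blow up the range completely while the standard deviation rises only modestly (the sample size and seed are again arbitrary illustrative choices):

```python
import random
import statistics

# Hypothetical sample: 10,000 ordinary bags from the machine
# (mean 1000 g, standard deviation 10 g).
random.seed(1)
weights = [random.gauss(1000, 10) for _ in range(10_000)]

# Two rare extremes: one badly over-filled bag and one empty bag.
weights += [1100.0, 0.0]

spread = max(weights) - min(weights)  # range: dominated by the two extremes
sd = statistics.pstdev(weights)       # standard deviation of the whole sample

print(f"range: {spread:.0f} g")             # about 1100 g
print(f"standard deviation: {sd:.1f} g")    # rises only modestly above 10 g
```

The range jumps to roughly $1100 g$, while the standard deviation stays in the same order of magnitude as before, which is exactly why it is the more robust summary of spread.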

You may decide which of the two gives a better idea of the spread in this distribution.