How do you find the standard deviation for a 95% confidence interval given a mean of 135?

1 Answer
Jun 26, 2018

Insufficient data, or a poorly worded question! The SD is calculated from the variance of the data around the mean.

Explanation:

The confidence interval can be anything you want it to be: it simply sets the bounds, calculated using the SD, that express a range about the mean. It is determined by the data and by the user; it does not determine the standard deviation of the data.

When the population standard deviation is known, the formula for a confidence interval (CI) for a population mean is:

#"CI" = barx +- z^"*" xx sigma/sqrt(n)#

where #barx# is the sample mean, #sigma# is the population standard deviation, #n# is the sample size, and #z^"*"# is the critical value from the standard normal distribution for your desired confidence level; #z^"*"# is 1.96 for a 95% confidence interval.
http://www.dummies.com/education/math/statistics/how-to-calculate-a-confidence-interval-for-a-population-mean-when-you-know-its-standard-deviation/
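
For illustration, here is a minimal Python sketch of that formula. The question only supplies the mean of 135, so the sigma = 10 and n = 25 below are made-up values:

```python
from math import sqrt
from scipy.stats import norm

def mean_ci(xbar, sigma, n, confidence=0.95):
    """CI for a population mean when the population SD is known."""
    z_star = norm.ppf(1 - (1 - confidence) / 2)  # 1.96 for 95%
    margin = z_star * sigma / sqrt(n)
    return xbar - margin, xbar + margin

# Hypothetical numbers: mean of 135, assumed sigma = 10, n = 25.
lo, hi = mean_ci(135, 10, 25)
print(f"95% CI: ({lo:.2f}, {hi:.2f})")  # (131.08, 138.92)
```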

The sample SD is just a value you compute from a sample of data. It is not done often, but it is certainly possible to compute a CI for an SD. The free GraphPad QuickCalcs tool does the work for you.
https://www.graphpad.com/support/faqid/1381/
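
Under a Gaussian assumption, that CI comes from the chi-square distribution of the sample variance. A rough Python sketch of the same calculation (the sample SD of 10 and n = 25 are made-up values):

```python
from math import sqrt
from scipy.stats import chi2

def sd_ci(s, n, confidence=0.95):
    """Chi-square-based CI for a population SD, assuming Gaussian data."""
    alpha = 1 - confidence
    df = n - 1
    lower = s * sqrt(df / chi2.ppf(1 - alpha / 2, df))
    upper = s * sqrt(df / chi2.ppf(alpha / 2, df))
    return lower, upper

# Hypothetical: sample SD of 10 from n = 25 observations.
print(sd_ci(10, 25))  # roughly (7.8, 13.9)
```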

Interpreting the CI of the SD is straightforward. If you assume that your data were randomly and independently sampled from a Gaussian distribution, you can be 95% sure that the CI computed from the sample SD contains the true population SD.
https://www.ncss.com/wp-content/themes/ncss/pdf/Procedures/PASS/Confidence_Intervals_for_One_Standard_Deviation_using_Standard_Deviation.pdf
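
That coverage claim is easy to check by simulation: draw many Gaussian samples, compute the SD's CI for each, and count how often the interval captures the true population SD. A sketch, with an arbitrary true SD, sample size, and trial count:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
true_sigma, n, trials = 10.0, 25, 10_000
df = n - 1
lo_q = chi2.ppf(0.975, df)  # divisor for the lower CI bound
hi_q = chi2.ppf(0.025, df)  # divisor for the upper CI bound

hits = 0
for _ in range(trials):
    s = rng.normal(0, true_sigma, n).std(ddof=1)  # sample SD
    lower = s * np.sqrt(df / lo_q)
    upper = s * np.sqrt(df / hi_q)
    hits += lower <= true_sigma <= upper

print(f"Coverage: {hits / trials:.3f}")  # close to 0.95
```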