What does the standard deviation of a data set tell us about the data and why is the standard deviation an important measure?


Best Answer
Standard Deviation Explained: Standard deviation is a simple measure of the width of a distribution of numbers (usually scores or measurements). It is the next level of sophistication in characterizing a set of numbers beyond simply giving an average (mean, median, or mode). If you know only the average, you do not know how tightly the numbers are clustered around it; that is what the standard deviation tells you. It is one definition, the most common definition, of the width of a distribution.

(There are, of course, many other characterizations.)

It is important because it tells you if the average is a very useful quantity to use to interpret the data. If someone tells you that the average person your age dies in 50 years, that seems important, but if someone says that the average person dies in 50 years, give or take 20 years, suddenly you realize there is more to the story and maybe you should save more money, just in case. Well, the "give or take" part of that statement is very useful, but not well defined. If they say the life expectancy is 50 years with a standard deviation of 20 years, then that is perfectly defined mathematically. Standard deviation is a mathematical measure of the broadness of the distribution of data.

The following two data sets, A and B, have the same mean (average):

A: 48, 49, 50, 51, 52

B: 30, 40, 50, 60, 70

The distribution of the data about the mean in A is very narrow, whereas the distribution about the mean in B is broad. The S.D. gives us a quantification of the broadness of the distribution.
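
To make that concrete, here is a minimal Python sketch (the data sets A and B are the ones above; the use of the population standard deviation, rather than the sample version, is just an illustrative choice):

```python
# Minimal sketch: comparing the spread of data sets A and B above.
# Uses the population standard deviation (divide by n); the sample
# version (statistics.stdev, divide by n - 1) gives slightly larger values.
import statistics

A = [48, 49, 50, 51, 52]
B = [30, 40, 50, 60, 70]

for name, data in (("A", A), ("B", B)):
    mean = statistics.mean(data)   # both means are 50
    sd = statistics.pstdev(data)   # population standard deviation
    print(f"{name}: mean = {mean}, standard deviation = {sd:.2f}")

# Expected output (approximately):
# A: mean = 50, standard deviation = 1.41
# B: mean = 50, standard deviation = 14.14
```

Both sets have a mean of 50, but B's standard deviation is ten times larger, which is exactly the "broadness" the S.D. quantifies.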

In normal distributions, about 68 percent of the data will fall within one S.D. on either side of the mean, and about 95 percent will fall within two S.D. of the mean.

Let's say your teacher gives a test to one hundred kids and the test average is 80 points and the S.D. is 10. If the distribution is "normal," about 34 kids will score between 70 and 80, and about 34 kids will score between 80 and 90. We can also predict that about 14 kids will score between 90 and 100, and about 14 will score between 60 and 70. That leaves four kids. They fall into two groups: they either totally bombed the test, or they got the extra credit question to boost their score over 100!
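
If you want to check those head counts, here is a rough Python sketch (it assumes the scores really do follow a normal distribution with mean 80 and S.D. 10, and uses the same score bands as above):

```python
# Rough check of the test-score example: mean 80, S.D. 10, 100 students,
# assuming a normal distribution of scores.
import math

def normal_cdf(x, mean=80.0, sd=10.0):
    """Probability that a normally distributed score falls below x."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

n_students = 100
for lo, hi in [(60, 70), (70, 80), (80, 90), (90, 100)]:
    share = normal_cdf(hi) - normal_cdf(lo)
    print(f"{lo}-{hi}: about {round(share * n_students)} students")

# Prints roughly 14, 34, 34, 14; the remaining handful of students
# fall in the extreme tails, below 60 or above 100.
```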

Related questions

What is standard deviation used for?

Standard deviation is a measure of the spread of data.


Why do we need the standard deviation?

The standard deviation is needed because the mean alone does not tell you how spread out the data are; the standard deviation is a measure of that spread.


What does one standard deviation mean?

Standard deviation is a measure of variation from the mean of a data set. For a normal distribution, the range within one standard deviation of the mean (that is, the mean plus or minus one standard deviation) contains about 68% of the data.


What is the standard deviation?

The standard deviation of a set of data is a measure of the spread of the observations. It is the square root of the mean squared deviations from the mean of the data.


Why is standard deviation the best measure of dispersion?

Standard deviation is often considered the best measure of dispersion because it uses every observation in the data set, and because many data distributions are close to the normal distribution, for which the standard deviation has a direct interpretation (about 68% of the data fall within one standard deviation of the mean).


What is the minimum data required for standard deviation?

The standard deviation is a measure of how spread out the numbers are. At least two data points are needed to calculate a sample standard deviation (the n − 1 in the formula requires n ≥ 2), although more data points give a more reliable estimate of the spread.


What is meant by the standard deviation of a data set?

It is a measure of the spread of the data around its mean value.


Is it true that the standard deviation can equal zero in a data set?

The standard deviation is a measure of how much variation there is in a data set. It can be zero only if all the values are exactly the same - no variation.


What does the standard deviation tell us?

The standard deviation is the square root of the variance; it is a measure of the spread or variability of the data. It is given by (variance)^(1/2).


What is the relation between quartile deviation and standard deviation?

Strictly speaking, none. A quartile deviation is a quick and easy measure of spread which takes account of only some of the data. The standard deviation is a detailed measure which uses all the data. Also, because the standard deviation uses all the observations, it can be unduly influenced by any outliers in the data. On the other hand, because the quartile deviation ignores the smallest 25% and the largest 25% of the observations, it is not affected by outliers. A small illustration of this difference is sketched below.
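
Here is the sketch (a hypothetical data set with one outlier, using NumPy's default linear percentile interpolation; other quartile conventions give slightly different numbers):

```python
# Sketch contrasting the quartile deviation (semi-interquartile range)
# with the standard deviation when an outlier is present.
import numpy as np

data = [10, 12, 13, 14, 15, 16, 17, 18, 90]   # 90 is an outlier

q1, q3 = np.percentile(data, [25, 75])
quartile_deviation = (q3 - q1) / 2
standard_deviation = np.std(data)              # population standard deviation

print(f"quartile deviation: {quartile_deviation:.2f}")
print(f"standard deviation: {standard_deviation:.2f}")

# Prints a quartile deviation of 2.00 but a standard deviation of roughly 24:
# the single outlier barely moves the quartile deviation but inflates the
# standard deviation considerably.
```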


Can a standard deviation of 4.34 be correct?

Yes, a standard deviation of 4.34 can be correct. Standard deviation is a measure of dispersion or variability in a data set. Roughly speaking, it represents the typical amount by which individual data points deviate from the mean (formally, the root-mean-square deviation). Therefore, a standard deviation of 4.34 simply indicates that there is some variability in the data, with data points typically deviating by about 4.34 units from the mean.


If the standard deviation is small the data is more dispersed?

No, if the standard deviation is small the data is less dispersed.