Variance and standard deviation are two of the most commonly used terms in probability theory and statistics to describe the spread of a data set. Both give a numerical measure of how the values of a data set are spread around the mean. The mean is simply the arithmetic average of the values in a data set, whereas the variance measures how far the numbers are dispersed around the mean, that is, the average of the squared deviations from the mean. The standard deviation measures the amount of dispersion of the values in a data set and is simply the square root of the variance. While many contrast the two mathematical concepts, we hereby present an unbiased comparison between variance and standard deviation to better understand the terms.
What is Variance?
The variance is defined as a measure of the variability of values around their arithmetic mean. In simple terms, the variance is the mean squared deviation, whereas the mean is the average of all values in a given data set. The variance of a variable is denoted by "σ²" (sigma squared, where σ is the lower-case Greek letter sigma). It is calculated by subtracting the mean from each value in a given data set, squaring each difference to obtain positive values, and finally dividing the sum of the squared differences by the number of values.
If M = mean, x = each value in the data set, and n = number of values in the data set, then
σ² = Σ(x – M)² / n
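A minimal sketch of this formula in Python, assuming an illustrative data set and a helper named variance (neither comes from the article):

```python
# A minimal sketch of the population variance formula above; the data set
# and function name are illustrative, not taken from the article.
def variance(values):
    n = len(values)                      # n, the number of values in the data set
    mean = sum(values) / n               # M, the arithmetic mean
    squared_deviations = [(x - mean) ** 2 for x in values]
    return sum(squared_deviations) / n   # average of the squared deviations

print(variance([2, 4, 4, 4, 5, 5, 7, 9]))  # 4.0
```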
What is Standard Deviation?
The standard deviation is defined as the measure of dispersion of the values in a given data set from their mean. It measures the spread of the data around the mean and is calculated as the square root of the variance. The standard deviation is symbolized by the lower-case Greek letter sigma, "σ". The standard deviation is expressed in the same unit as the mean, which isn't necessarily the case with the variance. It is commonly used as a tool in trading and investing strategies.
If M = mean, x = each value in the data set, and n = number of values, then
σ = √( Σ(x – M)² / n )
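Likewise, a minimal Python sketch of the standard deviation as the square root of the population variance; the helper name and data set are illustrative assumptions:

```python
import math

# A minimal sketch of the standard deviation formula above: the square root
# of the population variance. The helper name and data are illustrative.
def standard_deviation(values):
    n = len(values)
    mean = sum(values) / n
    variance = sum((x - mean) ** 2 for x in values) / n
    return math.sqrt(variance)           # σ, expressed in the same unit as the mean

print(standard_deviation([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0
```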
Difference between Variance and Standard Deviation
Meaning of Variance and Standard Deviation
Variance simply means how far the numbers in a given data set are spread from their average value. In statistics, variance is a measure of the variability of numbers around their arithmetic mean. It is a numerical value which quantifies the average degree to which the values of a data set differ from their mean. Standard deviation, on the other hand, is a measure of the dispersion of the values of a data set from their mean. It is a common measure in statistical theory for quantifying how the data are spread around the mean.
Measure
Variance measures the dispersion of a data set. In technical terms, the variance is the average of the squared differences of the values in a data set from the mean. It is calculated by first taking the difference between each value in the set and the mean, squaring those differences to make the values positive, and finally averaging the squares to obtain the variance. Standard deviation measures the spread of the data around the mean and is calculated by simply taking the square root of the variance. The standard deviation is always non-negative, as shown in the sketch below.
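As a cross-check of these steps, a short sketch using Python's statistics module, whose pvariance and pstdev functions implement the population variance and standard deviation described here (the data set is an illustrative assumption):

```python
import statistics

# The same two measures via Python's statistics module, as a cross-check of
# the manual steps described above (the data set is illustrative).
data = [2, 4, 4, 4, 5, 5, 7, 9]

var = statistics.pvariance(data)   # population variance: mean of squared deviations
std = statistics.pstdev(data)      # population standard deviation: square root of the variance

print(var, std)    # variance 4, standard deviation 2.0
assert std >= 0    # the standard deviation is always non-negative
```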
Calculation
Both variance and standard deviation are calculated around the mean. Here the variance is symbolized by "S²" and the standard deviation, the square root of the variance, by "S". For example, for the data set 5, 7, 3, and 7, the total is 22, which divided by the number of data points (4, in this case) gives a mean (M) of 5.5. Here, M = 5.5 and the number of data points (n) = 4.
The variance is calculated as:
S² = [(5 – 5.5)² + (7 – 5.5)² + (3 – 5.5)² + (7 – 5.5)²] / 4
= (0.25 + 2.25 + 6.25 + 2.25) / 4
= 11/4 = 2.75
The Standard Deviation is calculated by taking the square root of the variance.
S = √2.75 ≈ 1.658
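A short Python sketch that reproduces this worked example for the data set 5, 7, 3, and 7 (the variable names are illustrative):

```python
import math

# Reproducing the worked example above for the data set 5, 7, 3 and 7.
data = [5, 7, 3, 7]
mean = sum(data) / len(data)                                # M = 5.5
variance = sum((x - mean) ** 2 for x in data) / len(data)   # S² = 2.75
std_dev = math.sqrt(variance)                               # S = √2.75

print(mean, variance, round(std_dev, 3))                    # 5.5 2.75 1.658
```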
Applications of Variance and Standard Deviation
The variance combines all the values in a data set to quantify the measure of spread: the bigger the spread, the greater the variance and the larger the gaps between the values in the data set. Variance is used primarily in statistical probability distributions to measure volatility around the mean, and volatility is one of the measures of risk analysis, which may help investors determine the risk in investment portfolios. It is also one of the key aspects of asset allocation. Standard deviation, on the other hand, can be used in a wide range of applications, such as in the finance sector as a measure of market and security volatility.
Variance vs. Standard Deviation: Comparison Chart
Summary of Variance and Standard Deviation
Both variance and standard deviation are among the most common mathematical concepts used in statistics and probability theory as measures of spread. Variance is a measure of how far the values in a given data set are spread from their arithmetic mean, whereas standard deviation is a measure of the dispersion of values relative to the mean. Variance is calculated as the average squared deviation of each value from the mean in a data set, whereas standard deviation is simply the square root of the variance. The standard deviation is measured in the same unit as the mean, whereas variance is measured in the squared unit of the mean. The two are used for different purposes: variance is more of a mathematical term, whereas standard deviation is mainly used to describe the variability of the data.