In probability theory and statistics, the variance measures how far a set of numbers is spread out from its mean (expected value). It is one of several descriptors of a probability distribution. In particular, the variance is one of the moments of a distribution, and in that context it forms part of a systematic approach to distinguishing between probability distributions. While other such approaches have been developed, those based on moments are advantageous in terms of mathematical and computational simplicity.

$\Large {Var}(X) = {E}[(X - \mu)^2]$
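As a quick illustration, this definition translates directly into code: the variance is the mean squared deviation from the mean. A minimal Python sketch (the function name and sample data are illustrative, not from the source):

```python
def variance(xs):
    """Population variance: mean of squared deviations from the mean."""
    mu = sum(xs) / len(xs)            # the mean (expected value) of the data
    return sum((x - mu) ** 2 for x in xs) / len(xs)

# Example: mean is 5, squared deviations sum to 32 over 8 values
print(variance([2, 4, 4, 4, 5, 5, 7, 9]))  # → 4.0
```

This computes the population variance; the sample variance would instead divide by `len(xs) - 1` to correct for bias.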
bag
math_public
created
Tue, 01 Feb 2011 21:59:35 GMT
creator
dirkjan
modified
Tue, 01 Feb 2011 21:59:35 GMT
modifier
dirkjan