Skewness is a measure of the degree of asymmetry of a distribution. If the left tail (the tail at the small end of the distribution) is more pronounced than the right tail (the tail at the large end of the distribution), the function is said to have negative skewness. If the reverse is true, it has positive skewness, and if the two are equal, it has zero skewness.
Several types of skewness are defined, the terminology and notation of which are unfortunately rather confusing. "The" skewness of a distribution is defined to be

   \gamma_1 = \mu_3 / \mu_2^{3/2},    (1)

where \mu_i is the ith central moment. The notation \gamma_1 is due to Karl Pearson, but the notations \alpha_3 (Kenney and Keeping 1951, p. 27; Kenney and Keeping 1962, p. 99) and \sqrt{b_1} (due to R. A. Fisher) are also encountered (Kenney and Keeping 1951, p. 27; Kenney and Keeping 1962, p. 99; Abramowitz and Stegun 1972, p. 928). Abramowitz and Stegun (1972, p. 928) also confusingly refer to both \gamma_1 and \beta_1 as "skewness." Skewness is implemented in the Wolfram Language as Skewness[dist].
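Equation (1) can be checked numerically. The sketch below is plain Python (the helper name `skewness` is chosen for this example, not a library routine): it computes the second and third central moments of a sample and forms \mu_3 / \mu_2^{3/2}.

```python
def skewness(xs):
    """Population skewness gamma_1 = mu_3 / mu_2**1.5, built from
    the central moments of the data."""
    n = len(xs)
    mean = sum(xs) / n
    mu2 = sum((x - mean) ** 2 for x in xs) / n  # second central moment
    mu3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    return mu3 / mu2 ** 1.5

# A symmetric sample has zero skewness; a long right tail gives a
# positive value, in line with the sign convention above.
print(skewness([1, 2, 3, 4, 5]))  # -> 0.0
print(skewness([1, 1, 1, 5]))     # positive
```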
An estimator for the skewness is

   g_1 = k_3 / k_2^{3/2},    (2)

where the k_i are k-statistics (Kenney and Keeping 1962, p. 101). For a normal population with a sample size of N, the variance of g_1 is
   \mathrm{var}(g_1) = \frac{6N(N-1)}{(N-2)(N+1)(N+3)}    (3)

(Kendall et al. 1998).
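Equations (2) and (3) can be sketched together. Assuming the standard k-statistic formulas k_2 = n m_2/(n-1) and k_3 = n^2 m_3/((n-1)(n-2)), where m_2 and m_3 are sample central moments, a minimal Python illustration (helper names are ours) is:

```python
def g1(xs):
    """Skewness estimator g_1 = k_3 / k_2**1.5, where k_2 and k_3 are
    the k-statistics (unbiased estimators of the cumulants)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n  # sample central moments
    m3 = sum((x - mean) ** 3 for x in xs) / n
    k2 = n / (n - 1) * m2
    k3 = n * n / ((n - 1) * (n - 2)) * m3
    return k3 / k2 ** 1.5

def var_g1(n):
    """Variance of g_1 for samples of size n from a normal population,
    per equation (3)."""
    return 6 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3))

print(g1([1, 2, 3, 4, 5]))  # -> 0.0 for a symmetric sample
print(var_g1(10))           # shrinks toward 0 as n grows
```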
The following table gives the skewness for a number of common distributions.
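One standard entry in such tables can be verified numerically: the exponential distribution has skewness \gamma_1 = 2. A Monte Carlo sketch in Python (sample size and seed chosen arbitrarily for illustration):

```python
import random

# The exponential distribution has skewness gamma_1 = 2; the sample
# skewness of a large simulated sample should land near that value.
random.seed(0)
xs = [random.expovariate(1.0) for _ in range(50_000)]
n = len(xs)
mean = sum(xs) / n
mu2 = sum((x - mean) ** 2 for x in xs) / n  # second central moment
mu3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
gamma1_hat = mu3 / mu2 ** 1.5
print(gamma1_hat)  # close to 2
```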
Several other forms of skewness are also defined. The momental skewness is defined by

   \alpha^{(m)} = \tfrac{1}{2}\gamma_1 = \mu_3 / (2\sigma^3).    (4)
The Pearson mode skewness is defined by

   (\mathrm{mean} - \mathrm{mode}) / \sigma.    (5)
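The mode-based measure (5) is straightforward to compute with the standard statistics module; the helper name below is ours, and \sigma is taken as the population standard deviation.

```python
import statistics

def pearson_mode_skewness(xs):
    """Pearson mode skewness (mean - mode) / sigma, with sigma the
    population standard deviation of the sample."""
    mean = statistics.fmean(xs)
    mode = statistics.mode(xs)
    sigma = statistics.pstdev(xs)
    return (mean - mode) / sigma

# Mode below the mean -> positive mode skewness:
print(pearson_mode_skewness([1, 1, 2, 3, 7]))
```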
Pearson's skewness coefficients are defined by

   3(\mathrm{mean} - \mathrm{mode}) / s    (6)

and

   3(\mathrm{mean} - \mathrm{median}) / s.    (7)
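As an illustration of the second coefficient (7), which replaces the mode with the median, here is a small Python sketch built on the standard statistics module (the function name is made up for this example, and s is the sample standard deviation):

```python
import statistics

def pearson_second_skewness(xs):
    """Pearson's second skewness coefficient, 3*(mean - median) / s."""
    mean = statistics.fmean(xs)
    median = statistics.median(xs)
    s = statistics.stdev(xs)
    return 3 * (mean - median) / s

# A right-skewed sample: the mean exceeds the median, so the
# coefficient is positive; a symmetric sample gives zero.
print(pearson_second_skewness([1, 2, 2, 3, 10]))
print(pearson_second_skewness([1, 2, 3]))  # -> 0.0
```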
The Bowley skewness (also known as the quartile skewness coefficient) is defined by

   (Q_1 - 2Q_2 + Q_3) / (Q_3 - Q_1),    (8)
where the Q_i denote the quartiles (Q_2 being the median). The momental skewness is

   \alpha^{(m)} = \tfrac{1}{2}\gamma_1.    (9)
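The Bowley skewness (8) is easy to compute from the quartiles. A sketch using Python's statistics.quantiles, which with n=4 returns Q_1, the median, and Q_3 (the helper name is illustrative):

```python
import statistics

def bowley_skewness(xs):
    """Bowley (quartile) skewness (Q1 - 2*Q2 + Q3) / (Q3 - Q1)."""
    q1, q2, q3 = statistics.quantiles(xs, n=4)  # Q1, median, Q3
    return (q1 - 2 * q2 + q3) / (q3 - q1)

# Symmetric data gives zero; a long right tail pushes Q3 far from the
# median and makes the measure positive.
print(bowley_skewness([1, 2, 3, 4, 5]))   # -> 0.0
print(bowley_skewness([1, 2, 2, 3, 10]))  # positive
```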