F-distribution


Fisher–Snedecor
Probability density function (plot: F pdf.svg)
Cumulative distribution function (plot: F dist cdf.svg)
Parameters: d1, d2 > 0 degrees of freedom
Support: x ∈ [0, +∞)
PDF: \frac{\sqrt{\frac{(d_1\,x)^{d_1}\,d_2^{d_2}}{(d_1\,x+d_2)^{d_1+d_2}}}}{x\,\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)}
CDF: I_{\frac{d_1 x}{d_1 x + d_2}}\left(\tfrac{d_1}{2}, \tfrac{d_2}{2}\right)
Mean: \frac{d_2}{d_2-2} for d2 > 2
Mode: \frac{d_1-2}{d_1}\,\frac{d_2}{d_2+2} for d1 > 2
Variance: \frac{2\,d_2^2\,(d_1+d_2-2)}{d_1 (d_2-2)^2 (d_2-4)} for d2 > 4
Skewness: \frac{(2 d_1 + d_2 - 2) \sqrt{8 (d_2-4)}}{(d_2-6) \sqrt{d_1 (d_1 + d_2 - 2)}} for d2 > 6
Ex. kurtosis: see text
MGF: does not exist; raw moments defined in text and in [1][2]
CF: see text
(Image: Biologist and statistician Ronald Fisher)

In probability theory and statistics, the F-distribution, also known as Snedecor's F distribution or the Fisher–Snedecor distribution (after Ronald Fisher and George W. Snedecor), is a continuous probability distribution.


The F-distribution arises frequently as the null distribution of a test statistic, most notably in the analysis of variance; see F-test.[1][2][3][4]

Definition

If a random variable X has an F-distribution with parameters d1 and d2, we write X ~ F(d1, d2). Then the probability density function (pdf) for X is given by

 
\begin{align}
f(x; d_1,d_2) &= \frac{\sqrt{\frac{(d_1\,x)^{d_1}\,\,d_2^{d_2}} {(d_1\,x+d_2)^{d_1+d_2}}}} {x\,\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)} \\
&=\frac{1}{\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)} \left(\frac{d_1}{d_2}\right)^{\frac{d_1}{2}} x^{\frac{d_1}{2} - 1} \left(1+\frac{d_1}{d_2}\,x\right)^{-\frac{d_1+d_2}{2}}
\end{align}

for real x ≥ 0. Here \mathrm{B} is the beta function. In many applications, the parameters d1 and d2 are positive integers, but the distribution is well-defined for positive real values of these parameters.

The cumulative distribution function is

F(x; d_1,d_2)=I_{\frac{d_1 x}{d_1 x + d_2}}\left (\tfrac{d_1}{2}, \tfrac{d_2}{2} \right) ,

where I is the regularized incomplete beta function.
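
As a quick numerical cross-check of these formulas, the following minimal Python sketch (assuming NumPy and SciPy are available; the values d1 = 5, d2 = 8 are arbitrary illustration choices) evaluates the pdf and cdf directly from the definitions and compares them with SciPy's built-in F-distribution.

import numpy as np
from scipy.special import betaln, betainc
from scipy.stats import f

d1, d2 = 5.0, 8.0                     # illustrative degrees of freedom
x = np.linspace(0.01, 5.0, 50)

# pdf from the second form above, evaluated in log space for numerical stability
log_pdf = ((d1 / 2) * np.log(d1 / d2) + (d1 / 2 - 1) * np.log(x)
           - ((d1 + d2) / 2) * np.log1p(d1 * x / d2) - betaln(d1 / 2, d2 / 2))
pdf = np.exp(log_pdf)

# cdf as the regularized incomplete beta function I_{d1 x / (d1 x + d2)}(d1/2, d2/2)
cdf = betainc(d1 / 2, d2 / 2, d1 * x / (d1 * x + d2))

assert np.allclose(pdf, f.pdf(x, d1, d2))
assert np.allclose(cdf, f.cdf(x, d1, d2))

Working in log space avoids overflow of the intermediate powers when the degrees of freedom are large.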

The expectation, variance, and other details of the F(d1, d2) distribution are given in the sidebox above; for d2 > 8, the excess kurtosis is

\gamma_2 = 12\frac{d_1(5d_2-22)(d_1+d_2-2)+(d_2-4)(d_2-2)^2}{d_1(d_2-6)(d_2-8)(d_1+d_2-2)}.
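
For example, with the arbitrary choice d1 = 5 and d2 = 10 (which satisfies d2 > 8), this expression can be compared against the excess kurtosis reported by SciPy; a minimal sketch assuming SciPy is available:

import math
from scipy.stats import f

d1, d2 = 5.0, 10.0    # the excess kurtosis requires d2 > 8

gamma2 = 12 * (d1 * (5 * d2 - 22) * (d1 + d2 - 2) + (d2 - 4) * (d2 - 2) ** 2) / (
    d1 * (d2 - 6) * (d2 - 8) * (d1 + d2 - 2))

# SciPy's 'k' moment is the excess (Fisher) kurtosis
_, _, _, kurt = f.stats(d1, d2, moments='mvsk')
assert math.isclose(gamma2, float(kurt), rel_tol=1e-6)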

The k-th moment of an F(d1, d2) distribution exists and is finite only when 2k < d2, and it is equal to[5]

\mu _{X}(k) =\left( \frac{d_{2}}{d_{1}}\right)^{k}\frac{\Gamma \left(\tfrac{d_1}{2}+k\right) }{\Gamma \left(\tfrac{d_1}{2}\right) }\frac{\Gamma \left(\tfrac{d_2}{2}-k\right) }{\Gamma \left( \tfrac{d_2}{2}\right) }
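
A short numerical illustration (a sketch assuming SciPy is available; d1 = 6, d2 = 20 and k = 3 are arbitrary choices with 2k < d2) compares this gamma-function expression with SciPy's raw moment:

from math import gamma
from scipy.stats import f

d1, d2, k = 6.0, 20.0, 3             # the moment is finite because 2k < d2

moment_formula = ((d2 / d1) ** k
                  * gamma(d1 / 2 + k) / gamma(d1 / 2)
                  * gamma(d2 / 2 - k) / gamma(d2 / 2))

moment_scipy = f.moment(k, d1, d2)   # k-th raw (non-central) moment
print(moment_formula, moment_scipy)  # both print the same value (approximately 4.41)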

The F-distribution is a particular parametrization of the beta prime distribution, which is also called the beta distribution of the second kind.

The characteristic function is listed incorrectly in many standard references (e.g., [2]). The correct expression [6] is

\varphi^F_{d_1, d_2}(s) = \frac{\Gamma(\frac{d_1+d_2}{2})}{\Gamma(\tfrac{d_2}{2})} U \! \left(\frac{d_1}{2},1-\frac{d_2}{2},-\frac{d_2}{d_1} \imath s \right)

where U(a, b, z) is the confluent hypergeometric function of the second kind.
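
This expression can be evaluated numerically. The sketch below (assuming the mpmath library is available; d1 = 5, d2 = 7 and s = 0.7 are arbitrary illustration values) evaluates it and compares the result with a direct numerical computation of E[exp(isX)] from the density:

import mpmath as mp

d1, d2, s = 5, 7, 0.7

# characteristic function via the confluent hypergeometric function of the second kind U(a, b, z)
phi = (mp.gamma((d1 + d2) / 2) / mp.gamma(d2 / 2)
       * mp.hyperu(d1 / 2, 1 - d2 / 2, -1j * s * d2 / d1))

# direct check: E[exp(i s X)] = integral of exp(i s x) f(x; d1, d2) over [0, infinity)
def pdf(x):
    return (x ** (d1 / 2 - 1) * (1 + d1 * x / d2) ** (-(d1 + d2) / 2)
            * (d1 / d2) ** (d1 / 2) / mp.beta(d1 / 2, d2 / 2))

phi_numeric = mp.quad(lambda x: mp.exp(1j * s * x) * pdf(x), [0, mp.inf])
print(phi)          # both print approximately the same complex number
print(phi_numeric)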

Characterization

A random variate of the F-distribution with parameters d1 and d2 arises as the ratio of two appropriately scaled chi-squared variates:[7]

X = \frac{U_1/d_1}{U_2/d_2}

where

  • U1 and U2 have chi-squared distributions with d1 and d2 degrees of freedom respectively, and
  • U1 and U2 are independent.
In instances where the F-distribution is used, for example in the analysis of variance, independence of U1 and U2 might be demonstrated by applying Cochran's theorem.
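
This characterization lends itself to a quick simulation check. The sketch below (assuming NumPy and SciPy are available; d1 = 4, d2 = 9 and the seed are arbitrary) draws independent chi-squared variates, forms the scaled ratio, and compares the resulting sample with the F(d1, d2) distribution via a Kolmogorov–Smirnov test.

import numpy as np
from scipy.stats import chi2, f, kstest

rng = np.random.default_rng(0)
d1, d2, n = 4, 9, 100_000

u1 = chi2.rvs(d1, size=n, random_state=rng)   # U1: chi-squared with d1 degrees of freedom
u2 = chi2.rvs(d2, size=n, random_state=rng)   # U2: chi-squared with d2 degrees of freedom, independent of U1
x = (u1 / d1) / (u2 / d2)                     # X = (U1/d1) / (U2/d2)

# the KS statistic against the F(d1, d2) cdf should be small
print(kstest(x, lambda t: f.cdf(t, d1, d2)))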

Equivalently, the random variable of the F-distribution may also be written

X = \frac{s_1^2}{\sigma_1^2} \;/\; \frac{s_2^2}{\sigma_2^2}

where s₁² and s₂² are the sums of squares S₁² and S₂² from two normal processes with variances σ₁² and σ₂², divided by the corresponding numbers of χ² degrees of freedom, d1 and d2 respectively.

In a frequentist context, a scaled F-distribution therefore gives the probability p(s₁²/s₂² | σ₁², σ₂²), with the F-distribution itself, without any scaling, applying where σ₁² is taken to be equal to σ₂². This is the context in which the F-distribution most generally appears in F-tests: where the null hypothesis is that two independent normal variances are equal, and the observed sums of some appropriately selected squares are then examined to see whether their ratio is significantly incompatible with this null hypothesis.
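
As a small worked example of this use (a sketch assuming NumPy and SciPy are available; the sample sizes, means, and seed are arbitrary), the ratio of two sample variances from normal data with equal true variance is referred to an F-distribution with (n1 − 1, n2 − 1) degrees of freedom:

import numpy as np
from scipy.stats import f

rng = np.random.default_rng(42)
x = rng.normal(loc=0.0, scale=2.0, size=25)   # first normal sample
y = rng.normal(loc=5.0, scale=2.0, size=31)   # second normal sample with the same true variance

ratio = x.var(ddof=1) / y.var(ddof=1)         # s1^2 / s2^2
d1, d2 = len(x) - 1, len(y) - 1               # degrees of freedom of the two sums of squares

# two-sided p-value under the null hypothesis sigma1^2 = sigma2^2
p_value = 2 * min(f.cdf(ratio, d1, d2), f.sf(ratio, d1, d2))
print(ratio, p_value)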

The quantity X has the same distribution in Bayesian statistics, if an uninformative rescaling-invariant Jeffreys prior is taken for the prior probabilities of σ₁² and σ₂².[8] In this context, a scaled F-distribution thus gives the posterior probability p(σ₂²/σ₁² | s₁², s₂²), where now the observed sums s₁² and s₂² are what are taken as known.

Differential equation


The probability density function of the F-distribution is a solution of the following differential equation:

\left\{\begin{array}{l}
2 x \left(d_1 x+d_2\right) f'(x)+\left(2 d_1 x+d_1 d_2 x-d_1 d_2+2 d_2\right) f(x)=0, \\[12pt]
f(1)=\frac{d_1^{d_1/2}\,d_2^{d_2/2}\,\left(d_1+d_2\right)^{-(d_1+d_2)/2}}{\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)}
\end{array}\right\}
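
One way to verify that the density satisfies this equation is a symbolic check. The sketch below (assuming SymPy is available; the concrete values d1 = 5, d2 = 7 are arbitrary) substitutes the pdf into the left-hand side and simplifies it to zero.

import sympy as sp

x = sp.symbols('x', positive=True)
d1, d2 = 5, 7   # arbitrary concrete degrees of freedom

# pdf of F(d1, d2) in the second form given in the Definition section
fx = (1 / sp.beta(sp.Rational(d1, 2), sp.Rational(d2, 2))
      * sp.Rational(d1, d2) ** sp.Rational(d1, 2)
      * x ** (sp.Rational(d1, 2) - 1)
      * (1 + sp.Rational(d1, d2) * x) ** (-sp.Rational(d1 + d2, 2)))

lhs = (2 * x * (d1 * x + d2) * sp.diff(fx, x)
       + (2 * d1 * x + d1 * d2 * x - d1 * d2 + 2 * d2) * fx)
print(sp.simplify(lhs))   # prints 0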

Related distributions and properties

  • If X has a Student's t-distribution with n degrees of freedom, then X^{2} \sim \operatorname{F}(1, n) and X^{-2} \sim \operatorname{F}(n, 1).
  • If X and Y are independent random variables, each with a Laplace(μ, b) distribution, then \tfrac{|X-\mu|}{|Y-\mu|} \sim \operatorname{F}(2,2).
  • If \operatorname{Q}_X(p) is the quantile p for X ~ F(d1, d2) and \operatorname{Q}_Y(1-p) is the quantile 1−p for Y ~ F(d2, d1), then \operatorname{Q}_X(p)=\frac{1}{\operatorname{Q}_Y(1-p)} (see the numerical check below).
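
A numerical check of this quantile identity (a sketch assuming SciPy is available; d1 = 3, d2 = 12 and p = 0.9 are arbitrary) reads:

from scipy.stats import f

d1, d2, p = 3, 12, 0.9

q_x = f.ppf(p, d1, d2)        # quantile p of F(d1, d2)
q_y = f.ppf(1 - p, d2, d1)    # quantile 1 - p of F(d2, d1)
print(q_x, 1 / q_y)           # the two printed values coincide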


References

  1. Lua error in package.lua at line 80: module 'strict' not found.
  2. Lua error in package.lua at line 80: module 'strict' not found.
  3. NIST (2006). Engineering Statistics Handbook – F Distribution.
  4. Lua error in package.lua at line 80: module 'strict' not found.
  5. Lua error in package.lua at line 80: module 'strict' not found.
  6. Phillips, P. C. B. (1982). "The true characteristic function of the F distribution." Biometrika, 69: 261–264. JSTOR 2335882.
  7. DeGroot, M. H. (1986). Probability and Statistics (2nd ed.). Addison-Wesley. p. 500. ISBN 0-201-11366-X.
  8. Box, G. E. P.; Tiao, G. C. (1973). Bayesian Inference in Statistical Analysis. Addison-Wesley. p. 110.
