Contributions I have made to Wikipedia

One-Dimensional Examples of Sufficient Statistics

Normal Distribution

If $X_1, \ldots, X_n$ are independent and normally distributed with expected value $\theta$ (a parameter) and known finite variance $\sigma^2$, then $T(X_1,\ldots,X_n) = \overline{x} = \frac{1}{n}\sum_{i=1}^n X_i$ is a sufficient statistic for $\theta$.

To see this, consider the joint probability density function of $X_1,\ldots,X_n$. Because the observations are independent, the pdf can be written as a product of individual densities, i.e.,

$$f_\theta(x_1,\ldots,x_n) = \prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x_i-\theta)^2}{2\sigma^2}\right) = (2\pi\sigma^2)^{-n/2} \exp\left(-\frac{\sum_{i=1}^n (x_i-\theta)^2}{2\sigma^2}\right).$$

Then, since $\sum_{i=1}^n (x_i-\theta)^2 = \sum_{i=1}^n (x_i-\overline{x})^2 + n(\overline{x}-\theta)^2$, which can be shown simply by expanding this term,

$$f_\theta(x_1,\ldots,x_n) = (2\pi\sigma^2)^{-n/2} \exp\left(-\frac{\sum_{i=1}^n (x_i-\overline{x})^2}{2\sigma^2}\right) \exp\left(-\frac{n(\overline{x}-\theta)^2}{2\sigma^2}\right).$$

The joint density of the sample takes the form required by the Fisher–Neyman factorization theorem, by letting

$$h(x_1,\ldots,x_n) = (2\pi\sigma^2)^{-n/2} \exp\left(-\frac{\sum_{i=1}^n (x_i-\overline{x})^2}{2\sigma^2}\right), \qquad g_\theta(x_1,\ldots,x_n) = \exp\left(-\frac{n(\overline{x}-\theta)^2}{2\sigma^2}\right).$$

Since $h(x_1,\ldots,x_n)$ does not depend on the parameter $\theta$, and $g_\theta(x_1,\ldots,x_n)$ depends on $x_1,\ldots,x_n$ only through the function $T(X_1,\ldots,X_n) = \overline{x} = \frac{1}{n}\sum_{i=1}^n X_i$, the Fisher–Neyman factorization theorem implies $T(X_1,\ldots,X_n) = \overline{x}$ is a sufficient statistic for $\theta$.
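As a numerical sanity check (a minimal sketch, not part of the original derivation), the sum-of-squares identity above can be verified directly, and sufficiency can be illustrated by noting that two different samples with the same mean produce identical likelihood ratios between any two values of $\theta$, because the factor $h$ cancels:

```python
import math

def normal_joint_pdf(xs, theta, sigma2):
    """Joint density of i.i.d. N(theta, sigma2) observations."""
    n = len(xs)
    ss = sum((x - theta) ** 2 for x in xs)
    return (2 * math.pi * sigma2) ** (-n / 2) * math.exp(-ss / (2 * sigma2))

def check_identity(xs, theta):
    """Check sum (x_i - theta)^2 == sum (x_i - xbar)^2 + n (xbar - theta)^2."""
    n = len(xs)
    xbar = sum(xs) / n
    lhs = sum((x - theta) ** 2 for x in xs)
    rhs = sum((x - xbar) ** 2 for x in xs) + n * (xbar - theta) ** 2
    return math.isclose(lhs, rhs)

# Two different samples with the same mean (2.0):
a = [1.0, 2.0, 3.0]
b = [0.0, 2.0, 4.0]
sigma2 = 1.5
assert check_identity(a, 0.7) and check_identity(b, 0.7)
# The likelihood ratio between two parameter values depends on the
# sample only through its mean, so it is the same for both samples:
ratio_a = normal_joint_pdf(a, 1.0, sigma2) / normal_joint_pdf(a, 2.5, sigma2)
ratio_b = normal_joint_pdf(b, 1.0, sigma2) / normal_joint_pdf(b, 2.5, sigma2)
assert math.isclose(ratio_a, ratio_b)
```

The within-sample sum of squares differs between the two samples (it enters only through $h$), yet every likelihood ratio agrees, which is exactly what sufficiency of the mean guarantees.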

Exponential Distribution

If $X_1, \ldots, X_n$ are independent and exponentially distributed with expected value $\theta$ (an unknown real-valued positive parameter), then $T(X_1,\ldots,X_n) = \sum_{i=1}^n X_i$ is a sufficient statistic for $\theta$.

To see this, consider the joint probability density function of $X_1,\ldots,X_n$. Because the observations are independent, the pdf can be written as a product of individual densities, i.e.,

$$f_\theta(x_1,\ldots,x_n) = \prod_{i=1}^n \frac{1}{\theta} e^{-x_i/\theta} = \frac{1}{\theta^n} \exp\left(-\frac{1}{\theta}\sum_{i=1}^n x_i\right).$$

The joint density of the sample takes the form required by the Fisher–Neyman factorization theorem, by letting

$$h(x_1,\ldots,x_n) = 1, \qquad g_\theta(x_1,\ldots,x_n) = \frac{1}{\theta^n} \exp\left(-\frac{1}{\theta}\sum_{i=1}^n x_i\right).$$

Since $h(x_1,\ldots,x_n)$ does not depend on the parameter $\theta$, and $g_\theta(x_1,\ldots,x_n)$ depends on $x_1,\ldots,x_n$ only through the function $T(X_1,\ldots,X_n) = \sum_{i=1}^n X_i$, the Fisher–Neyman factorization theorem implies $T(X_1,\ldots,X_n) = \sum_{i=1}^n X_i$ is a sufficient statistic for $\theta$.
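A short numerical illustration (a sketch added here, not part of the original argument): since the joint density depends on the data only through $\sum_i x_i$, two different samples of the same size with the same sum have identical likelihoods at every value of $\theta$:

```python
import math

def exp_joint_pdf(xs, theta):
    """Joint density of i.i.d. Exponential observations with mean theta."""
    n = len(xs)
    return theta ** (-n) * math.exp(-sum(xs) / theta)

# Two different samples of size 3 with the same sum (6.0):
a = [1.0, 2.0, 3.0]
b = [0.5, 1.5, 4.0]
# Equal sums imply equal joint densities for every theta:
for theta in [0.5, 1.0, 2.0, 10.0]:
    assert math.isclose(exp_joint_pdf(a, theta), exp_joint_pdf(b, theta))
```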


Two-Dimensional Examples of Sufficient Statistics

Uniform Distribution (with two parameters)

If $X_1, \ldots, X_n$ are independent and uniformly distributed on the interval $[\alpha, \beta]$ (where $\alpha$ and $\beta$ are unknown parameters), then $T(X_1,\ldots,X_n) = \left(\min_{1\le i\le n} X_i,\ \max_{1\le i\le n} X_i\right)$ is a two-dimensional sufficient statistic for $(\alpha, \beta)$.

To see this, consider the joint probability density function of $X_1,\ldots,X_n$. Because the observations are independent, the pdf can be written as a product of individual densities, i.e.,

$$f_{\alpha,\beta}(x_1,\ldots,x_n) = \prod_{i=1}^n \frac{1}{\beta-\alpha} \mathbf{1}_{\{\alpha \le x_i \le \beta\}} = \frac{1}{(\beta-\alpha)^n} \mathbf{1}_{\{\alpha \le \min_i x_i\}} \mathbf{1}_{\{\max_i x_i \le \beta\}}.$$

The joint density of the sample takes the form required by the Fisher–Neyman factorization theorem, by letting

$$h(x_1,\ldots,x_n) = 1, \qquad g_{\alpha,\beta}(x_1,\ldots,x_n) = \frac{1}{(\beta-\alpha)^n} \mathbf{1}_{\{\alpha \le \min_i x_i\}} \mathbf{1}_{\{\max_i x_i \le \beta\}}.$$

Since $h(x_1,\ldots,x_n)$ does not depend on the parameters $(\alpha,\beta)$, and $g_{\alpha,\beta}(x_1,\ldots,x_n)$ depends on $x_1,\ldots,x_n$ only through the function $T(X_1,\ldots,X_n) = \left(\min_i X_i,\ \max_i X_i\right)$, the Fisher–Neyman factorization theorem implies $T$ is a sufficient statistic for $(\alpha,\beta)$.
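This can also be seen numerically (a minimal sketch, added for illustration): the joint density depends on the data only through the sample minimum and maximum, so any two samples of the same size sharing those two values have identical likelihoods at every parameter pair $(\alpha,\beta)$:

```python
def uniform_joint_pdf(xs, a, b):
    """Joint density of i.i.d. Uniform(a, b) observations."""
    n = len(xs)
    # Nonzero only when the whole sample lies inside [a, b]:
    if a <= min(xs) and max(xs) <= b:
        return (b - a) ** (-n)
    return 0.0

# Two different samples sharing the same minimum (0.2) and maximum (0.9):
s1 = [0.2, 0.5, 0.9]
s2 = [0.2, 0.7, 0.9]
# Equal (min, max) implies equal joint densities for every (alpha, beta),
# including parameter pairs that exclude the sample entirely:
for a, b in [(0.0, 1.0), (0.1, 0.95), (0.3, 1.0), (-1.0, 2.0)]:
    assert uniform_joint_pdf(s1, a, b) == uniform_joint_pdf(s2, a, b)
```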

Gamma Distribution

If $X_1, \ldots, X_n$ are independent and distributed as $\Gamma(\alpha, \beta)$, where $\alpha$ and $\beta$ are unknown parameters of a Gamma distribution, then $T(X_1,\ldots,X_n) = \left(\prod_{i=1}^n X_i,\ \sum_{i=1}^n X_i\right)$ is a two-dimensional sufficient statistic for $(\alpha, \beta)$.

To see this, consider the joint probability density function of $X_1,\ldots,X_n$. Because the observations are independent, the pdf can be written as a product of individual densities, i.e.,

$$f_{\alpha,\beta}(x_1,\ldots,x_n) = \prod_{i=1}^n \frac{1}{\Gamma(\alpha)\beta^\alpha} x_i^{\alpha-1} e^{-x_i/\beta} = \left(\frac{1}{\Gamma(\alpha)\beta^\alpha}\right)^n \left(\prod_{i=1}^n x_i\right)^{\alpha-1} \exp\left(-\frac{1}{\beta}\sum_{i=1}^n x_i\right).$$

The joint density of the sample takes the form required by the Fisher–Neyman factorization theorem, by letting

$$h(x_1,\ldots,x_n) = 1, \qquad g_{\alpha,\beta}(x_1,\ldots,x_n) = \left(\frac{1}{\Gamma(\alpha)\beta^\alpha}\right)^n \left(\prod_{i=1}^n x_i\right)^{\alpha-1} \exp\left(-\frac{1}{\beta}\sum_{i=1}^n x_i\right).$$

Since $h(x_1,\ldots,x_n)$ does not depend on the parameters $(\alpha,\beta)$, and $g_{\alpha,\beta}(x_1,\ldots,x_n)$ depends on $x_1,\ldots,x_n$ only through the function $T(X_1,\ldots,X_n) = \left(\prod_{i=1}^n X_i,\ \sum_{i=1}^n X_i\right)$, the Fisher–Neyman factorization theorem implies $T$ is a sufficient statistic for $(\alpha,\beta)$.
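A numerical check (a sketch added for illustration, using the shape–scale parametrization of the density above): two different samples with the same product and sum must have identical likelihoods at every $(\alpha,\beta)$. A second sample matching the product and sum of $[1, 2, 3]$ can be built by fixing one value at $1.5$ and solving a quadratic so the remaining pair has sum $4.5$ and product $4$:

```python
import math

def gamma_joint_pdf(xs, alpha, beta):
    """Joint density of i.i.d. Gamma(shape=alpha, scale=beta) observations."""
    n = len(xs)
    prod = math.prod(xs)
    return ((math.gamma(alpha) * beta ** alpha) ** (-n)
            * prod ** (alpha - 1) * math.exp(-sum(xs) / beta))

s1 = [1.0, 2.0, 3.0]           # sum 6, product 6
# Roots of t^2 - 4.5 t + 4 = 0 give a pair with sum 4.5 and product 4,
# so s2 also has sum 6 and product 1.5 * 4 = 6:
d = math.sqrt(4.5 ** 2 - 4 * 4)
s2 = [1.5, (4.5 + d) / 2, (4.5 - d) / 2]
# Equal (product, sum) implies equal likelihood at every (alpha, beta):
for alpha, beta in [(1.0, 1.0), (2.0, 0.5), (3.5, 2.0)]:
    assert math.isclose(gamma_joint_pdf(s1, alpha, beta),
                        gamma_joint_pdf(s2, alpha, beta))
```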