In statistics, the scoring algorithm, also known as Fisher's scoring,[1] is a form of Newton's method used to solve maximum likelihood equations numerically. It is named after Ronald Fisher.

Sketch of derivation

Let $Y_1, Y_2, \ldots, Y_n$ be random variables, independent and identically distributed with twice differentiable p.d.f. $f(y; \theta)$, and we wish to calculate the maximum likelihood estimator (M.L.E.) $\theta^*$ of $\theta$. First, suppose we have a starting point for our algorithm $\theta_0$, and consider a Taylor expansion of the score function, $V(\theta)$, about $\theta_0$:

$$V(\theta) \approx V(\theta_0) - J(\theta_0)\,(\theta - \theta_0),$$

where

$$J(\theta_0) = -\sum_{i=1}^{n} \left. \nabla \nabla^{\top} \log f(Y_i ; \theta) \right|_{\theta = \theta_0}$$

is the observed information matrix at $\theta_0$. Now, setting $\theta = \theta^*$, using that $V(\theta^*) = 0$ and rearranging gives us:

$$\theta^* \approx \theta_0 + J^{-1}(\theta_0)\, V(\theta_0).$$

We therefore use the algorithm

$$\theta_{m+1} = \theta_m + J^{-1}(\theta_m)\, V(\theta_m),$$

and under certain regularity conditions, it can be shown that $\theta_m \to \theta^*$.
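As a minimal numerical sketch of this iteration, consider estimating the location $\theta$ of an i.i.d. Cauchy($\theta$, 1) sample, an illustrative choice (not drawn from the references below) for which the score and observed information have simple closed forms; the sample size, seed, true location 3.0, and the median starting value are likewise assumptions made only for the example:

```python
import numpy as np

# Illustrative sketch (assumed Cauchy(theta, 1) location model):
# Newton-type iteration theta_{m+1} = theta_m + J(theta_m)^{-1} V(theta_m).

def score(theta, y):
    """V(theta): first derivative of the Cauchy(theta, 1) log-likelihood."""
    u = y - theta
    return np.sum(2.0 * u / (1.0 + u**2))

def observed_info(theta, y):
    """J(theta): negative second derivative of the log-likelihood."""
    u = y - theta
    return np.sum(2.0 * (1.0 - u**2) / (1.0 + u**2) ** 2)

rng = np.random.default_rng(0)
y = rng.standard_cauchy(200) + 3.0     # simulated sample with true location 3.0

theta = np.median(y)                   # consistent starting point theta_0
for _ in range(25):
    step = score(theta, y) / observed_info(theta, y)
    theta += step
    if abs(step) < 1e-10:              # stop once the update is negligible
        break

print(theta)                           # close to the M.L.E. (and to 3.0 for large n)
```

Starting from a consistent estimate such as the sample median keeps the observed information positive near the solution, so the iteration behaves like standard Newton's method.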

Fisher scoring

In practice, $J(\theta)$ is usually replaced by $\mathcal{I}(\theta) = \mathrm{E}\!\left[J(\theta)\right]$, the Fisher information, thus giving us the Fisher scoring algorithm:

$$\theta_{m+1} = \theta_m + \mathcal{I}^{-1}(\theta_m)\, V(\theta_m).$$

Under some regularity conditions, if $\theta_m$ is a consistent estimator, then $\theta_{m+1}$ (the correction after a single step) is 'optimal' in the sense that its error distribution is asymptotically identical to that of the true maximum likelihood estimate.[2]
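Continuing the illustrative Cauchy($\theta$, 1) example above (a sketch under the same assumed data, not a prescribed implementation), the expected information per observation is $1/2$, so $\mathcal{I}(\theta) = n/2$ for every $\theta$, and Fisher scoring becomes:

```python
import numpy as np

# Illustrative sketch (same assumed Cauchy(theta, 1) model as above):
# Fisher scoring replaces the observed information J(theta_m) by the
# expected (Fisher) information, which here is I(theta) = n / 2 for all theta.

def score(theta, y):
    u = y - theta
    return np.sum(2.0 * u / (1.0 + u**2))

rng = np.random.default_rng(0)
y = rng.standard_cauchy(200) + 3.0     # simulated sample with true location 3.0

theta = np.median(y)                   # consistent starting value theta_0
fisher_info = len(y) / 2.0             # I(theta) = n/2, constant in theta
for _ in range(100):
    step = score(theta, y) / fisher_info
    theta += step
    if abs(step) < 1e-10:
        break

print(theta)
```

Because the sample median is a consistent starting value, even the first Fisher-scoring step illustrates the one-step efficiency property cited above; subsequent iterations merely refine the numerical solution.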

References

  1. ^ Longford, Nicholas T. (1987). "A fast scoring algorithm for maximum likelihood estimation in unbalanced mixed models with nested random effects". Biometrika. 74 (4): 817–827. doi:10.1093/biomet/74.4.817.
  2. ^ Li, Bing; Babu, G. Jogesh (2019). "Bayesian Inference". Springer Texts in Statistics. New York, NY: Springer New York. Theorem 9.4. doi:10.1007/978-1-4939-9761-9_6. ISBN 978-1-4939-9759-6. S2CID 239322258. Retrieved 2023-01-03.
