Observed information

In statistics, the observed information, or observed Fisher information, is the negative of the second derivative (the Hessian matrix) of the log-likelihood (the logarithm of the likelihood function). It is a sample-based version of the Fisher information.

Definition

Suppose we observe random variables $X_1, \ldots, X_n$, independent and identically distributed with density $f(X; \theta)$, where $\theta$ is a (possibly unknown) vector. Then the log-likelihood of the parameters $\theta$ given the data $X_1, \ldots, X_n$ is

$$\ell(\theta \mid X_1, \ldots, X_n) = \sum_{i=1}^n \log f(X_i \mid \theta)\,.$$

We define the observed information matrix at $\theta^{*}$ as

$$\mathcal{J}(\theta^{*}) = - \left. \frac{\partial^2}{\partial \theta \, \partial \theta^{\mathsf{T}}} \, \ell(\theta) \right|_{\theta = \theta^{*}},$$

the negative of the Hessian of the log-likelihood evaluated at $\theta^{*}$.
In many instances, the observed information is evaluated at the maximum-likelihood estimate.[1]
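As a concrete illustration, here is a minimal numerical sketch (not from the article; the Poisson model, sample size, and variable names are assumptions chosen for illustration). It computes the observed information for a one-parameter Poisson model at the MLE and checks the finite-difference value against the closed form:

```python
import numpy as np

# Minimal sketch (assumed setup): observed information for a Poisson(lam)
# sample, computed as the negative second derivative of the log-likelihood,
# evaluated at the maximum-likelihood estimate.

rng = np.random.default_rng(0)
x = rng.poisson(lam=4.0, size=200)         # simulated i.i.d. sample
n = x.size

def log_likelihood(lam):
    # Poisson log-likelihood, dropping the additive constant -sum(log(x_i!))
    return np.sum(x) * np.log(lam) - n * lam

lam_hat = x.mean()                         # MLE of the Poisson rate

# Observed information via a central finite-difference second derivative
h = 1e-4
J_numeric = -(log_likelihood(lam_hat + h)
              - 2.0 * log_likelihood(lam_hat)
              + log_likelihood(lam_hat - h)) / h**2

J_closed_form = np.sum(x) / lam_hat**2     # analytic -d2l/dlam2 at lam_hat

print(J_numeric, J_closed_form)            # the two should agree closely
```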

Alternative definition

Andrew Gelman, David Dunson and Donald Rubin[2] define observed information instead in terms of the parameters' posterior probability, $p(\theta \mid y)$:

$$I(\theta) = - \frac{d^2}{d\theta^2} \log p(\theta \mid y)\,.$$

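When the prior is flat, this posterior-based definition reduces to the likelihood-based one above. A short derivation via Bayes' rule, $\log p(\theta \mid y) = \log p(y \mid \theta) + \log p(\theta) - \log p(y)$, gives

$$- \frac{d^2}{d\theta^2} \log p(\theta \mid y) = \mathcal{J}(\theta) - \frac{d^2}{d\theta^2} \log p(\theta)\,,$$

and the prior's term vanishes when $p(\theta)$ is constant in $\theta$.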
Fisher information

The Fisher information $\mathcal{I}(\theta)$ is the expected value of the observed information given a single observation $X$ distributed according to the hypothetical model with parameter $\theta$:

$$\mathcal{I}(\theta) = \mathrm{E}\left( \mathcal{J}(\theta) \right)\,.$$
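As a quick worked example (a Poisson model, chosen here for illustration rather than taken from the cited sources): for a single observation $X \sim \operatorname{Poisson}(\lambda)$ the log-likelihood is $\ell(\lambda) = X \log \lambda - \lambda + \text{const}$, so

$$\mathcal{J}(\lambda) = -\frac{d^2 \ell}{d\lambda^2} = \frac{X}{\lambda^2}, \qquad \mathcal{I}(\lambda) = \mathrm{E}\!\left[\frac{X}{\lambda^2}\right] = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda}\,.$$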

Applications

In a notable article, Bradley Efron and David V. Hinkley[3] argued that the observed information should be used in preference to the expected information (the Fisher information) when employing normal approximations for the distribution of maximum-likelihood estimates.
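The sketch below illustrates the contrast in the Cauchy location model, a classic setting for this comparison (the sample size, seed, and crude grid-search MLE are illustrative assumptions, not Efron and Hinkley's actual experiments):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.standard_cauchy(size=n)            # Cauchy sample, location 0

def neg_log_lik(theta):
    # Negative Cauchy location log-likelihood, up to an additive constant
    return np.sum(np.log1p((x - theta) ** 2))

# Crude MLE by grid search, adequate for a one-parameter sketch
grid = np.linspace(-5.0, 5.0, 20001)
theta_hat = grid[np.argmin([neg_log_lik(t) for t in grid])]

# Observed information: curvature of the negative log-likelihood at the
# MLE, here by a central finite difference
h = 1e-3
J_obs = (neg_log_lik(theta_hat + h) - 2.0 * neg_log_lik(theta_hat)
         + neg_log_lik(theta_hat - h)) / h ** 2

I_exp = n / 2.0   # expected (Fisher) information for the Cauchy location model

# Normal-approximation standard errors from each information measure
print("SE from observed information:", 1.0 / np.sqrt(J_obs))
print("SE from expected information:", 1.0 / np.sqrt(I_exp))
```

The expected information is the same for every sample of size $n$, whereas the observed information varies with the realized data, which is why it can better reflect the accuracy of the estimate in the sample actually observed.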

References

  1. Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. OUP. ISBN 0-19-920613-9.
  2. Gelman, Andrew; Carlin, John; Stern, Hal; Dunson, David; Vehtari, Aki; Rubin, Donald (2014). Bayesian Data Analysis (3rd ed.). p. 84.
  3. Efron, B.; Hinkley, D.V. (1978). "Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher Information". Biometrika. 65 (3): 457–487. doi:10.1093/biomet/65.3.457. JSTOR 2335893. MR 0521817.