
Likelihood function

In statistics, the likelihood function (often simply called the likelihood) expresses how probable a set of observations is, given particular values of the statistical parameters. The likelihood of a set of parameter values, given a set of observations following some probability distribution, equals the joint probability distribution of that random sample evaluated at those parameter values. Treating the observations as fixed, the likelihood function is solely a function of the parameters that index the family of probability distributions.

Mapping from the parameter space to the real line, the likelihood function describes a hypersurface whose peak, if it exists, represents the combination of model parameter values that maximizes the probability of drawing the sample actually obtained. The procedure for obtaining these arguments of the maximum of the likelihood function is known as maximum likelihood estimation, which for computational convenience is usually carried out using the natural logarithm of the likelihood, known as the log-likelihood function. Additionally, the shape and curvature of the likelihood surface carry information about the stability of the estimates, which is why the likelihood function is often plotted as part of a statistical analysis.

The case for using likelihood was first made by R. A. Fisher, who believed it to be a self-contained framework for statistical modelling and inference:

"In 1922, I proposed the term 'likelihood,' in view of the fact that, with respect to θ, it is not a probability, and does not obey the laws of probability, while at the same time it bears to the problem of rational choice among the possible values of θ a relation similar to that which probability bears to the problem of predicting events in games of chance. . . . Whereas, however, in relation to psychological judgment, likelihood has some resemblance to probability, the two concepts are wholly distinct. . . ."

Later, Barnard and Birnbaum led a school of thought that advocated the likelihood principle, postulating that all relevant information for inference is contained in the likelihood function. But even in frequentist and Bayesian statistics, the likelihood function plays a fundamental role.
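In symbols, a minimal formulation of these definitions, assuming n independent and identically distributed observations x_1, ..., x_n with density or mass function f(.; θ):

\mathcal{L}(\theta \mid x) \;=\; \prod_{i=1}^{n} f(x_i;\theta),
\qquad
\ell(\theta) \;=\; \log \mathcal{L}(\theta \mid x) \;=\; \sum_{i=1}^{n} \log f(x_i;\theta),
\qquad
\hat{\theta} \;=\; \operatorname*{arg\,max}_{\theta}\, \ell(\theta).

Because the logarithm is strictly increasing, maximizing the log-likelihood ℓ(θ) yields the same maximizer as maximizing the likelihood itself, while turning the product into a numerically better-behaved sum.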
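The following sketch illustrates maximum likelihood estimation through the log-likelihood numerically, assuming a normal model with unknown mean and standard deviation; the synthetic sample and the name neg_log_likelihood are illustrative choices, not part of any standard API.

# A minimal sketch of maximum likelihood estimation via the log-likelihood,
# assuming an i.i.d. sample from a normal distribution with unknown mean
# and standard deviation.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=200)  # synthetic observations

def neg_log_likelihood(params, data):
    """Negative log-likelihood of the normal model; minimizing it is MLE."""
    mu, sigma = params
    if sigma <= 0:  # keep the optimizer inside the valid parameter space
        return np.inf
    return -np.sum(norm.logpdf(data, loc=mu, scale=sigma))

result = minimize(neg_log_likelihood, x0=[0.0, 1.0], args=(sample,),
                  method="Nelder-Mead")
mu_hat, sigma_hat = result.x
print(f"MLE: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
# For the normal model the closed-form MLEs are the sample mean and the
# (1/n) sample standard deviation, so the numerical optimum can be
# checked against sample.mean() and sample.std(ddof=0).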

[ "Maximum likelihood", "Estimation theory", "Minimum chi-square estimation", "Marginal likelihood", "Quasi-maximum likelihood", "Maximum spacing estimation", "Likelihood principle" ]