Maximum Likelihood Estimation
This is a general strategy for estimating the parameters of a statistical model of the process that generated a set of data: the parameter values are chosen to maximize the likelihood of the observed data. In regression models, for example, the parameters are the coefficients that multiply the observed values of the predictor variables to produce a fitted value of the outcome. The method can be thought of as choosing the parameters that are most compatible with, or best supported by, the observed data. Most regression models are estimated by maximum likelihood, including linear, logistic, Poisson, and negative binomial regression. Parametric survival models are also typically fitted by maximum likelihood, while the Cox proportional hazards model maximizes a “partial likelihood” that conditions, at each observed failure time, on the number of failures at that time and on the set of individuals still at risk.
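The core idea can be illustrated with a minimal sketch in Python: given a set of coin flips assumed to follow a Bernoulli model, we pick the success probability that maximizes the log-likelihood of the data. The data here are made up for illustration, and the numerical optimizer stands in for the closed-form solution (which, for this model, is simply the sample proportion).

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical data: 100 coin flips, 62 heads (1) and 38 tails (0).
data = np.array([1] * 62 + [0] * 38)

def neg_log_likelihood(p):
    # Bernoulli log-likelihood: log(p) for each head, log(1 - p) for each tail.
    # We return its negative so that minimizing it maximizes the likelihood.
    return -np.sum(data * np.log(p) + (1 - data) * np.log(1 - p))

# Maximize the likelihood by minimizing its negative over (0, 1).
result = minimize_scalar(neg_log_likelihood,
                         bounds=(1e-6, 1 - 1e-6), method="bounded")
p_hat = result.x

# For the Bernoulli model the MLE equals the sample proportion, so
# p_hat should be very close to data.mean() = 0.62.
print(round(p_hat, 4))
```

Regression software applies the same principle: the log-likelihood is a function of the coefficients, and an optimizer (often Newton–Raphson or a variant) searches for the coefficient values at which it is maximized.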