2. Maximum likelihood estimation#
2.1. Reading materials#
2.2. Definition#
In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. Therefore, we first have to make an assumption about the distribution. Taking the normal distribution as an example, we assume:

- The mean (average) has the highest probability density
- The distribution is relatively symmetric around the mean (no skewness)
2.3. Steps in MLE#
1. Write the likelihood function $L(\theta) = \prod_{i=1}^{n} f(x_i; \theta)$, where
   - $x_i$ is the observed value
   - $\theta$ is the parameter of the assumed distribution
   - $f$ is the probability (density) function
2. Take the logarithm of the likelihood function. The goal is to maximize the likelihood function, but the likelihood is a product of many probabilities, which makes derivatives hard to compute. Taking the logarithm does not change the position of the maximum or minimum, and it turns the product into a sum.
3. Take partial derivatives with respect to the distribution parameter $\theta$ and set them equal to zero.
4. Solve the resulting equation, then check which solution makes $\ln L$ attain its maximum.
   - If there is no solution (no stationary point), the maximum is attained at a boundary of the parameter range.
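The steps above can be sketched numerically. This is a minimal illustration, assuming an exponential distribution with density $f(x;\lambda)=\lambda e^{-\lambda x}$ (the distribution choice and all variable names here are illustrative, not from the original text); a grid search over $\lambda$ stands in for solving the derivative equation:

```python
import math
import random

def log_likelihood(lam, data):
    # Step 2: the logarithm turns the product of densities into a sum
    # ln f(x; lam) = ln(lam) - lam * x for the exponential distribution
    return sum(math.log(lam) - lam * x for x in data)

random.seed(0)
# Simulated observations; expovariate's argument is the true rate lambda = 2
data = [random.expovariate(2.0) for _ in range(500)]

# Steps 3-4: find the lambda that maximizes ln L
# (a grid search stands in for calculus here)
grid = [i / 1000 for i in range(1, 6001)]
lam_hat = max(grid, key=lambda lam: log_likelihood(lam, data))

# The closed-form MLE for the exponential rate is 1 / sample mean
closed_form = len(data) / sum(data)
print(lam_hat, closed_form)
```

The grid estimate agrees with the closed-form answer up to the grid resolution, which illustrates that maximizing $\ln L$ and solving the derivative equation give the same result.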
2.4. MLE for normal distribution#
2.4.1. Understand the parameter of normal distribution#
What is the probability of observing $x_i$ from a normal distribution $X \sim N(\mu, \sigma^2)$?

$$
f(x_i; \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x_i-\mu)^2}{2\sigma^2}\right)
$$
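This density can be computed directly from the formula. A small sketch (the helper name `normal_pdf` is ours, not from the original text):

```python
import math

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2) evaluated at x, straight from the formula
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# The density peaks at the mean and is symmetric around it,
# matching the two assumptions stated in the definition section
print(normal_pdf(0.0, 0.0, 1.0))   # peak value for the standard normal
print(normal_pdf(1.7, 0.0, 1.0), normal_pdf(-1.7, 0.0, 1.0))
```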
2.4.2. Get logarithm of likelihood function#
Sum the logarithm of the likelihood function over all observations:

$$
\ln L(\mu, \sigma) = \sum_{i=1}^{n} \ln f(x_i; \mu, \sigma) = -\frac{n}{2}\ln(2\pi) - n\ln\sigma - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2
$$
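As a sanity check, the expanded expression for $\ln L$ should equal the term-by-term sum of log-densities. A minimal sketch (function and variable names are ours):

```python
import math
import random

def normal_log_likelihood(data, mu, sigma):
    # Expanded form of ln L for the normal distribution
    n = len(data)
    return (-n / 2 * math.log(2 * math.pi) - n * math.log(sigma)
            - sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2))

random.seed(1)
data = [random.gauss(5.0, 2.0) for _ in range(100)]

# Direct computation: sum ln f(x_i; mu, sigma) term by term
mu, sigma = 5.0, 2.0
direct = sum(
    math.log(math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi)))
    for x in data
)
print(normal_log_likelihood(data, mu, sigma), direct)
```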
2.4.3. Estimate $\mu$ and $\sigma$#
Take the partial derivative of $\ln L$ with respect to $\mu$:

$$
\frac{\partial \ln L}{\partial \mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu)
$$

Set the partial derivative equal to 0 to get the solution:

$$
\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i
$$
Take the partial derivative of $\ln L$ with respect to $\sigma$:

$$
\frac{\partial \ln L}{\partial \sigma} = -\frac{n}{\sigma} + \frac{1}{\sigma^3}\sum_{i=1}^{n}(x_i-\mu)^2
$$

Set the partial derivative equal to 0 to get the solution:

$$
\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i-\hat{\mu})^2
$$
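The closed-form estimators derived above are just the sample mean and the (biased, divide-by-$n$) sample standard deviation. A minimal sketch checking them on simulated data (all names are ours):

```python
import math
import random

random.seed(42)
# Simulate from N(10, 3^2) so we know the true parameters
data = [random.gauss(10.0, 3.0) for _ in range(10000)]
n = len(data)

# Closed-form MLE solutions for the normal distribution
mu_hat = sum(data) / n
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in data) / n)  # divides by n, not n - 1

print(mu_hat, sigma_hat)
```

Note that the MLE of $\sigma^2$ divides by $n$, not $n-1$, so it is slightly biased for small samples; for large $n$ the difference is negligible.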