Maximum Likelihood Estimation
Presented By
Zihadur Rahman (20141201041)
Sanath Saha (20141201037)
Presented To
F.M. Rahat Hasan Robi
Lecturer, Department of Computer Science & Engineering
Maximum likelihood estimation
 In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of a probability distribution by maximizing a likelihood function, so that under the assumed statistical model the observed data are most probable.
 For a normal distribution, maximum likelihood estimation finds the values of μ and σ that produce the curve that best fits the data. The goal of maximum likelihood is to find the parameter values that give the distribution that maximizes the probability of observing the data.
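The normal-distribution case described above can be sketched in a few lines. This is a minimal illustration (not from the slides), assuming Python with NumPy and a hypothetical simulated sample; for the normal distribution the MLE has a closed form, so no numerical optimization is needed.

```python
import numpy as np

# Hypothetical sample drawn from a normal distribution with known
# "true" parameters, so we can check the estimates against them
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=1000)

# For a normal distribution the maximum likelihood estimates are
# the sample mean and the 1/n sample standard deviation
mu_hat = np.mean(data)
sigma_hat = np.std(data)  # ddof=0 (the default) gives the MLE

print(mu_hat, sigma_hat)  # close to 5.0 and 2.0
```

Note that the MLE of σ divides by n, not n − 1, so it is slightly biased but is still the value that maximizes the likelihood.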
The concept of likelihood
 If the probability of an event X dependent on model parameters p is written P(X | p), then we talk about the likelihood L(p | X)
 That is, the likelihood of the parameters given the data
 The aim of maximum likelihood estimation is to find the parameter value(s) that make the observed data most likely
Probability: knowing parameters -> prediction of outcome
Likelihood: observation of data -> estimation of parameters
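The probability-versus-likelihood distinction above can be made concrete: for likelihood we hold the data fixed and vary the parameter. A minimal sketch in Python/NumPy, using hypothetical coin-flip data (7 heads in 10 flips) and a grid search over p:

```python
import numpy as np

# Hypothetical observed data: 7 heads in 10 independent coin flips
heads, flips = 7, 10

# Hold the data fixed and vary the parameter p over a grid
p_grid = np.linspace(0.01, 0.99, 99)

# Binomial likelihood L(p | data), up to the constant C(10, 7),
# which does not depend on p and so does not affect the argmax
likelihood = p_grid**heads * (1 - p_grid) ** (flips - heads)

p_mle = p_grid[np.argmax(likelihood)]
print(p_mle)  # the grid point at 7/10, the intuitive estimate
```

The maximizer is p = 7/10, matching the intuition that the best estimate of a coin's bias is the observed fraction of heads.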
A simple example of Maximum likelihood estimation
(Three slides of worked figures; the images are not preserved in this transcript.)
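The figures from these slides are not recoverable, so here is a minimal stand-in example (an assumption, not the slides' original one): estimating the rate λ of an exponential distribution from a small hypothetical sample by grid search, and checking it against the closed-form MLE 1 / (sample mean).

```python
import numpy as np

# Hypothetical waiting-time data assumed to follow an
# exponential distribution with unknown rate lam
data = np.array([1.2, 0.4, 2.1, 0.7, 1.5])

# Likelihood of the whole sample: product of lam * exp(-lam * x),
# which simplifies to lam^n * exp(-lam * sum(x))
lams = np.linspace(0.1, 3.0, 291)
lik = lams ** len(data) * np.exp(-lams * data.sum())

lam_mle = lams[np.argmax(lik)]
# The closed-form MLE for the exponential is 1 / sample mean
print(lam_mle, 1 / data.mean())
```

The grid maximizer agrees with the analytic answer to within the grid spacing, which is the usual sanity check for a likelihood surface.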
Other Practical Considerations
 Removing the constant
 Log-likelihood
Removing the constant
Log-likelihood
 The main reason for this is computational rather than theoretical
 If you multiply many very small numbers together (say, all less than 0.0001), you will very quickly end up with a number too small to be represented by any calculator or computer as different from zero
 This situation often occurs when calculating likelihoods, because we are often multiplying the probabilities of many rare but independent events together to obtain the joint probability
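The underflow problem described above is easy to demonstrate. A minimal sketch in Python, using 500 hypothetical independent events each with probability 1e-4: the direct product underflows to zero, while the sum of logs stays perfectly representable.

```python
import math

# 500 hypothetical independent events, each with probability 1e-4
probs = [1e-4] * 500

# Direct product underflows: 1e-2000 is far below the smallest
# positive double, so the result is exactly 0.0
joint = 1.0
for p in probs:
    joint *= p
print(joint)  # 0.0

# Summing the logs instead keeps the result representable
log_joint = sum(math.log(p) for p in probs)
print(log_joint)  # 500 * log(1e-4), about -4605.17
```

Since log is monotonic, the parameters that maximize the log-likelihood are exactly the ones that maximize the likelihood, so nothing is lost by working on the log scale.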