[Probability] EM Algorithm

EM Algorithm

  • For maximum likelihood estimation when the data are incomplete (missing or latent variables)
  • Built around an auxiliary function (a lower bound on the likelihood)
    • No learning rate to tune
    • Monotonic convergence: each iteration improves the likelihood L (see the sketch after this list)
  • Comparison with other numerical methods
    • Gradient Descent: converges asymptotically but not monotonically, and requires a learning rate
    • Newton’s Method: fast local convergence, but highly unstable away from the optimum
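
To make these properties concrete, here is a minimal EM sketch for a two-component 1-D Gaussian mixture (an illustrative example; the data and initial values below are arbitrary assumptions, not the post's example). The per-iteration printout shows the log-likelihood improving monotonically, and no learning rate appears in any update:

import numpy as np

rng = np.random.default_rng(0)
# Synthetic incomplete-data problem: we observe x but never which component generated it.
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 200)])

# Arbitrary (non-degenerate) initial parameters: mixing weight, means, variances.
w, mu, var = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def normal_pdf(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

def log_likelihood(x, w, mu, var):
    return np.log(w * normal_pdf(x, mu[0], var[0]) + (1 - w) * normal_pdf(x, mu[1], var[1])).sum()

for it in range(20):
    # E-step: responsibility of component 0 for each observation.
    p0 = w * normal_pdf(x, mu[0], var[0])
    p1 = (1 - w) * normal_pdf(x, mu[1], var[1])
    r = p0 / (p0 + p1)

    # M-step: closed-form parameter updates -- no learning rate anywhere.
    w = r.mean()
    mu = np.array([(r * x).sum() / r.sum(), ((1 - r) * x).sum() / (1 - r).sum()])
    var = np.array([(r * (x - mu[0]) ** 2).sum() / r.sum(),
                    ((1 - r) * (x - mu[1]) ** 2).sum() / (1 - r).sum()])

    # EM guarantees this value never decreases from one iteration to the next.
    print(f"iter {it:2d}  log-likelihood = {log_likelihood(x, w, mu, var):.3f}")

Compare this with gradient ascent on the same likelihood, which would need a step size and can overshoot and temporarily decrease L.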

Problem

Insight

We can construct an auxiliary function, a lower bound on the log-likelihood that is tight at the current parameter estimate, whose maximization guarantees monotonic convergence (a sketch of the standard argument is given under Derivation below).

Derivation
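
A brief sketch of the standard argument (notation assumed here: observed data x, hidden variables z, parameters θ, current estimate θ_t). For any distribution q(z), Jensen's inequality gives a lower bound on the log-likelihood:

\log p(x \mid \theta)
  = \log \sum_{z} q(z)\,\frac{p(x, z \mid \theta)}{q(z)}
  \;\ge\; \sum_{z} q(z) \log \frac{p(x, z \mid \theta)}{q(z)}
  \;=\; \mathcal{F}(q, \theta)

Choosing q(z) = p(z \mid x, \theta_t) makes the bound tight at \theta_t (the E-step); maximizing \mathcal{F}(q, \theta) over \theta gives \theta_{t+1} (the M-step). Chaining the two:

\log p(x \mid \theta_{t+1}) \;\ge\; \mathcal{F}(q, \theta_{t+1}) \;\ge\; \mathcal{F}(q, \theta_t) \;=\; \log p(x \mid \theta_t)

which is exactly the monotonic improvement claimed in the Insight above.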

Algorithm

  1. Initialize the CPT entries to arbitrary non-zero values
  2. Repeat the E-step and M-step until convergence (a sketch follows this list)
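
As a sketch of what these two steps typically look like in code, consider a tiny network with one hidden binary variable Z and two observed binary children X1 and X2 (all variable names, CPT values, and data below are illustrative assumptions, not the post's example). The E-step computes P(Z | X1, X2) under the current CPTs; the M-step re-estimates the CPTs from the resulting expected counts:

import numpy as np

rng = np.random.default_rng(1)

# Illustrative synthetic data: two observed binary variables X1, X2; Z is never observed.
N = 1000
z_true = rng.random(N) < 0.3
x1 = (rng.random(N) < np.where(z_true, 0.9, 0.2)).astype(int)
x2 = (rng.random(N) < np.where(z_true, 0.8, 0.1)).astype(int)

# Step 1: initialize the CPTs to arbitrary non-zero values.
p_z = 0.5                      # P(Z=1)
p_x1 = np.array([0.4, 0.6])    # P(X1=1 | Z=0), P(X1=1 | Z=1)
p_x2 = np.array([0.3, 0.7])    # P(X2=1 | Z=0), P(X2=1 | Z=1)

# Step 2: repeat the E-step and M-step until convergence (fixed iteration count here for brevity).
for _ in range(50):
    # E-step: posterior P(Z=1 | x1, x2) for every observation under the current CPTs.
    lik1 = p_z * p_x1[1]**x1 * (1 - p_x1[1])**(1 - x1) * p_x2[1]**x2 * (1 - p_x2[1])**(1 - x2)
    lik0 = (1 - p_z) * p_x1[0]**x1 * (1 - p_x1[0])**(1 - x1) * p_x2[0]**x2 * (1 - p_x2[0])**(1 - x2)
    r = lik1 / (lik1 + lik0)   # expected count of Z=1 for each data point

    # M-step: re-estimate every CPT entry from expected counts (closed form, no learning rate).
    p_z = r.mean()
    p_x1 = np.array([((1 - r) * x1).sum() / (1 - r).sum(), (r * x1).sum() / r.sum()])
    p_x2 = np.array([((1 - r) * x2).sum() / (1 - r).sum(), (r * x2).sum() / r.sum()])

print("P(Z=1)    ≈", round(p_z, 3))
print("P(X1=1|Z) ≈", p_x1.round(3))
print("P(X2=1|Z) ≈", p_x2.round(3))

Because Z is latent, the two learned states of Z may come out label-swapped relative to the generating process; only the mixture they define is identifiable.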
