Notations

States are hidden; observations are noisy and only partial reflections of the states.
Computations
Inference
– How to compute the likelihood of an observation sequence
– How to compute the most likely (hidden) state sequence
Learning
– How to estimate parameters that maximize the likelihood of (multiple) observation sequences
Computing likelihood
Goal: compute the likelihood of the observation sequence, via the joint probability of each state at time t+1 and all observations up to time t+1
Fill in a table of alpha values [n×T], where α(i, t) = P(o_1, …, o_t, S_t = i):
– Base case: α(i, 1) = π_i b_i(o_1)
– Recursion: α(j, t+1) = b_j(o_{t+1}) Σ_i α(i, t) a_{ij}
– Likelihood: P(o_1, …, o_T) = Σ_i α(i, T)

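A minimal numpy sketch of this table fill (the names forward, pi, A, B, obs are assumptions, not notation fixed by these notes; observations are discrete symbol indices):

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: fill the [n x T] alpha table.

    pi  : (n,)   initial state distribution
    A   : (n, n) transition matrix, A[i, j] = P(S_{t+1} = j | S_t = i)
    B   : (n, m) emission matrix,   B[i, k] = P(O_t = k | S_t = i)
    obs : (T,)   observation symbol indices
    """
    n, T = len(pi), len(obs)
    alpha = np.zeros((n, T))
    alpha[:, 0] = pi * B[:, obs[0]]              # base case at t = 1
    for t in range(1, T):
        # recursion: sum over predecessor states, weight by the emission
        alpha[:, t] = B[:, obs[t]] * (alpha[:, t - 1] @ A)
    return alpha, alpha[:, -1].sum()             # alpha table and likelihood P(o_1..o_T)
```

Normalizing a column alpha[:, t] gives the distribution over states at time t given the observations so far, which is the per-time-step belief the goal above refers to.
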
Inference in HMMs
Goal: compute the most likely path of hidden states given the observations up to time T
Keyword: Viterbi path, Viterbi algorithm
Fill in a table l [n×T], where l(i, t) is the log probability of the most likely t-step “path” that ends in state i at time t.

Start with the base case at t = 1, find the relationship between time t and time t+1, and recurse to T.
Base case for maximum log probability:
l(i, 1) = log π_i + log b_i(o_1)

Recursion for maximum log probability:
l(j, t+1) = log b_j(o_{t+1}) + max_i [ l(i, t) + log a_{ij} ]

Computing the Viterbi path
Most likely state at time t, given state j at time t+1 and observations up to time t:
argmax_i [ l(i, t) + log a_{ij} ]
Backtracking through this recursion from time T gives us the Viterbi path.

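A minimal sketch of the table fill and backtrack in log space (same assumed names pi, A, B, obs as in the forward sketch above):

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Viterbi algorithm: most likely state sequence for obs."""
    n, T = len(pi), len(obs)
    logA = np.log(A)
    l = np.zeros((n, T))            # l[i, t]: log prob of best t-step path ending in state i
    back = np.zeros((n, T), dtype=int)
    l[:, 0] = np.log(pi) + np.log(B[:, obs[0]])     # base case at t = 1
    for t in range(1, T):
        scores = l[:, t - 1, None] + logA           # scores[i, j] = l(i, t-1) + log a_ij
        back[:, t] = scores.argmax(axis=0)          # best predecessor of each state j
        l[:, t] = scores.max(axis=0) + np.log(B[:, obs[t]])
    # backtrack from the best final state to recover the Viterbi path
    path = [int(l[:, -1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[path[-1], t]))
    return path[::-1]
```
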
Learning in HMMs
Goal: given a sequence of observations, estimate the parameters (initial state distribution, transition matrix, emission matrix) that maximize the likelihood.
Keyword: “forward-backward” algorithm (Baum-Welch)

Each iteration of this process takes O(n^2 T) time.
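
A minimal sketch of one Baum-Welch (EM) iteration for a single observation sequence, reusing the forward sketch above; backward, baum_welch_step, and the variable names are assumptions, and scaling against numerical underflow is omitted for clarity:

```python
import numpy as np

def backward(A, B, obs):
    """Backward pass: beta[i, t] = P(o_{t+1}..o_T | S_t = i)."""
    n, T = A.shape[0], len(obs)
    beta = np.zeros((n, T))
    beta[:, -1] = 1.0                                    # base case at t = T
    for t in range(T - 2, -1, -1):
        beta[:, t] = A @ (B[:, obs[t + 1]] * beta[:, t + 1])
    return beta

def baum_welch_step(pi, A, B, obs):
    """One EM iteration: E-step via forward-backward, then M-step updates."""
    alpha, likelihood = forward(pi, A, B, obs)
    beta = backward(A, B, obs)
    gamma = alpha * beta / likelihood                    # gamma[i, t] = P(S_t = i | o_1..o_T)
    # xi[i, j, t] = P(S_t = i, S_{t+1} = j | o_1..o_T)
    xi = (alpha[:, None, :-1] * A[:, :, None]
          * (B[:, obs[1:]] * beta[:, 1:])[None, :, :]) / likelihood
    new_pi = gamma[:, 0]                                 # expected initial state counts
    new_A = xi.sum(axis=2) / gamma[:, :-1].sum(axis=1, keepdims=True)
    new_B = np.zeros_like(B)
    for k in range(B.shape[1]):                          # expected emission counts per symbol
        new_B[:, k] = gamma[:, np.asarray(obs) == k].sum(axis=1)
    new_B /= gamma.sum(axis=1, keepdims=True)
    return new_pi, new_A, new_B
```
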
Gaussian Mixture Model (GMM)
ML estimation of a Gaussian Mixture Model is easy to understand by analogy with K-means clustering; the EM algorithm is just an extension of K-means to the incomplete-data case, replacing hard cluster assignments with soft (probabilistic) ones.

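To make the K-means analogy concrete, here is a minimal sketch of EM for a 1-D GMM (gmm_em and its parameter names are assumptions): the E-step computes soft assignments where K-means would pick a single nearest centroid, and the M-step refits each Gaussian with responsibility-weighted data.

```python
import numpy as np

def gmm_em(x, K, iters=50, seed=0):
    """EM for a 1-D Gaussian mixture: K-means with soft assignments."""
    rng = np.random.default_rng(seed)
    n = len(x)
    mu = rng.choice(x, K, replace=False)         # init means from data points
    var = np.full(K, x.var())
    w = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: responsibility r[i, k] = P(component k | x_i)  (soft assignment)
        r = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted ML updates (K-means would use hard assignments here)
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```
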
Belief Updating
Recursion relation for real-time updating of beliefs based on incoming evidence; very useful for agents that must monitor their environments in real time.
Keyword: Kalman filtering

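A minimal sketch of this recursion for a 1-D linear-Gaussian model (the standard Kalman predict/update step; kalman_step and its parameter names are assumptions):

```python
def kalman_step(mu, var, z, a=1.0, q=0.1, c=1.0, r=1.0):
    """One belief update: predict through the dynamics, then correct with evidence z.

    State model: x_{t+1} = a * x_t + noise(q);  observation: z_t = c * x_t + noise(r).
    The belief over the state is Gaussian with mean mu and variance var.
    """
    # predict: push the current belief through the dynamics
    mu_pred = a * mu
    var_pred = a * a * var + q
    # update: weigh the prediction against the new observation via the Kalman gain
    k = var_pred * c / (c * c * var_pred + r)
    mu_new = mu_pred + k * (z - c * mu_pred)
    var_new = (1 - k * c) * var_pred
    return mu_new, var_new
```

Calling kalman_step once per incoming observation keeps the belief (mu, var) current in real time, which is exactly the monitoring use case above.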