Computational Molecular Biology, aka Algorithms for Computational Biology

Hidden Markov Model Module Guide

This module is intended to teach essentially everything there is to know about the most basic type of hidden Markov model (HMM).  Specifically, you should be able to:

  1. Determine the strengths and weaknesses of an HMM as a model of a given situation.
  2. For a given situation, sketch the state diagram of an HMM and describe the characteristics of reasonable parameter values.
  3. Implement the Viterbi algorithm for HMM decoding (a reference sketch follows this list).
  4. Implement posterior decoding (a second sketch follows this list).
  5. Implement the Baum-Welch algorithm for unsupervised parameter estimation.
  6. Read, understand, and correct errors in correctness proofs for the above algorithms.
  7. Explain the limitations of HMMs and their associated algorithms, and point to methods for enhancing them.
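
To make objective 3 concrete, here is a minimal sketch of Viterbi decoding in log space, written in Python with NumPy. The function name viterbi and the argument names are my own; the parameters follow Rabiner's notation (initial distribution pi, transition matrix A, emission matrix B). Treat it as a sketch to check your own implementation against, not as the required interface.

    import numpy as np

    def viterbi(obs, pi, A, B):
        """Most probable state path for one observation sequence (log-space Viterbi).

        obs : list of observation symbol indices, length T
        pi  : initial state distribution, shape (N,)
        A   : transition probabilities, A[i, j] = P(next state j | current state i), shape (N, N)
        B   : emission probabilities, B[i, k] = P(symbol k | state i), shape (N, M)
        Returns (state_path, log_probability_of_best_path).
        Zero-probability parameters become -inf in log space.
        """
        T, N = len(obs), len(pi)
        with np.errstate(divide="ignore"):          # allow log(0) = -inf for impossible events
            log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
        delta = np.empty((T, N))                    # delta[t, i]: best log-prob of a path ending in state i at time t
        psi = np.zeros((T, N), dtype=int)           # psi[t, i]: best predecessor of state i at time t
        delta[0] = log_pi + log_B[:, obs[0]]
        for t in range(1, T):
            scores = delta[t - 1][:, None] + log_A  # scores[i, j]: arrive in state j from state i
            psi[t] = scores.argmax(axis=0)
            delta[t] = scores[psi[t], np.arange(N)] + log_B[:, obs[t]]
        path = np.empty(T, dtype=int)               # backtrack from the best final state
        path[-1] = delta[-1].argmax()
        for t in range(T - 2, -1, -1):
            path[t] = psi[t + 1, path[t + 1]]
        return path.tolist(), float(delta[-1].max())

For example, with the occasionally dishonest casino from Chapter 3 of Durbin et al. (a fair die and a loaded die whose sixes come up half the time); the uniform start distribution here is my assumption, since the example leaves it open:

    pi = np.array([0.5, 0.5])                      # assumed uniform start over (fair, loaded)
    A = np.array([[0.95, 0.05],                    # fair -> fair, fair -> loaded
                  [0.10, 0.90]])                   # loaded -> fair, loaded -> loaded
    B = np.array([[1/6] * 6,                       # fair die
                  [0.1] * 5 + [0.5]])              # loaded die: P(six) = 0.5
    rolls = [0, 2, 5, 5, 5, 1, 5, 5, 3, 5]         # die faces encoded as 0..5
    path, logp = viterbi(rolls, pi, A, B)          # path[t] == 1 where the loaded die is inferred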

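For objective 4, here is a matching sketch of posterior decoding built on the scaled forward-backward recursions (the scaling is the underflow trick Rabiner describes). Again, the function name and interface are mine; gamma[t, i] is the posterior probability of state i at position t given the whole observation sequence, and taking its argmax at each position is posterior decoding. The same gamma, together with its pairwise counterpart xi, is also what Baum-Welch re-estimation in objective 5 is built from.

    import numpy as np

    def posterior_decode(obs, pi, A, B):
        """Posterior state probabilities and per-position most probable states.

        Runs scaled forward and backward passes so probabilities do not underflow
        on long sequences.  Returns (gamma, states) where
        gamma[t, i] = P(state i at time t | whole observation sequence) and
        states[t] = argmax_i gamma[t, i].
        """
        T, N = len(obs), len(pi)
        alpha = np.empty((T, N))
        beta = np.empty((T, N))
        scale = np.empty(T)

        # Forward pass, normalizing each alpha[t] so it sums to one.
        alpha[0] = pi * B[:, obs[0]]
        scale[0] = alpha[0].sum()
        alpha[0] /= scale[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            scale[t] = alpha[t].sum()
            alpha[t] /= scale[t]

        # Backward pass, reusing the same scale factors.
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)   # rows already sum to ~1; this absorbs rounding error
        return gamma, gamma.argmax(axis=1)

With the casino parameters above, comparing gamma.argmax(axis=1) against the Viterbi path roll by roll is a quick way to see where the two decoders disagree.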
The source for the material in this module is largely the first few sections of “A tutorial on Hidden Markov Models…” by Rabiner. HMMs are also covered, using a different notation, in Chapter 3 of Biological Sequence Analysis by Durbin et al. (see the Notation guide). The relevant sections of both are worth reading. Durbin & Eddy is a good (and inexpensive) book to have on your shelf.

There are optional online lectures in mathematicalmonk's YouTube series on machine learning, chapter 14 (ML 14.1 through ML 14.12). He starts with Markov models and Markov chains before getting to HMMs. He tends to talk really slowly, so I listen to them at 2x playback speed.

Day 0
  In class:
  Before the next class:

Day 1
  In class:
  Before the next class:
  Before class on Day 4:

Day 2
  In class:
  Before the next class:

Day 3
  In class:
  Before the next class:

Day 4
  In class:
  Before class on Day 6:

Day 5
  In class:
  Before the next class:

Day 6
  In class:
  Before the next class: