I'm trying to implement the Forward-Algorithm for a Hidden Markov Model (HMM) and I'm facing the underflow issue when filling the alpha table. I normalized the alpha values using the method described in section 6 here but now the resulting sum of the final alpha values (probability of an observation sequence) is always equal to 1. How do I 'undo' the normalization to get the actual probability? My implementation is very similar to section 7.2 here.

There was a recent answer to this same question but I couldn't understand the last few steps and am hoping for a more detailed explanation. Thanks!

Update: I think I finally understood the recent answer but would appreciate confirmation that my understanding is correct. Here is what I did (c[k] are the coefficients):

    double sum = 0.0;
    for (int i = 0; i < 4; i++) {          // only 4 hidden states
        sum += alpha[l - 1][i];            // sum the last column of the (normalized) alpha table
    }

    double sumLogC = 0.0;
    for (int k = 0; k < l; k++) {          // l = length of the observation sequence
        sumLogC += Math.log(c[k]);         // accumulate the log of each scaling coefficient
    }

    double probability = Math.log(sum) - sumLogC; // log P(O | model)

    return probability;
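To sanity-check the sign convention, here is a minimal, self-contained sketch of the scaled forward pass (not the asker's actual implementation; model parameters `pi`, `A`, `B` and the class name are made up for illustration). It follows the convention in Stamp's tutorial, where each scaling coefficient `c[t]` is the reciprocal of the unscaled alpha sum at step `t`, so the scaled alphas at every step sum to 1 and `log P(O) = -sumLogC`:

```java
// Sketch of the forward algorithm with per-step scaling for a small HMM.
// Convention (as in Stamp's tutorial): c[t] = 1 / (sum_i unscaled alpha[t][i]),
// alphas are multiplied by c[t], so log P(O | model) = -sum_t log c[t].
public class ScaledForward {
    public static double logProb(double[] pi, double[][] A, double[][] B, int[] obs) {
        int N = pi.length;   // number of hidden states
        int T = obs.length;  // length of the observation sequence
        double[][] alpha = new double[T][N];
        double sumLogC = 0.0;

        // Initialization: alpha[0][i] = pi[i] * B[i][obs[0]], then scale.
        double norm = 0.0;
        for (int i = 0; i < N; i++) {
            alpha[0][i] = pi[i] * B[i][obs[0]];
            norm += alpha[0][i];
        }
        double c = 1.0 / norm;
        for (int i = 0; i < N; i++) alpha[0][i] *= c;
        sumLogC += Math.log(c);

        // Induction: alpha[t][j] = (sum_i alpha[t-1][i] * A[i][j]) * B[j][obs[t]],
        // scaled at every step so the row sums to 1.
        for (int t = 1; t < T; t++) {
            norm = 0.0;
            for (int j = 0; j < N; j++) {
                double s = 0.0;
                for (int i = 0; i < N; i++) s += alpha[t - 1][i] * A[i][j];
                alpha[t][j] = s * B[j][obs[t]];
                norm += alpha[t][j];
            }
            c = 1.0 / norm;
            for (int j = 0; j < N; j++) alpha[t][j] *= c;
            sumLogC += Math.log(c);
        }

        // The scaled alphas at the last step sum to 1, so Math.log(sum) is 0
        // and the log-probability reduces to -sumLogC.
        return -sumLogC;
    }
}
```

Under this convention the asker's `Math.log(sum) - sumLogC` and the comment's `-1 * sumLogC` agree, since `sum` of the final normalized column is 1 and `Math.log(1) == 0`. (If instead you store `c[k]` as the normalizer itself, i.e. the alpha sum rather than its reciprocal, the sign flips and `log P = +sumLogC`.)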
    According to this: http://www.cs.sjsu.edu/~stamp/RUA/HMM.pdf, the probability is just -1 * sumLogC. i.e. Take the sum of log (to the base 10) for each coefficient c(i) and finally multiply it by -1. I hope this helps. – Rohit V Apr 08 '15 at 02:01

0 Answers