I'm trying to implement the forward algorithm for a Hidden Markov Model (HMM), and I'm running into underflow when filling the alpha table. I normalized the alpha values using the method described in section 6 here, but now the sum of the final alpha values (the probability of the observation sequence) always comes out to 1. How do I 'undo' the normalization to recover the actual probability? My implementation closely follows section 7.2 here.
There was a recent answer to this same question, but I couldn't follow the last few steps and am hoping for a more detailed explanation. Thanks!
Update: I think I finally understood the recent answer, but I'd appreciate confirmation that my understanding is correct. Here is what I did (c[k] are the scaling coefficients and l is the length of the observation sequence):
double sum = 0.0;
for (int i = 0; i < 4; i++) {  // only 4 hidden states
    sum += alpha[l - 1][i];    // sum the last column of the (normalized) alpha table
}

double sumLogC = 0.0;
for (int k = 0; k < l; k++) {
    sumLogC += Math.log(c[k]); // accumulate the log of each scaling coefficient
}

// sum is 1 after normalization, so Math.log(sum) contributes 0 here
double probability = Math.log(sum) - sumLogC;
return probability;
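To show the whole picture, here is a minimal, self-contained sketch of a scaled forward pass for a toy HMM, using the convention that c[t] = 1 / (sum of the unscaled alphas at step t) and each alpha column is multiplied by c[t]. Under that convention log P(O | lambda) = -sum of log c[t], which is exactly what the snippet above computes (since the scaled alphas sum to 1, Math.log(sum) is 0). The class name `ScaledForward` and the toy model parameters are my own illustration, not from any particular source:

```java
public class ScaledForward {
    // Returns log P(O | lambda) for an HMM with initial distribution pi,
    // transition matrix A, emission matrix B, and observation sequence obs.
    public static double logProbability(double[] pi, double[][] A,
                                        double[][] B, int[] obs) {
        int N = pi.length;   // number of hidden states
        int T = obs.length;  // length of the observation sequence
        double[][] alpha = new double[T][N];
        double[] c = new double[T]; // scaling coefficients

        // Initialization (t = 0), then scale so the column sums to 1
        double sum = 0.0;
        for (int i = 0; i < N; i++) {
            alpha[0][i] = pi[i] * B[i][obs[0]];
            sum += alpha[0][i];
        }
        c[0] = 1.0 / sum;
        for (int i = 0; i < N; i++) alpha[0][i] *= c[0];

        // Induction with scaling at every step
        for (int t = 1; t < T; t++) {
            sum = 0.0;
            for (int j = 0; j < N; j++) {
                double a = 0.0;
                for (int i = 0; i < N; i++) a += alpha[t - 1][i] * A[i][j];
                alpha[t][j] = a * B[j][obs[t]];
                sum += alpha[t][j];
            }
            c[t] = 1.0 / sum;
            for (int j = 0; j < N; j++) alpha[t][j] *= c[t];
        }

        // Termination: log P(O | lambda) = -sum_t log c[t]
        double logProb = 0.0;
        for (int t = 0; t < T; t++) logProb -= Math.log(c[t]);
        return logProb;
    }
}
```

For a sequence short enough that the unscaled forward algorithm does not underflow, this should match the log of the unscaled result, which is a convenient sanity check for an implementation.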