Questions tagged [viterbi]

The Viterbi algorithm is a dynamic programming algorithm for finding the most likely sequence of hidden states—called the Viterbi path—that results in a sequence of observed events, especially in the context of Markov information sources and hidden Markov models. Use this tag for questions about this algorithm.

The algorithm has found universal application in decoding the convolutional codes used in both CDMA and GSM digital cellular, dial-up modems, satellite, deep-space communications, and 802.11 wireless LANs. It is now also commonly used in speech recognition, speech synthesis, diarization, keyword spotting, computational linguistics, and bioinformatics. For example, in speech-to-text (speech recognition), the acoustic signal is treated as the observed sequence of events, and a string of text is considered to be the "hidden cause" of the acoustic signal. The Viterbi algorithm finds the most likely string of text given the acoustic signal. (Source: Wikipedia)
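For orientation, here is a minimal sketch of the algorithm in Python. The toy weather/activity model is the standard textbook example; the `viterbi` function, its parameter names, and the probability values are illustrative assumptions, not taken from any question below. The sketch works in log space, which avoids the numerical underflow that several of the questions below run into.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state sequence for a sequence of observations.

    obs : sequence of observation indices, length T
    pi  : (N,) initial state probabilities
    A   : (N, N) transition probabilities, A[i, j] = P(state j | state i)
    B   : (N, M) emission probabilities,   B[i, k] = P(obs k   | state i)
    """
    N, T = len(pi), len(obs)
    # Work in log space to avoid underflow on long sequences.
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)

    delta = np.empty((T, N))       # best log-probability of any path ending in state j at time t
    psi = np.zeros((T, N), int)    # back-pointers (argmax of the previous state)

    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A      # (N, N): previous state -> current state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]

    # Backtrack from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Toy example: two hidden weather states, three observable activities.
pi = np.array([0.6, 0.4])                          # P(Rainy), P(Sunny)
A = np.array([[0.7, 0.3], [0.4, 0.6]])             # transition probabilities
B = np.array([[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]])   # emissions for: walk, shop, clean
print(viterbi([0, 1, 2], pi, A, B))                # [1, 0, 0] -> Sunny, Rainy, Rainy
```

The `delta` table holds the usual dynamic-programming recurrence (best score of any path ending in state j at time t), and the `psi` table of back-pointers is what lets the most likely path be recovered at the end.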

84 questions
1 vote, 0 answers

Understanding a Prolog function: how to intermittently print results

I'm endeavouring to understand the following Prolog code: most_probable_hmm_path(Words,Path) :- probable_paths(Words,[1-[start]],PPaths), keymax(PPaths,_P-Path1), …
user2634655
1 vote, 1 answer

Is this a good case for Viterbi's best-path algorithm?

I've been working on a program that will read in OCR output, find the page numbers and then give them back to me. Any time my function finds a number it begins a sequence; it then looks on the next page for a number that is 1 greater than the…
Ben Zifkin
1 vote, 0 answers

Where is my mistake in the following example about the Viterbi algorithm?

I am trying to learn the Hidden Markov Model and the Viterbi algorithm, so I was looking for an example to study. I came across a simple example from this link. Up to position 3 I understood everything. However, at position 3, when calculating A; -…
yns
1 vote, 0 answers

Hidden Markov Models - Generate HMM in MATLAB

I have a series of values and I'm trying to model an HMM for each state. There is currently 1 state for each of the models, and I have two models that I'm trying to simulate. For example: obs[0] = {1, 2, 0.9, 4.1, ..., 8.1, 9.0, .... N} obs[1] = {1,…
Phorce
1 vote, 0 answers

Viterbi Algorithm - Score

I am working on a project that uses the Leap Motion controller. I am using a Hidden Markov Model to model whether a hand is moving left or whether it is moving right. For this, I do the following: 1) Take the live data from the Leap Motion…
Phorce
1 vote, 1 answer

Problems in HMM toolbox

Recently I have been doing some HMM training using the HMM toolbox, but I have some problems that I couldn't resolve. I train my HMM as shown below; there are no problems here. [LL, prior1, transmatrix1, observematrix1] = dhmm_em(data, prior0,…
Ri Syoutaku
1 vote, 2 answers

HMM Confusion with gesture recognition

I have been reading about HMM theory. From what I understand, we need the initial probability, transition probability, and emission probability to continue with an HMM. The examples I saw about implementations of HMM define all these probabilities at the start. But…
1 vote, 1 answer

Understanding Viterbi Algorithm

I'm trying to implement some code from here, and I have trained the HMM with my coefficients but do not understand how the Viterbi decoder algorithm works, for example: viterbi_decode(MFCC, M, model, q); where MFCC = coefficients M = size of…
Phorce
1 vote, 1 answer

How to visualize Viterbi path in Latex or Graphviz

I'm looking for a way to visualize a Viterbi path in LaTeX or maybe Graphviz, much like in this example. It doesn't have to be dots, but it could also be actual values between the lines, much like a table with lines between cells. I tried searching…
fdorssers
1 vote, 1 answer

Where to find viterbi algorithm transition values for natural language processing?

I just watched a video where they used the Viterbi algorithm to determine whether certain words in a sentence are intended to be nouns/verbs/adjectives etc. They used transition and emission probabilities, for example the probability of the word 'Time' being…
user712850
0 votes, 0 answers

Is there something wrong with my Viterbi algorithm or is it an issue of underflow?

I am implementing the Viterbi algorithm, but it is performing poorly on POS tagging. I think there might be something inherently wrong with my implementation, but my friend thinks it's an issue of underflow. What do you guys think? def predict(self,…
0 votes, 1 answer

Hidden Markov Model for Topical Text Segmentation

I'm attempting to write a function that splits a long document into shorter segments by the topics discussed, as a step in a data processing pipeline prior to embedding the shorter segments for vector search. I'm…
user1583016
0 votes, 0 answers

GNURadio Viterbi with custom spec

I have a convolutional code with r=1/2, k=7, G1 = 1111001 and G2=1011011 (see CCSDS 131.0-B-4 basic convolutional coding, https://public.ccsds.org/Pubs/131x0b4.pdf, section 3.3.1) but I'm NOT using the symbol inversion. As far as I can tell, the GNU…
0 votes, 0 answers

Is there a Python equivalent to Matlab's vitdec?

I have a question on whether there is a Python implementation of the Viterbi algorithm that has the functionality of Matlab's vitdec implementation. In MATLAB the vitdec method can have a…
Steve
0 votes, 1 answer

What is the best data structure for an emission probability table?

For my project I have a dataset of words (e.g. dog, ran, cat), and each word is tagged with a part of speech (e.g. verb, noun, adjective). I need to create a data structure which stores, for each word, a count of how often it is tagged as each part of speech. I am…