Questions tagged [markov-models]

Markov chain

The simplest Markov model is the Markov chain. It models the state of a system with a random variable that changes through time. In this context, the Markov property states that the distribution of the next state depends only on the current state, not on the earlier history of the chain. An example use of a Markov chain is Markov chain Monte Carlo (MCMC), which uses the Markov property to show that a particular way of performing a random walk will sample from a target distribution.
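
For illustration, a minimal sketch of simulating a Markov chain in Python (the two weather states and the transition probabilities are invented for this example):

```python
import numpy as np

# Hypothetical two-state weather chain; states and numbers are made up.
states = ["sunny", "rainy"]
P = np.array([[0.9, 0.1],   # row: current state, column: next state
              [0.5, 0.5]])

rng = np.random.default_rng(seed=0)

def simulate(n_steps, start=0):
    # The next state is drawn using only the current state's row of P:
    # this is the Markov property in action.
    path, s = [start], start
    for _ in range(n_steps):
        s = rng.choice(len(states), p=P[s])
        path.append(s)
    return [states[i] for i in path]

print(simulate(10))
```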

Hidden Markov model

A hidden Markov model is a Markov chain for which the state is only partially observable. In other words, observations are related to the state of the system, but they are typically insufficient to precisely determine the state. Several well-known algorithms for hidden Markov models exist. For example, given a sequence of observations, the Viterbi algorithm will compute the most-likely corresponding sequence of states, the forward algorithm will compute the probability of the sequence of observations, and the Baum–Welch algorithm will estimate the starting probabilities, the transition function, and the observation function of a hidden Markov model. One common use is for speech recognition, where the observed data is the speech audio waveform and the hidden state is the spoken text. In this example, the Viterbi algorithm finds the most likely sequence of spoken words given the speech audio.
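
As a compact sketch of the Viterbi recursion described above (the two-state, two-symbol model at the bottom uses invented toy numbers, and the array layout is an assumption of this example):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state sequence for an observation sequence.
    pi: initial probs (K,), A: transitions (K, K), B: emissions (K, M)."""
    K, T = len(pi), len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])  # best log-prob ending in each state
    back = np.zeros((T, K), dtype=int)        # backpointers
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)    # scores[i, j]: best path i -> j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):             # follow backpointers in reverse
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy model (all numbers invented):
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
print(viterbi([0, 1, 1, 0], pi, A, B))
```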

Markov decision process

A Markov decision process is a Markov chain in which state transitions depend on both the current state and an action applied to the system. Typically, a Markov decision process is used to compute a policy of actions that maximizes the expected cumulative reward. It is closely related to reinforcement learning, and can be solved with value iteration and related methods.
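
A hedged sketch of value iteration on a small tabular MDP (the array layout `P[a, s, s2]` / `R[a, s]` and all numbers are assumptions of this example, not a fixed convention):

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """P[a, s, s2]: transition probs; R[a, s]: expected reward for taking
    action a in state s. Returns optimal state values and a greedy policy."""
    V = np.zeros(P.shape[1])
    while True:
        Q = R + gamma * (P @ V)        # Q[a, s] = R[a, s] + gamma * E[V(s')]
        V_new = Q.max(axis=0)          # best achievable value per state
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

# Toy 2-action, 2-state problem (numbers invented):
P = np.array([[[0.8, 0.2], [0.1, 0.9]],
              [[0.5, 0.5], [0.6, 0.4]]])
R = np.array([[1.0, 0.0],
              [0.5, 0.8]])
V, policy = value_iteration(P, R)
print(V, policy)
```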

Partially observable Markov decision process

A partially observable Markov decision process (POMDP) is a Markov decision process in which the state of the system is only partially observed. Solving POMDPs exactly is intractable in general (the finite-horizon problem is PSPACE-complete), but approximation techniques have made them useful for a variety of applications, such as controlling simple agents or robots.
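
Because the true state is hidden, a POMDP agent plans over beliefs (probability distributions over states), updated by Bayes' rule after each action and observation. A minimal sketch, with the array layouts assumed for illustration:

```python
import numpy as np

def belief_update(b, a, o, T, Z):
    """Update belief b after taking action a and seeing observation o.
    T[a, s, s2]: transition probs; Z[a, s2, o]: observation probs.
    (These array layouts are assumptions of this sketch.)"""
    b_pred = b @ T[a]             # predict: sum_s b(s) * T[a, s, s']
    b_new = Z[a][:, o] * b_pred   # correct: weight by observation likelihood
    return b_new / b_new.sum()    # renormalize to a distribution

# Toy numbers (invented): 2 states, 1 action, 2 observations.
T = np.array([[[0.9, 0.1], [0.2, 0.8]]])
Z = np.array([[[0.7, 0.3], [0.1, 0.9]]])
b = np.array([0.5, 0.5])
print(belief_update(b, a=0, o=1, T=T, Z=Z))
```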

Markov random field

A Markov random field, or Markov network, may be considered a generalization of a Markov chain to multiple dimensions. In a Markov chain, the state depends only on the previous state in time, whereas in a Markov random field, each variable depends on its neighbors in any of multiple directions. A Markov random field may be visualized as a field or graph of random variables, where the distribution of each random variable depends only on the neighboring variables to which it is connected. More specifically, the joint distribution over all the variables in the graph factorizes as a normalized product of "clique potentials", one for each clique in the graph (the Hammersley–Clifford theorem). Modeling a problem as a Markov random field is useful because it implies that the joint distribution decomposes into these local factors, which inference algorithms can exploit.
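
To make the factorization concrete, a brute-force sketch for a tiny chain-shaped MRF (the graph and potentials are invented; real models use inference algorithms rather than enumerating all assignments):

```python
import itertools
import numpy as np

# Toy MRF x0 - x1 - x2 over binary variables; the cliques are the two edges.
edges = [(0, 1), (1, 2)]
psi = np.array([[2.0, 1.0],    # clique potential: favours equal neighbours
                [1.0, 2.0]])

def unnormalized(x):
    # Product of clique potentials over all cliques (here: the edges).
    return np.prod([psi[x[i], x[j]] for i, j in edges])

assignments = list(itertools.product([0, 1], repeat=3))
Z = sum(unnormalized(x) for x in assignments)          # partition function
joint = {x: unnormalized(x) / Z for x in assignments}  # normalized joint
print(sum(joint.values()), joint[(0, 0, 0)], joint[(0, 1, 0)])
```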

Hierarchical Markov Models

Hierarchical Markov Models can be applied to categorize human behavior at various levels of abstraction. For example, a series of simple observations, such as a person's location in a room, can be interpreted to determine more complex information, such as the task or activity the person is performing. Two kinds of Hierarchical Markov Models are the Hierarchical hidden Markov model and the Abstract Hidden Markov Model. Both have been used for behavior recognition, and certain conditional independence properties between different levels of abstraction in the model allow for faster learning and inference.

100 questions
0 votes, 1 answer

Error: SemiMarkov model for illness-death model

I am trying to fit a multistate model using the 'SemiMarkov' package in R. Below is an extract of my code, the result, and the error I got: id state.h state.j time1 LOC sex 102 1 2 4.000000 0 0 102 2 …
mymymine
0 votes, 1 answer

Graph analysis using mcl and helper programs

I am trying to cluster data using the implementation of the Markov Clustering (mcl) algorithm at micans.org. I read in a description of the algorithm that it was possible to assign one element to several clusters. How can I do that? So far, I can…
bigTree
0 votes, 1 answer

Graph analysis using mcxquery

I am clustering and analysing graphs using mcl. I'm not familiar with graph theory and I read about the function mcxquery. The documentation says: "The main use of mcxquery is to analyze a graph at different similarity cutoffs. Typically this…
bigTree
0 votes, 0 answers

Markov Model - Random word/gibberish generator

My code works fine until the random word generation step. Sometimes it creates words/gibberish and sometimes it doesn't (probably getting stuck in an infinite loop). However, whenever it does create words/gibberish, they don't seem very "random". The words…
0 votes, 1 answer

Can I make an unchordal MRF equivalent to a chordal MRF?

Here, by equivalence I mean: will the distribution (the entire table) be equal in both cases?
0 votes, 0 answers

gmrf model for images

Can anyone explain how the parameters of the GMRF model can be estimated for an image using MATLAB? I have tried toolboxes like UGM (http://www.di.ens.fr/~mschmidt/Software/UGM/trainMRF.html).
Abhishek Thakur
-1 votes, 1 answer

How to calculate the transition probability matrix of a second-order Markov chain

I have data in this form: Broker.Position IP BP SP IP IP .. I would like to calculate the second-order transition matrix in this form: BP IP SP BPBP SPSP IPIP BPSP SPBP IPSP SPIP BPIP IPBP
Rup Mitra
-2 votes, 1 answer

How to train and predict using a simple Markov model (not a "hidden Markov model") in Python?

I have a simple dataset that contains some columns, and I need to make predictions using a simple Markov model in Python. I cannot see any support for this in the sklearn library. My dataset columns are: "url", "ip", "browser", "request". I have loaded the dataset…
-3 votes, 1 answer

Aggregate result in a list

I would like to know how to create columns of all states and their corresponding times (each list corresponds to an id). The Q matrix is not important as it remains the same. `$ :List of 3 ..$ states : num [1:3] 1 2 2 ..$ times : num [1:3] 0 5.23…`
bbStudent
-5 votes, 1 answer

An intuitive Markov networks (MRFs) tutorial?

I would like to know the basics of Markov networks (MRFs). Does anyone know an intuitive tutorial on the subject? I just need very basic information about undirected graphical models. For example, how to compute the joint distribution and…