Questions tagged [markov-models]

Markov chain

The simplest Markov model is the Markov chain. It models the state of a system with a random variable that changes through time. In this context, the Markov property states that the distribution of this variable depends only on the state at the previous time step, not on the earlier history. An example use of a Markov chain is Markov chain Monte Carlo, which uses the Markov property to prove that a particular method for performing a random walk will sample from a desired target distribution (the chain's stationary distribution).
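As a minimal sketch of these ideas, the following simulates a toy two-state chain and computes its stationary distribution; the states and transition probabilities are invented for illustration:

```python
import numpy as np

# Hypothetical two-state chain; states and probabilities are illustrative.
P = np.array([[0.9, 0.1],   # P(next state | current state = 0)
              [0.5, 0.5]])  # P(next state | current state = 1)

rng = np.random.default_rng(0)

def simulate(steps, start=0):
    """Walk the chain: the next state depends only on the current one."""
    s, path = start, [start]
    for _ in range(steps):
        s = rng.choice(2, p=P[s])
        path.append(int(s))
    return path

# The stationary distribution pi solves pi = pi @ P, i.e. it is the
# left eigenvector of P with eigenvalue 1, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()
```

For this transition matrix the stationary distribution works out to (5/6, 1/6), and long simulated walks spend roughly those fractions of time in each state.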

Hidden Markov model

A hidden Markov model is a Markov chain for which the state is only partially observable. In other words, observations are related to the state of the system, but they are typically insufficient to precisely determine the state. Several well-known algorithms for hidden Markov models exist. For example, given a sequence of observations, the Viterbi algorithm will compute the most-likely corresponding sequence of states, the forward algorithm will compute the probability of the sequence of observations, and the Baum–Welch algorithm will estimate the starting probabilities, the transition function, and the observation function of a hidden Markov model. One common use is for speech recognition, where the observed data is the speech audio waveform and the hidden state is the spoken text. In this example, the Viterbi algorithm finds the most likely sequence of spoken words given the speech audio.
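To make the decoding problem concrete, here is a minimal Viterbi implementation; the two hidden states, three observation symbols, and all probability tables are toy assumptions, not taken from any real model:

```python
import numpy as np

# Toy HMM: all probabilities below are illustrative assumptions.
start = np.array([0.6, 0.4])              # P(initial state)
trans = np.array([[0.7, 0.3],
                  [0.4, 0.6]])            # P(state_t | state_{t-1})
emit  = np.array([[0.5, 0.4, 0.1],
                  [0.1, 0.3, 0.6]])       # P(observation | state)

def viterbi(obs):
    """Most likely hidden state sequence given an observation sequence."""
    delta = start * emit[:, obs[0]]       # best path prob ending in each state
    back = []                             # backpointers per time step
    for o in obs[1:]:
        # scores[i, j]: extend best path ending in i with transition to j
        scores = delta[:, None] * trans * emit[:, o]
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0)
    path = [int(delta.argmax())]          # best final state, then backtrack
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]
```

The forward algorithm has the same recursion shape, with the `max` over predecessors replaced by a sum.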

Markov decision process

A Markov decision process is a Markov chain in which state transitions depend on the current state and an action applied to the system. Typically, a Markov decision process is used to compute a policy of actions that will maximize some utility with respect to expected rewards. It is closely related to reinforcement learning, and can be solved with value iteration and related methods.
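A minimal value-iteration sketch for a toy two-state, two-action MDP (transition probabilities, rewards, and discount factor are all illustrative assumptions):

```python
import numpy as np

# Toy MDP: P[a][s][s'] = P(s' | s, a); R[s][a] = expected immediate reward.
# All numbers are illustrative.
P = np.array([[[0.8, 0.2], [0.1, 0.9]],
              [[0.5, 0.5], [0.3, 0.7]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
gamma = 0.9  # discount factor

def value_iteration(tol=1e-8):
    """Iterate the Bellman optimality update until the values converge."""
    V = np.zeros(2)
    while True:
        # Q[s, a] = R[s, a] + gamma * sum_s' P(s' | s, a) * V(s')
        Q = R + gamma * np.einsum('ast,t->sa', P, V)
        V_new = Q.max(axis=1)
        if np.abs(V_new - V).max() < tol:
            return V_new, Q.argmax(axis=1)   # optimal values and greedy policy
        V = V_new
```

Because the update is a contraction for `gamma < 1`, the loop is guaranteed to converge; the returned policy picks the action maximizing `Q` in each state.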

Partially observable Markov decision process

A partially observable Markov decision process (POMDP) is a Markov decision process in which the state of the system is only partially observed. Solving POMDPs exactly is computationally hard (PSPACE-complete even in the finite-horizon case), but approximation techniques have made them useful for a variety of applications, such as controlling simple agents or robots.
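POMDP methods plan over belief states, i.e. distributions over the hidden state. A minimal Bayesian belief update, with toy transition and observation tables invented for illustration:

```python
import numpy as np

# Toy POMDP pieces for a single action a = 0; all numbers are illustrative.
T = np.array([[[0.9, 0.1],
               [0.2, 0.8]]])   # T[a][s][s'] = P(s' | s, a)
O = np.array([[[0.7, 0.3],
               [0.4, 0.6]]])   # O[a][s'][o] = P(o | s', a)

def belief_update(b, a, o):
    """b'(s') is proportional to O(o | s', a) * sum_s T(s' | s, a) * b(s)."""
    b_new = O[a][:, o] * (T[a].T @ b)
    return b_new / b_new.sum()   # renormalize to a probability distribution
```

The resulting belief is itself a sufficient statistic of the history, which is what lets a POMDP be treated as a (continuous-state) MDP over beliefs.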

Markov random field

A Markov random field, or Markov network, may be considered a generalization of a Markov chain to multiple dimensions. In a Markov chain, state depends only on the previous state in time, whereas in a Markov random field, each state depends on its neighbors in any of multiple directions. A Markov random field may be visualized as a field or graph of random variables, where the distribution of each random variable depends on the neighboring variables to which it is connected. More specifically, by the Hammersley–Clifford theorem, the joint distribution over all the variables in the graph factorizes as the product of the "clique potentials" of the cliques in the graph, divided by a normalizing constant. Modeling a problem as a Markov random field is useful because it implies that the joint distribution factorizes in this manner, so inference can exploit the local structure of the graph.
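The factorization above can be sketched on a tiny chain MRF x1 – x2 – x3 with binary variables and two pairwise clique potentials (the potential tables are illustrative assumptions):

```python
import numpy as np
from itertools import product

# Pairwise clique potentials for the chain x1 - x2 - x3; values are toy.
psi12 = np.array([[2.0, 1.0],
                  [1.0, 2.0]])   # potential on clique {x1, x2}
psi23 = np.array([[3.0, 1.0],
                  [1.0, 3.0]])   # potential on clique {x2, x3}

def unnormalized(x1, x2, x3):
    """Product of the clique potentials for one joint configuration."""
    return psi12[x1, x2] * psi23[x2, x3]

# The partition function Z normalizes the product of clique potentials.
Z = sum(unnormalized(*xs) for xs in product([0, 1], repeat=3))

def joint(x1, x2, x3):
    return unnormalized(x1, x2, x3) / Z
```

Enumerating all eight configurations confirms the normalized products sum to one; on larger graphs, the same factorization is what makes algorithms like belief propagation possible.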

Hierarchical Markov models

Hierarchical Markov models can be applied to categorize human behavior at various levels of abstraction. For example, a series of simple observations, such as a person's location in a room, can be interpreted to determine more complex information, such as the task or activity the person is performing. Two kinds of hierarchical Markov models are the hierarchical hidden Markov model and the abstract hidden Markov model. Both have been used for behavior recognition, and certain conditional independence properties between different levels of abstraction in the model allow for faster learning and inference.

100 questions
0
votes
0 answers

Adjusting for Hierarchical Clustering in Markov Model

I'm interested in using a discrete-time non-homogeneous Markov model for one of my studies. The data I am using has hierarchical clustering (e.g., patients within hospitals). I was wondering if anyone knows how to account for clustering in such a…
0
votes
0 answers

Select specific AR lags in statsmodels MarkovAutoregression

I would like to estimate a Markov switching model using only the 24th and 25th lags with the MarkovAutoregression function from statsmodels. Setting order=24 seems to include every intermediate lag. Is there a way to include only some specific…
0
votes
0 answers

Statsmodels Markov-Switching model error: ValueError - The given frequency argument could not be matched to the given index

I am trying to estimate a Markov-Switching model using statsmodels just like the example provided at https://www.statsmodels.org/devel/examples/notebooks/generated/markov_autoregression.html . I have daily EUR/USD returns with a total of 874…
0
votes
0 answers

Markov multi-state models for longitudinal data

I am trying to use the MSM package to model transition rates. I have data at two time periods: baseline and then at follow-up. The participants could be in any of the 3 stages at baseline and then either remain in the same stage, transition forward…
0
votes
0 answers

LMest: failed estimation of transition probability matrices when Latent Markov Modeling continuous data in R

I am working with longitudinal continuous data that reflect the linguistic abilities of children. In that regard I seek to make a latent transition model, more exactly a latent Markov model, using the LMest package in R. I have succeeded in doing so,…
0
votes
0 answers

How to evaluate robustness of Markov Chain attribution model?

I decided to use the Markov chain attribution model to determine the value of each digital channel on final conversions. In R I used the function markov_model (from the package ChannelAttribution by Davide Altomare) applied to the dataset PathData…
0
votes
0 answers

Is there a way to make a recursive out-of-sample with Markov Switching model?

I am trying to forecast GDP with a Markov switching model. I used the "MSwM" package to fit a univariate AR(1) model. Now the problem is, how to forecast the MS model?
0
votes
0 answers

Analyzing Clickstream Data using Markov models in R

I am working on clickstream data that does not have an end-state. For instance, some clickstream datasets have a user's journey, ending with a purchase or not. My data does not have an end-state and it is not required to have one. So, I have mapped…
user2845095
  • 465
  • 2
  • 9
0
votes
1 answer

R MSM package: the Q matrix is the same for different covariate values, even though transition rates differ

I am fitting a continuous-time Markov model to a panel dataset using the R package MSM. Because I am interested in sex differences in transition rates, I fit the model with the covariate sex ("M" or "F") by running model_object <- msm( formula = state…
Stijn
  • 551
  • 4
  • 17
0
votes
1 answer

How can I build a Markov Model for text?

I am just getting to learn the implementation of Markov models, and I am trying to build code that automatically predicts the word that precedes a particular word. I want to use this to generate a 100-word composition using these random words (I…
0
votes
1 answer

How to update the hmmlearn learned object when we have new samples?

I have implemented simple code for a hidden Markov model with hmmlearn and it is working well. I used the fit() method, i.e. hmmlearn's fit, to learn the HMM parameters from my data. If I have more data and want to update the previously fitted model without…
0
votes
0 answers

Estimation transition matrix with low observation count

I am building a Markov model with a relatively low count of observations for a given number of states. Are there other methods to estimate the real transition probabilities than the cohort method? Especially to ensure that the probabilities are…
SchmiPi
  • 7
  • 1
0
votes
0 answers

Covariate dependent Markov models? Plot state transition probability along gradient of covariate values

Data consists of 4 variables: id, x1 and x2 (continuous variables which are correlated with y), and y, a binary variable. 0 and 1 in the binary variable represent different states. Is it possible to use Markov chain models to calculate and plot state…
procerus
  • 194
  • 8
0
votes
0 answers

Markov Switching Regression in R not working

I have price data for an asset. I want to fit a Markov switching model (with 2 states). The code I have run is below. Price is configured as numeric and date as a date. Not sure where I'm going wrong. library(MSwM) # Loading required package:…
JHolmes
  • 23
  • 6
0
votes
1 answer

Error import with 'shamilton_filter_log' from 'statsmodels.tsa.regime_switching._hamilton_filter'

I tried to compile MarkovSwitching.py from statsmodels (link description here) in python, but I have the following error: ImportError: cannot import name 'shamilton_filter_log' from 'statsmodels.tsa.regime_switching._hamilton_filter' …
Pablo_
  • 101
  • 1