Questions tagged [markov]

Markov, or the Markov property, refers to the memoryless property of a stochastic process.

Overview

From Wikipedia,

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.

A Markov process (Yt) can be expressed as:

P(Y_{t+1} = y | Y_t = y_t, Y_{t-1} = y_{t-1}, …, Y_0 = y_0) = P(Y_{t+1} = y | Y_t = y_t)
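As an illustration of the property above (not part of the original tag wiki), a two-state Markov chain can be simulated with only the Python standard library; the state names and probabilities below are made up for the example:

```python
import random

# Hypothetical two-state weather chain; for each current state,
# a list of (next_state, probability) pairs.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def simulate(start, steps, seed=0):
    """Walk the chain: the next state depends only on the current one."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        states, probs = zip(*transitions[state])
        state = rng.choices(states, weights=probs)[0]
        path.append(state)
    return path

print(simulate("sunny", 5))
```

The memorylessness is visible in the code: each step consults only `transitions[state]`, never the earlier history in `path`.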

Tag usage

Please consider Stack Exchange's Cross Validated site for questions concerning statistics and data analysis.

255 questions
0
votes
0 answers

msmFit: Fitting Markov Switching Models - Results differ almost every time

I am very new here and am writing my first post. I hope you will bear with me. I am currently using the msmFit(object, k, sw, p, data, family, control) command in RStudio to set up a Markov regime-switching process. I am not an econometrician, but…
Cook04
  • 1
  • 1
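
Run-to-run variation like this is typical of EM-style fitting with random initialization; fixing the RNG seed before each fit makes runs reproducible (in R, set.seed() before msmFit). A standard-library Python sketch of the general point, with a stand-in for the fitting routine:

```python
import random

def noisy_fit(data, seed=None):
    # Stand-in for an EM fit whose result depends on a random start.
    rng = random.Random(seed)
    start = rng.random()  # random initialization
    return round(sum(data) / len(data) + start * 1e-9, 12)

data = [1.0, 2.0, 3.0]
a = noisy_fit(data, seed=42)
b = noisy_fit(data, seed=42)
print(a == b)  # seeded runs agree
```

If seeded runs still disagree, the variation comes from somewhere other than the random start (e.g. parallelism or data ordering).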
0
votes
0 answers

R msm package does not generate estimates

I am trying to use the msm R package to estimate a continuous-time hidden Markov model. I do not know why my code does not show the estimates and confidence intervals for the transition intensities and the hidden Markov model parameters when I print…
user13416
  • 1
  • 1
0
votes
1 answer

Markov chain plot. How to hide probabilities in the plot

I have a huge matrix and I'm trying to visualize it better. I only found edge.arrow.size, vertex.size and layout to include in the details of the plot, but it is still hard to visualize
0
votes
0 answers

MCMC code is very slow for even small steps

I have a problem with Python. My question is not about any problem in writing code. I have a script that has been used many times before for my calculations and my published papers. It is MCMC, or Markov Chain Monte Carlo (the emcee Python package). For…
Ethan
  • 180
  • 2
  • 15
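
MCMC slowness usually comes from the per-step cost of the likelihood, not the sampler loop itself. For reference, a bare-bones Metropolis sampler in pure Python (targeting a standard normal; emcee itself is not used here, and the target and step size are illustrative):

```python
import math
import random

def log_target(x):
    # Log-density of a standard normal, up to an additive constant.
    return -0.5 * x * x

def metropolis(n_steps, step=1.0, seed=0):
    rng = random.Random(seed)
    x, lp = 0.0, log_target(0.0)
    chain = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)  # symmetric random-walk proposal
        lp_prop = log_target(prop)
        # Accept with probability min(1, exp(lp_prop - lp)).
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

chain = metropolis(5000)
print(sum(chain) / len(chain))  # roughly 0 for a standard normal
```

Profiling `log_target` first is usually more productive than tuning the loop: each step calls it exactly once, so the total runtime scales linearly with its cost.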
0
votes
1 answer

Why Gt+1 = v(St+1) in Bellman Equation for MRPs?

In the lecture slides by David Silver, on page 19, there is a derived formula (shown as an image) from which it seems Gt+1 = v(St+1), so Gt = v(St). According to the definition of the return, and according to Gt = v(St): v(St) = Gt =…
Martin
  • 116
  • 10
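
The confusion in the question above is common: it is not that G_{t+1} equals v(S_{t+1}) pathwise, but that the conditional expectation of G_{t+1} given S_{t+1} is v(S_{t+1}). In standard notation the derivation on the slide is:

```latex
\begin{aligned}
v(s) &= \mathbb{E}[G_t \mid S_t = s] \\
     &= \mathbb{E}[R_{t+1} + \gamma G_{t+1} \mid S_t = s] \\
     &= \mathbb{E}\bigl[R_{t+1} + \gamma\,\mathbb{E}[G_{t+1} \mid S_{t+1}] \mid S_t = s\bigr] \\
     &= \mathbb{E}[R_{t+1} + \gamma\, v(S_{t+1}) \mid S_t = s]
\end{aligned}
```

The third line uses the tower property of conditional expectation: inside the outer expectation, G_{t+1} may be replaced by v(S_{t+1}), even though the random variables G_{t+1} and v(S_{t+1}) are not equal.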
0
votes
0 answers

Transition probability in a non-homogeneous continuous-time Markov model along a given fixed path of states, say s1->s2->s3?

In a non-homogeneous, continuous-time Markov model, the Nelson-Aalen estimator of the transition matrix P(s,t) estimates the transition probability in the time interval [s,t] from any state s1 to any other state s2. Is it possible to estimate the…
Wolfgang123
  • 101
  • 1
0
votes
1 answer

Creating 1 step transition matrix, find probability that someone moves to a particular city

I'm looking for a way to find the transition matrix (in R) with the probabilities that someone moves. This is how my df looks:

      City_year1            City_year2
    1 Alphen aan den Rijn   NA
    2 Tynaarlo              NA
    …
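
As a sketch of the counting approach behind such a one-step transition matrix (in Python rather than R; the city names are taken from the question, the pairings are invented, and None stands for NA): count each (City_year1, City_year2) pair, drop rows with missing values, and normalize each row to sum to 1.

```python
from collections import Counter, defaultdict

# Hypothetical observations: (city in year 1, city in year 2); None = NA.
moves = [
    ("Alphen aan den Rijn", None),
    ("Tynaarlo", "Groningen"),
    ("Tynaarlo", "Groningen"),
    ("Tynaarlo", "Tynaarlo"),
]

counts = defaultdict(Counter)
for src, dst in moves:
    if src is not None and dst is not None:  # drop NA rows
        counts[src][dst] += 1

# Row-normalize counts into a one-step transition matrix.
P = {
    src: {dst: n / sum(row.values()) for dst, n in row.items()}
    for src, row in counts.items()
}
print(P["Tynaarlo"])  # row probabilities sum to 1
```

Rows whose destination is always NA simply drop out; in practice one would decide whether such cities should instead get a self-transition or be excluded from the state space.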
0
votes
1 answer

depmix function to fit two state gamma distribution

I am using the depmixS4 package in R. I have data that looks like a gamma distribution, and I am assuming that there are two states. I would like to fit a two-state gamma distribution to my data in R. The following is my code: mod <- depmix(freq ~ 1,…
Yun Hwang
  • 1
  • 1
0
votes
0 answers

Generating a DNA sequence from Conditional Probabilities

I have been working on my bio-tech project and I have been stuck on this for a long time. Idea: generating DNA sequences from a set of probabilities. For a sample, I took a given DNA string of length 128 and figured out conditional probabilities -…
anony_std
  • 29
  • 6
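
The approach in the question can be sketched as a first-order Markov model: estimate P(next base | current base) from the training string, then sample base by base. Standard-library Python, with a short made-up training sequence standing in for the 128-base sample:

```python
import random
from collections import Counter, defaultdict

train = "ACGTACGGTACGTTACGAACGT"  # stand-in for the 128-base sample

# Conditional counts: for each base, how often each base follows it.
follows = defaultdict(Counter)
for cur, nxt in zip(train, train[1:]):
    follows[cur][nxt] += 1

def generate(length, start="A", seed=0):
    """Sample a sequence where each base depends only on the previous one."""
    rng = random.Random(seed)
    seq = [start]
    for _ in range(length - 1):
        bases, counts = zip(*follows[seq[-1]].items())
        seq.append(rng.choices(bases, weights=counts)[0])
    return "".join(seq)

print(generate(20))
```

One practical caveat: if some base never appears as a left context in the training data, `follows` has no entry for it and generation would fail there; smoothing or a fallback distribution handles that case.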
0
votes
0 answers

Multiplying a matrix by itself in C++

I am trying to simulate Markov chain transitions by multiplying a 2x2 matrix (the transition matrix) by a 2x1 matrix in C++, and then taking that output as a 2x1 matrix and then using it again in a multiplication, repeated again up to a set number…
rwbc1601
  • 15
  • 4
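
In Python rather than C++, the iteration that question describes — repeatedly applying a 2x2 transition matrix to a 2x1 state vector — looks like the sketch below (the matrix values are invented); the iterates converge to the chain's stationary distribution:

```python
def mat_vec(P, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [P[0][0] * v[0] + P[0][1] * v[1],
            P[1][0] * v[0] + P[1][1] * v[1]]

# Column-stochastic transition matrix (each column sums to 1)
# and an initial state distribution.
P = [[0.9, 0.5],
     [0.1, 0.5]]
v = [1.0, 0.0]

for _ in range(50):  # v_{n+1} = P v_n
    v = mat_vec(P, v)

print(v)  # approaches the stationary distribution (5/6, 1/6)
```

The key point, which carries over to the C++ version directly, is to multiply the matrix into the vector each step (O(n^2) work) rather than squaring the matrix.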
0
votes
0 answers

How to download package "markovchain" for R version less than 3.6

I have R version 3.4.4 loaded on my laptop but I want to download the package "markovchain" in my R. The code I used was install.packages("markovchain", dependencies=TRUE, repos='http://cran.rstudio.com/') but I received the following error: Warning…
0
votes
0 answers

Estimating a transition matrix with a low observation count

I am building a Markov model with a relatively low count of observations for a given number of states. Are there methods other than the cohort method to estimate the true transition probabilities? Especially to ensure that the probabilities are…
SchmiPi
  • 7
  • 1
0
votes
3 answers

'pychattr' library in Python, 'n_simulations' parameter

Does anyone know if it is possible to use n_simulation = None in the 'MarkovModel' algorithm in the 'pychattr' library in Python? It throws an error that it must be an integer, but in the docstring I have information like this: 'n_simulations : one of {int, None};…
0
votes
1 answer

Facing problems in sentiment analysis in Python

I am facing a problem performing a Markov model:

    import markovify
    import sys

    # Read text from file
    if len(sys.argv) != 2:
        sys.exit("Usage: python generator.py sample.txt")
    with open(sys.argv[1]) as f:
        text = f.read()

    # Train model
    text_model…
user103987
  • 65
  • 2
  • 9
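
For reference, the core idea behind a markovify-style generator can be sketched in plain Python (this is a toy word-level model, not the markovify API; the training text is made up):

```python
import random
from collections import defaultdict

text = "the cat sat on the mat . the dog sat on the rug ."

# Map each word to the list of words observed to follow it.
model = defaultdict(list)
words = text.split()
for cur, nxt in zip(words, words[1:]):
    model[cur].append(nxt)

def make_sentence(start="the", max_words=10, seed=0):
    """Walk the word chain until a sentence end or the length cap."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < max_words:
        nxt = rng.choice(model[out[-1]])
        if nxt == ".":
            break
        out.append(nxt)
    return " ".join(out)

print(make_sentence())
```

markovify adds the pieces this sketch omits: multi-word states, sentence-boundary handling, and rejection of outputs too similar to the training text.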
0
votes
1 answer

In python, is there a way to remove all text following the last instance of a delimiter?

I'm trying to create a random text generator in python. I'm using Markovify to produce the required text, a filter to not let it start generating text unless the first word is capitalized and, to prevent it from ending "mid sentence", want the…
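
For the last question, Python's built-in str.rpartition (or str.rsplit with maxsplit=1) does exactly this: split on the last occurrence of the delimiter and keep everything before it. The sample string below is invented:

```python
text = "First sentence. Second sentence. A trailing frag"

# Keep everything up to and including the last period.
head, sep, _tail = text.rpartition(".")
trimmed = head + sep if sep else text  # fall back if no delimiter found

print(trimmed)  # "First sentence. Second sentence."
```

rpartition returns a 3-tuple (before, delimiter, after) and yields empty strings for the first two slots when the delimiter is absent, which is why the `if sep` fallback is needed.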