Questions tagged [markov]

The Markov property refers to the memoryless property of a stochastic process.

Overview

From Wikipedia,

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.

A Markov process (Yt) can be expressed as:

P(Y_{t+1} = y | Y_t = y_t, Y_{t-1} = y_{t-1}, ..., Y_0 = y_0) = P(Y_{t+1} = y | Y_t = y_t)

Tag usage

Please consider asking questions concerning statistics and data analysis on Cross Validated (stats.stackexchange.com) instead.

255 questions
0
votes
1 answer

Proof that each row of the self-product of a transition matrix sums to 1

I am unable to prove that the sum of each row of the self-product of a transition matrix is 1... Let A be a transition probability matrix, which means that each row of A sums to 1, and let P = A*A. I want to prove that P is also a valid transition…
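A quick numerical check of the claim in this question; the 3×3 matrix below is a hypothetical example, and the comment sketches the one-line proof:

```python
import numpy as np

# Hypothetical 3-state transition matrix; each row sums to 1.
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

P = A @ A  # two-step transition probabilities

# Each row of P also sums to 1, because
#   sum_k P[i,k] = sum_j A[i,j] * (sum_k A[j,k]) = sum_j A[i,j] * 1 = 1.
print(P.sum(axis=1))  # each entry is 1 (up to floating-point error)
```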
0
votes
0 answers

Show navigation in a graph in R

I am trying to show navigation in an R plot. The current status at time one (t1) is set as val$currentstatus, and the next status at (t2) should be shown in the graph based on the action that the user chooses from the checkbox. Then I want to draw a…
user
  • 592
  • 6
  • 26
0
votes
1 answer

Markov chain - Likelihood of sample with "unseen" observations (probability 0)

I have a large Markov chain and a sample, for which I want to calculate the likelihood. The problem is that some observations or transitions in the sample don't occur in the Markov chain, which makes the total likelihood 0 (or the log-likelihood -…
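One common fix for zero-probability transitions is additive (Laplace) smoothing; a minimal sketch, where the count matrix, pseudo-count, and sample sequence are all made up:

```python
import numpy as np

# Hypothetical transition-count matrix estimated from training data.
counts = np.array([[5, 0, 1],
                   [2, 3, 0],
                   [0, 0, 4]], dtype=float)

alpha = 1.0  # Laplace (add-one) smoothing pseudo-count
n_states = counts.shape[0]

# Smoothed transition probabilities: no entry is exactly zero, so a
# sample containing an unseen transition no longer gets likelihood 0.
P = (counts + alpha) / (counts.sum(axis=1, keepdims=True) + alpha * n_states)

sample = [0, 1, 2, 2]  # hypothetical observed state sequence
log_lik = sum(np.log(P[a, b]) for a, b in zip(sample, sample[1:]))
```

The choice of alpha trades off fidelity to the observed counts against robustness to unseen transitions.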
0
votes
1 answer

For loop error in R

I am trying to simulate a Markov chain using code that wasn't made for it. My for loop is not correct and I keep getting error messages; I kept track of the brackets and syntax, but I cannot quite put my finger on the issue. The matrix for…
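For reference, a minimal working pattern for simulating a Markov chain with a for loop (Python for illustration, since the asker's R code is not shown; the matrix and seed are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state transition matrix; rows index the current state.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

n_steps = 10
chain = [0]  # start in state 0
for _ in range(n_steps):
    current = chain[-1]
    # Draw the next state using the current state's row as probabilities.
    chain.append(rng.choice(len(P), p=P[current]))
```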
William Bernard
  • 359
  • 4
  • 15
0
votes
0 answers

Subsetting, conditioning, and consecutive conditions on a factor (binary) column (vector) of a dataframe in R

I have a sequence of 1/0s indicating whether a patient is in remission or not. Assume the records of remission were taken at discrete times. How can I check the Markov property for each patient, then summarize the findings, that is the…
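One way to start checking the Markov property empirically is to compare first-order and second-order transition counts; a sketch with a made-up remission sequence:

```python
from collections import Counter

# Hypothetical remission sequence (1 = in remission) for one patient.
seq = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0]

# First-order transition counts: how often each state follows each state.
pairs = Counter(zip(seq, seq[1:]))

# Second-order counts: does the next state depend on the last two states?
triples = Counter(zip(seq, seq[1:], seq[2:]))
```

If the conditional frequencies P(next | last) and P(next | last two) differ substantially (e.g. under a chi-squared test), the first-order Markov assumption is doubtful for that patient.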
0
votes
0 answers

Subsetting a dataframe with conditions on a factor (binary) column (vector) in R

I have a sequence of 1/0s indicating whether a patient is in remission or not. Assume the records of remission were taken at discrete times. How can I check the Markov property for each patient, then summarize the findings, that is the…
0
votes
1 answer

RiTa + Processing + Sound

I'm interested in a Processing way of achieving something similar to what is made by this person (link). From how I understand it, she had a video sliced into TIFFs and then composed them with the RiTa library. Does anybody know how to achieve such a thing, only…
Blckpstv
  • 117
  • 3
  • 17
0
votes
1 answer

Repeating utility values in Value Iteration (Markov Decision Process)

I am trying to implement the value iteration algorithm for a Markov Decision Process using Python. I have one implementation, but it is giving me many repeated values for the utilities. My transition matrix is quite sparse. Probably, this is…
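For comparison, a compact value-iteration sketch over a hypothetical 2-state, 2-action MDP (all numbers are illustrative, not the asker's); with a sparse transition matrix, repeated utilities can be legitimate rather than a bug:

```python
import numpy as np

# Hypothetical MDP:
# P[a, s, s'] = probability of moving s -> s' under action a,
# R[s, a]     = immediate reward for taking action a in state s.
P = np.array([[[0.8, 0.2],
               [0.1, 0.9]],
              [[0.5, 0.5],
               [0.3, 0.7]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
gamma = 0.9  # discount factor

V = np.zeros(2)
for _ in range(1000):
    # Bellman optimality backup: Q[s, a] = R[s, a] + gamma * E[V(s')].
    Q = R + gamma * np.einsum('ast,t->sa', P, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=1)  # greedy policy w.r.t. the converged values
```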
0
votes
1 answer

How is the optimal policy for recurrent utilities calculated?

Exam Solutions: I am learning about Markov Decision Processes, and for Question 6 of the exam (see the link attached above), I understand how utility is calculated when the same state is obtained after an action (part a of Question 6). J*(cool) = 4 + 0.9…
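Based on the recurrence quoted in the excerpt, the self-loop case can be solved as a fixed point; a sketch assuming reward 4 and discount 0.9, as in the excerpt:

```latex
J^*(\text{cool}) = 4 + 0.9\,J^*(\text{cool})
\;\Longrightarrow\; 0.1\,J^*(\text{cool}) = 4
\;\Longrightarrow\; J^*(\text{cool}) = 40
```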
0
votes
1 answer

What is the state space of this Markov chain?

Consider a system where two persons sit at a table and share three books. At any point in time both are reading a book, and one book is left on the table. When a person finishes reading his/her current book, he/she swaps it with the book on the…
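One natural choice of state space, assuming the state records which book each reader currently holds (the labels A, B, C are illustrative):

```python
from itertools import permutations

books = ['A', 'B', 'C']

# State: (book person 1 reads, book person 2 reads); the remaining book
# is on the table, so it is determined by the other two. The readers
# hold distinct books, giving ordered pairs of distinct books: 3 * 2 states.
states = list(permutations(books, 2))
print(len(states))  # -> 6
```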
Undisputed007
  • 639
  • 1
  • 10
  • 31
0
votes
1 answer

Markov Regime-Switching Regression Models - Time-Varying Probabilities

I am looking into estimating a Markov regime-switching model with time-varying probabilities. Please help me if you know a simpler way to estimate such a model.
Alaz
  • 1
  • 1
0
votes
1 answer

Error: SemiMarkov model for illness-death model

I am trying to fit a multistate model using the 'SemiMarkov' package in R. Below is an extract of my code, the result, and the error I got. id state.h state.j time1 LOC sex 102 1 2 4.000000 0 0 102 2 …
mymymine
  • 49
  • 7
0
votes
2 answers

Markov C++ read from file performance

I have my 2nd assignment for C++ class, which includes Markov chains. The assignment is simple, but I'm not able to figure out the best implementation for reading chars from files. I have a file of around 300k. One of the rules for the…
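Independent of language, reading the whole file in one buffered call usually beats per-character reads; a Python sketch of the idea (the filename and tiny corpus are made up):

```python
from collections import Counter, defaultdict
from pathlib import Path

# Create a small sample file ('corpus.txt' is a hypothetical name).
Path('corpus.txt').write_text('abracadabra')

# Read everything in one call: a single buffered read is far faster
# than fetching one char at a time, even for a file of a few hundred KB.
text = Path('corpus.txt').read_text()

# First-order character Markov model: counts of next char given char.
model = defaultdict(Counter)
for a, b in zip(text, text[1:]):
    model[a][b] += 1
```

In C++ the analogous move is reading the file into a `std::string` in one go rather than calling `get()` per character.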
gogasca
  • 9,283
  • 6
  • 80
  • 125
0
votes
2 answers

Inexact power of matrix in MATLAB

As I was bored, I checked the stationary theorem regarding the transition matrix of a Markov chain. So I defined a simple one, e.g.: >> T=[0.5 0.5 0; 0.5 0 0.5; 0.2 0.4 0.4]; The stationary theorem says, if you calculate the transition matrix to a…
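The same check in Python, using the matrix from the excerpt; floating-point matrix powers only converge approximately, so rows should be compared with a tolerance rather than for exact equality:

```python
import numpy as np

T = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.2, 0.4, 0.4]])

# For a regular chain, T^n converges to a matrix whose rows all equal
# the stationary distribution pi (the solution of pi T = pi).
Tn = np.linalg.matrix_power(T, 50)
pi = Tn[0]  # approximate stationary distribution
```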
Tik0
  • 2,499
  • 4
  • 35
  • 50
0
votes
1 answer

Gillespie Stochastic Simulation in Discrete Time using R

I'm running a stochastic simulation for epidemiology. How do I simulate it in discrete time? I managed to obtain results for continuous time using the code below. library(GillespieSSA) parms <- c(beta=0.591,sigma=1/8,gamma=1/7) x0 <-…
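If a discrete-time approximation is acceptable, a chain-binomial (tau-leaping) step can be sketched as follows (Python for illustration; the SEIR reading of beta/sigma/gamma and the initial counts are assumptions, only the parameter values come from the excerpt):

```python
import numpy as np

rng = np.random.default_rng(1)

# Parameter values from the excerpt; SEIR interpretation is assumed.
beta, sigma, gamma = 0.591, 1 / 8, 1 / 7

# Hypothetical initial compartment counts.
S, E, I, R = 990, 0, 10, 0
N = S + E + I + R

# One binomial draw per transition per day (dt = 1), with
# per-individual event probability 1 - exp(-rate * dt).
for day in range(100):
    new_E = rng.binomial(S, 1 - np.exp(-beta * I / N))  # infection
    new_I = rng.binomial(E, 1 - np.exp(-sigma))         # end of latency
    new_R = rng.binomial(I, 1 - np.exp(-gamma))         # recovery
    S, E, I, R = S - new_E, E + new_E - new_I, I + new_I - new_R, R + new_R
```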
shubha