Questions tagged [markov]

The Markov property refers to the memoryless property of a stochastic process.

Overview

From Wikipedia,

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.

A Markov process (Y_t) can be expressed as:

P(Y_{t+1} = y \mid Y_t = y_t, Y_{t-1} = y_{t-1}, \dots, Y_0 = y_0) = P(Y_{t+1} = y \mid Y_t = y_t)
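For illustration, here is a minimal Python sketch of a two-state chain in which each step depends only on the current state; the state names and transition probabilities are invented for the example:

```python
import random

# Hypothetical two-state chain: each row of transition probabilities sums to 1.
# The next state depends only on the current state, never on earlier history.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, n_steps, seed=0):
    """Simulate n_steps of the chain starting from `start`."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(n_steps):
        options = transition[state]
        state = rng.choices(list(options), weights=list(options.values()))[0]
        path.append(state)
    return path

print(simulate("sunny", 10))
```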

Tag usage

Please consider asking questions concerning statistical data analysis on Stack Exchange's Cross Validated site.

255 questions
0
votes
0 answers

PyCharm error while running Markovify project

I recently downloaded the Markovify project and I can't seem to figure out how to make it work. I've installed the markovify package and have the project settings below: Project Settings. When I run the code, I get the following error: Project…
DariushDe
  • 1
  • 2
0
votes
2 answers

Plot how the probability P(Xn=s) changes as a function of time?

Entire problem: We have a Markov chain model with 5 states: s, t, m, f, r. The TPM follows: P <- matrix(c(.84,.03,.01,.03,.03, .11,.80,.15,.19,.09, .01,.04,.70,.02,.05, …
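A minimal sketch of the usual approach for the question above, written in Python/numpy for illustration (the question itself is in R): start from an initial distribution and repeatedly multiply it by the transition matrix, collecting the distribution at every step, then plot each state's probability over time. The 5x5 matrix below is a stand-in, not the question's truncated TPM.

```python
import numpy as np
import matplotlib.pyplot as plt

states = ["s", "t", "m", "f", "r"]
# Stand-in transition matrix (each row sums to 1), not the question's actual TPM.
P = np.array([
    [0.84, 0.11, 0.01, 0.02, 0.02],
    [0.03, 0.80, 0.04, 0.08, 0.05],
    [0.01, 0.15, 0.70, 0.09, 0.05],
    [0.03, 0.19, 0.02, 0.70, 0.06],
    [0.03, 0.09, 0.05, 0.05, 0.78],
])

n_steps = 50
dist = np.zeros(len(states))
dist[0] = 1.0                      # start in state "s" with probability 1

history = [dist]
for _ in range(n_steps):
    dist = dist @ P                # one step: row vector times transition matrix
    history.append(dist)
history = np.array(history)

for i, name in enumerate(states):
    plt.plot(history[:, i], label=f"P(Xn = {name})")
plt.xlabel("n (time step)")
plt.ylabel("probability")
plt.legend()
plt.show()
```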
0
votes
1 answer

Source code for calculation of stationary distribution in R

Take a look at this link. I am trying to understand the following source code meant for finding the stationary distribution of a transition matrix: # Stationary distribution of discrete-time Markov chain # (uses eigenvectors) stationary <- function(mat) { x =…
user366312
  • 16,949
  • 65
  • 235
  • 452
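The idea behind the code in the question above, shown here as a Python/numpy sketch for illustration rather than the question's actual R source: the stationary distribution is a left eigenvector of the transition matrix with eigenvalue 1, renormalised so its entries sum to 1. The example matrix is made up.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a discrete-time Markov chain
    with transition matrix P (rows sum to 1), via eigenvectors."""
    # Left eigenvectors of P are right eigenvectors of P.T
    eigvals, eigvecs = np.linalg.eig(P.T)
    # Pick the eigenvector whose eigenvalue is (numerically) 1
    idx = np.argmin(np.abs(eigvals - 1.0))
    v = np.real(eigvecs[:, idx])
    return v / v.sum()              # normalise so the entries sum to 1

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(stationary(P))                # -> approximately [0.833, 0.167]
```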
0
votes
1 answer

Understanding Markov Chain source code in R

The following source code is from a book. Comments are written by me to understand the code better. #================================================================== # markov(init,mat,n,states) = Simulates n steps of a Markov chain…
user366312
  • 16,949
  • 65
  • 235
  • 452
0
votes
1 answer

How to efficiently lookup state changes in a transition matrix with numpy?

I am doing some work with Markov chains and I need to look up the transition probability from a transition matrix given a sequence of state changes. How does one do this efficiently in numpy? For example: import numpy as np #here is the sequence…
rich
  • 520
  • 6
  • 21
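One common way to do this, sketched here with illustrative names rather than the question's own variables: use integer fancy indexing with the "from" states and the "to" states of consecutive pairs in the sequence.

```python
import numpy as np

# Illustrative 3-state transition matrix (rows sum to 1)
T = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.2, 0.3, 0.5]])

# Sequence of visited states
seq = np.array([0, 1, 1, 2, 0])

# Probability of each observed transition: T[from, to] for consecutive pairs
probs = T[seq[:-1], seq[1:]]        # -> [0.2, 0.8, 0.1, 0.2]

# Probability of the whole path given the first state
path_prob = probs.prod()
print(probs, path_prob)
```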
0
votes
0 answers

Age-transition rate R

data=data.frame("id"=c(1,1,1,1,2,2,2,2,3,3,3,3,4,4,4,4), "grade"=c(11,11,12,13,11,12,12,11,13,14,NA,NA,12,11,13,14), "age"=c(20,21,22,23,26,27,28,29,19,20,NA,NA,22,23,24,25)) We have data on when students go from one…
bvowe
  • 3,004
  • 3
  • 16
  • 33
0
votes
1 answer

Interrogating the results of the Markov simulation - Help and feedback highly appreciated

I have built a Markov chain with which I can simulate the daily routine of people (activity patterns). Each simulation day is divided into 144 time steps and the person can carry out one of fourteen activities. Those are: Away - work (1) Away -…
0
votes
1 answer

translating python code for generating markov chains into lua

As a beginner project, I am trying to translate this Python code into Lua: https://eli.thegreenplace.net/2018/elegant-python-code-for-a-markov-chain-text-generator/ . I have problems translating Python's "random.choices" correctly. The Markov matrix…
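For reference, the Python side of the question above: random.choices draws from a population with the given relative weights and returns a list, which is the behaviour a Lua translation needs to reproduce. The words and weights below are made up.

```python
import random

# random.choices picks k items from the population, with replacement,
# using the given relative weights; it returns a list even for k=1.
next_words = ["fox", "dog", "cat"]
weights = [5, 3, 2]                 # made-up relative frequencies

word = random.choices(next_words, weights=weights, k=1)[0]
print(word)
```

Lua has no built-in weighted random choice, so a common translation draws a uniform number in [0, total weight) with math.random and walks the cumulative weights until it passes that number.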
0
votes
0 answers

Suitable package in R for specific data analysis

I have a dataset consisting of patients and their equidistant visits and I have labelled the presence of a specific kind of mole in their left and/or right hand with {0,1} values (0 = not present and 1 = present). The dataset looks like this: …
azal
  • 1,210
  • 6
  • 23
  • 43
0
votes
0 answers

Is there a random element in a Markov chain?

I'm building a Markov chain in R using the 'markovchain' library. I re-ran the code and always got slightly different results, despite no changes to the input data. I tested setting the seed to try to eliminate variation from a random seed, and…
0
votes
1 answer

transition matrix for counts and proportions python

I have a matrix with the grades from a class for different years (rows for years and columns for grades). What I want is to build a transition matrix with the change between years. For instance, I want year t-1 on the y-axis and year t on the x-axis…
Enterrador99
  • 121
  • 1
  • 13
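A sketch of one way to build such a matrix for the question above: count transitions from year t-1 to year t with numpy, then row-normalise the counts to get proportions. The grade values here are invented.

```python
import numpy as np

# Invented example: grades observed for the same cohort in two consecutive years
year_prev = np.array([1, 1, 2, 2, 3, 3, 3, 1])
year_curr = np.array([1, 2, 2, 3, 3, 3, 1, 1])

grades = np.union1d(year_prev, year_curr)
index = {g: i for i, g in enumerate(grades)}

# Count matrix: rows = grade in year t-1, columns = grade in year t
counts = np.zeros((len(grades), len(grades)), dtype=int)
for a, b in zip(year_prev, year_curr):
    counts[index[a], index[b]] += 1

# Row-normalise to turn counts into transition proportions
proportions = counts / counts.sum(axis=1, keepdims=True)
print(counts)
print(proportions.round(2))
```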
0
votes
1 answer

heemod::how to define different initial counts for two strategies when running a model

Is it possible to run different initial counts for two strategies in the heemod library? Let's say we have the example provided from ?run_model: mod1 <- define_strategy( transition = define_transition( .5, .5, .1, .9 ), define_state( cost = 543, ly = 1…
akis
  • 154
  • 1
  • 8
0
votes
0 answers

Markov chain model in R: returning 0 conversion value

I am running a Markov chain model in R with the sample data below: There are clearly conversion values. However, when I check the model output, the conversion values are zero. Do you know what happened in the process and how I can fix it? Thanks!
J Su
  • 1
  • 1
0
votes
2 answers

Use pomegranate: how to generate probabilities in Hidden Markov Model

For training an HMM, I need the start probabilities (pi), the transition probabilities, and the emission probabilities. Now I want to train an HMM with 3 states (1, 2, 3) and 4 outputs (a, b, c, d). The training data…
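Whatever library is used, when the training sequences are labelled with their states, the three probability tables asked about above can be estimated by simple counting. A plain numpy sketch with invented toy data (not the pomegranate API itself), whose output tables could then be fed to an HMM library:

```python
import numpy as np

states = [1, 2, 3]
outputs = ["a", "b", "c", "d"]

# Invented labelled training data: parallel state and output sequences
state_seqs = [[1, 1, 2, 3], [2, 2, 3, 1]]
output_seqs = [["a", "b", "b", "d"], ["c", "b", "d", "a"]]

s_idx = {s: i for i, s in enumerate(states)}
o_idx = {o: i for i, o in enumerate(outputs)}

start = np.zeros(len(states))
trans = np.zeros((len(states), len(states)))
emit = np.zeros((len(states), len(outputs)))

for s_seq, o_seq in zip(state_seqs, output_seqs):
    start[s_idx[s_seq[0]]] += 1                  # which state each sequence starts in
    for a, b in zip(s_seq[:-1], s_seq[1:]):
        trans[s_idx[a], s_idx[b]] += 1           # state-to-state transitions
    for s, o in zip(s_seq, o_seq):
        emit[s_idx[s], o_idx[o]] += 1            # output emitted in each state

# Normalise each table so the rows are probability distributions
start /= start.sum()
trans /= trans.sum(axis=1, keepdims=True)
emit /= emit.sum(axis=1, keepdims=True)
print(start, trans, emit, sep="\n")
```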
0
votes
2 answers

Hidden Markov Model how to generate probabilities

I have a lot of data from pulse/heart rate measurements, so the data is in long integer lists, and I have 8 states (although the data can range well beyond 1 to 8; it can be 50 to 140). I want an algorithm which can take the measurements data…