Questions tagged [markov]

The Markov property refers to the memorylessness of a stochastic process.

Overview

From Wikipedia,

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.

A Markov process (Y_t) satisfies

P(Y_{t+1} = y | Y_t = y_t, Y_{t-1} = y_{t-1}, ..., Y_0 = y_0) = P(Y_{t+1} = y | Y_t = y_t)
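The memoryless property can be sketched with a short simulation: sampling the next state uses only the current state, never the earlier path. The two-state transition matrix below is purely illustrative.

```python
import random

# Illustrative transition matrix: P[i][j] = P(next state = j | current state = i)
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Sample the next state given only the current state (memoryless)."""
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(42)
state = 0
path = [state]
for _ in range(10):
    state = step(state, rng)  # depends on `state` alone, not on `path`
    path.append(state)
print(path)
```

Note that `step` never inspects `path`; that is exactly the Markov property in code.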

Tag usage

Please consider stack-exchange's Cross Validated SE for asking questions concerning statistics data analysis.

255 questions
-1
votes
1 answer

Markov chain in R

Suppose we have a ten state system where an observation can enter the system in any one of the ten states with equal probability and move from the given state into a new state also with equal probability (the observation's new state isn't…
-1
votes
1 answer

MDP: How to calculate the chances of each possible result for a sequence of actions?

I've got a MDP problem with the following environment (3x4 map): with the possible actions Up/Down/Right/Left and a 0.8 chance of moving in the right direction, 0.1 for each adjoining direction (e.g. for Up: 0.1 chance to go Left, 0.1 chance to go…
Skyfe
  • 591
  • 1
  • 7
  • 18
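For questions like the MDP one above, the chances of each outcome after a sequence of actions can be computed by propagating a probability distribution over cells, splitting each cell's mass 0.8/0.1/0.1 across the intended and perpendicular moves. This sketch assumes a 3x4 grid with no walls or terminal states (a simplification of the usual map), and bumping into the boundary leaves the agent in place.

```python
# Propagate a distribution over grid cells under a noisy action model:
# 0.8 intended direction, 0.1 for each perpendicular direction.
ROWS, COLS = 3, 4
MOVES = {"Up": (-1, 0), "Down": (1, 0), "Left": (0, -1), "Right": (0, 1)}
PERP = {"Up": ("Left", "Right"), "Down": ("Left", "Right"),
        "Left": ("Up", "Down"), "Right": ("Up", "Down")}

def shift(cell, move):
    """Move one cell; hitting the boundary leaves the agent where it is."""
    r, c = cell
    dr, dc = MOVES[move]
    nr, nc = r + dr, c + dc
    return (nr, nc) if 0 <= nr < ROWS and 0 <= nc < COLS else (r, c)

def apply_action(dist, action):
    """One step: spread each cell's probability over the three outcomes."""
    new = {}
    for cell, p in dist.items():
        for move, w in [(action, 0.8),
                        (PERP[action][0], 0.1),
                        (PERP[action][1], 0.1)]:
            tgt = shift(cell, move)
            new[tgt] = new.get(tgt, 0.0) + p * w
    return new

dist = {(2, 0): 1.0}               # start in the bottom-left corner
for a in ["Up", "Up", "Right"]:    # a sample action sequence
    dist = apply_action(dist, a)
print(sorted(dist.items()))
```

Total probability stays 1 after every step, which is a handy sanity check when adding walls and terminal cells.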
-1
votes
1 answer

How to implement Markov's algorithm in Java?

I want to implement Markov's algorithm found here, but I haven't been able to. As the wiki explains it's a recursive function that replaces patterns within a language. For example "A" -> "apple" "B" -> "bag" "S" -> "shop" "T" -> "the" "the shop" ->…
Riccardo
  • 383
  • 5
  • 16
-1
votes
1 answer

How to calculate the transition probability matrix of a second order Markov Chain

I have data in this form: Broker.Position IP BP SP IP IP .. I would like to calculate the second-order transition matrix in this form: BP IP SP BPBP SPSP IPIP BPSP SPBP IPSP SPIP BPIP IPBP
Rup Mitra
  • 41
  • 1
  • 2
  • 4
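A second-order transition matrix like the one asked about above can be estimated by counting how often each ordered pair of consecutive states is followed by each next state, then normalizing each row. The sequence below is made-up sample data in the question's BP/IP/SP alphabet.

```python
from collections import Counter, defaultdict

# Estimate a second-order transition matrix from a state sequence.
# Rows are ordered pairs of consecutive states; columns are the next state.
seq = ["IP", "BP", "SP", "IP", "IP", "BP", "BP", "SP", "IP"]  # sample data

counts = defaultdict(Counter)
for a, b, c in zip(seq, seq[1:], seq[2:]):
    counts[(a, b)][c] += 1          # pair (a, b) was followed by c

# Normalize counts into conditional probabilities P(next | previous pair).
matrix = {pair: {s: n / sum(ctr.values()) for s, n in ctr.items()}
          for pair, ctr in counts.items()}
for pair, row in sorted(matrix.items()):
    print(pair, row)
```

Only pairs actually observed in the data appear as rows; unseen pairs would need smoothing or an explicit zero row, depending on the application.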
-2
votes
1 answer

Stepfun function markov

Don't be scared by my long code. What I am wondering about is the last part, the plot(stepfun... part. When I enter this into RStudio I get "stepfun: 'x' must be ordered increasingly". Does anyone here know what I have to do to finish this…
PeterNiklas
  • 75
  • 1
  • 9
-2
votes
1 answer

Birth Death Code

My question: At an institute for experimental mathematics there is a computer that helps solve problems. Problems arrive at the computer according to a Poisson process with intensity lambda per hour. The time to solve each problem can be seen as a…
PeterNiklas
  • 75
  • 1
  • 9
-2
votes
1 answer

Running Python Code

I'm completely new to Python and I am struggling to run a piece of code. I've put the code into IDLE and ran it, but it returned nothing. Below I have attached the code and also the results that should accompany it. I am just wondering how you can…
user81812
  • 9
  • 1
-2
votes
1 answer

Markov Switching Model MSwM with more than 2 states

Has anyone tried running Markov Switching Model with 'MSwM' and setting more than 2 regimes? With three it does not seem to work Data (r_t) [1] 0.0000000000 -0.0101170400 -0.0016032060 -0.0071256520 0.0007075710 -0.0021212120 0.0021257210…
Lodyk Vovchak
  • 133
  • 2
  • 12
-2
votes
2 answers

Race condition in line of Python

I have an interesting problem. I am -- for shits and giggles -- trying to write a program really shortly. I have it down to 2 lines, but it has a race condition, and I can't figure out why. Here's the gist of it: imports... ...[setattr(__main__,…
hugelgupf
  • 387
  • 5
  • 13
-2
votes
2 answers

Markov-Text generating

I've been looking into generating text. What I've learned so far is that I will have to use word-level Markov text generation. I've found a few examples of those on this site here. Knowing this wouldn't work, I tried it anyway and copied it to…
Blckpstv
  • 117
  • 3
  • 17
-2
votes
1 answer

Markov Models and Sentence Generator project (Python)!

I am in an intro to programming class and one of our final projects is to create a sentence generator. The requirements are that we have to take a sample input, strip it down to only lower case letters, use the Markov Model to determine the…
-3
votes
1 answer

Multi state models in R2BayesX

I am trying to fit a multi-state model using the R package R2BayesX. How can I do so correctly? There is no example in the manual. Here is my attempt: activity is 1/0, i.e. the states; time is time; patient id is the random effect. I want f <- activity ~…
-3
votes
1 answer

Python: Create multiple dictionaries of letter transitions

So my groupmates and I are trying to make a Markov model that finds the probability of letter transitions in a text file. In the text file we have a group of words "Steam, Teams, Meets, Teems, Eat, Ate, State, Tease, Test, Mast, Mates". In the code…
Moroth
  • 1
-3
votes
1 answer

What is the condition under which a Markov chain converges?

I'm writing a program that calculates the limit of a Markov chain. If the Markov matrix diverges, I should transform it into the form dA + (1-d)E, where both A and E are n * n matrices, and all of the elements of E are 1/n. But if I apply…
glast
  • 383
  • 1
  • 4
  • 17
-3
votes
1 answer

Markov expectation: How many stones will the hero cost?

In a game, a hero has 100% probability to go from level 0 to level 1. When at level 1, he has 1/3 probability to go to level 2, 1/3 probability to level 0, 1/3 probability to stay at level 1. When at level 2, he has 1/9 probability to win, 4/9…
xiaoming
  • 101
  • 7