Questions tagged [markov-chains]

Markov chains are systems which transition from one state to another based only upon their current state. They are used widely in various statistical domains to generate sequences based upon probabilities.

Markov chains (named after Andrey Markov, who first described them) are systems which transition from one state to another based only upon their current state. They are memoryless stochastic processes in which each state change has an associated probability.

Due to their stochastic nature, Markov chains are suitable for simulating complex real-life processes whose probabilities are well known. They are used in a wide variety of fields, with uses too in-depth to list here; an exhaustive list can be found on the associated Wikipedia page.

In programming, they are often applied to human language; Markov text generators are a particularly popular application of Markov chains.
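A minimal sketch of the idea in Python (the two weather states and all transition probabilities below are made up for illustration):

```python
import random

# Hypothetical transition probabilities: for each current state,
# a map from next state to its probability. Numbers are invented.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Pick the next state using only the current state (the Markov property)."""
    states, probs = zip(*TRANSITIONS[state].items())
    return rng.choices(states, weights=probs, k=1)[0]

def simulate(start, n, seed=0):
    """Generate a chain of n transitions starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Note that `step` consults only the current state, never the history, which is exactly the memoryless property described above.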

577 questions
0
votes
2 answers

How to remove all observations for a user after they reach a certain value

Hi everyone, I found something similar to what I need to do, but it doesn't work with my full data: How to remove rows after a particular observation is seen for the first time. What I need to do is delete every observation after the client reaches 4…
0
votes
2 answers

If a value reaches a certain threshold, the next one can't go below it

Hi everyone, I'm trying to obtain this: Problem: I'm using R to simulate a Markov chain, and state 4 is an absorbing state. I need that once the client has entered state 4, the client remains in that state for the rest of the time. Thanks…
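The standard way to model an absorbing state is to give it a self-transition probability of 1, so the chain can never leave it. A small sketch in Python (the 4-state matrix below is invented, since the question's actual probabilities aren't shown):

```python
import random

# Hypothetical 4-state transition matrix; state 3 (the fourth state)
# is absorbing: it transitions to itself with probability 1.
P = [
    [0.5, 0.3, 0.1, 0.1],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.2, 0.5, 0.2],
    [0.0, 0.0, 0.0, 1.0],  # absorbing row
]

def simulate(start, n_steps, seed=42):
    """Simulate n_steps transitions from `start`; states are 0..3."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(n_steps):
        state = rng.choices(range(4), weights=P[state], k=1)[0]
        path.append(state)
    return path

path = simulate(0, 50)
# Once state 3 is entered, every later state is also 3.
assert 3 not in path or all(s == 3 for s in path[path.index(3):])
```

No post-processing of the simulated path is needed; the absorbing row of the matrix guarantees the behavior.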
0
votes
1 answer

Emulate excel calculus behavior in Java for testing purposes

I'm implementing in Java an algorithm written with Excel formulas, based on Markov chains, and comparing the results using the Jxl library. The problem, as I have read, is that the two languages don't have the same precision. In fact, they return equal…
Lore
  • 1,286
  • 1
  • 22
  • 57
0
votes
1 answer

Estimating probabilities for model subset after grouping

The data used is available here (the file is called "figshare.txt"). I estimated the transition probabilities for a Markov model where the observations were grouped by location (group_by(km)). data <- data %>% group_by(km) %>%…
Blundering Ecologist
  • 1,199
  • 2
  • 14
  • 38
0
votes
1 answer

R - Streamlined Markov Chain

I have two data sets, annual transition probabilities and initial values. The goal is to use these to develop an idea of what a company will look like in five years. Initial values are in the form: | Age | Gender | Initial …
Daniel V
  • 1,305
  • 7
  • 23
0
votes
0 answers

Markov chains and Random walks on top of biological data

I'm coming from the field of biology, and thus I have some difficulties understanding (intuitively?) some of the ideas in that paper. I really tried my best to decipher it step by step using a lot of Google and YouTube, but now I feel it's the time…
J. Doe
  • 619
  • 4
  • 16
0
votes
1 answer

How can I get past the AttributeError: 'DataFrame' object has no attribute 'flatten' in Python?

I have used Python to create a spatial Markov matrix, but I got this error (AttributeError: 'DataFrame' object has no attribute 'flatten'). I am not familiar with Python, so I hope you can help me solve this problem. The code is: import numpy as…
yousif
  • 23
  • 1
  • 4
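For reference, the error above arises because `flatten` is a NumPy array method, not a pandas DataFrame method; converting the frame to an array first avoids it (toy data below):

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})

# A DataFrame has no .flatten() method; convert to a NumPy array first.
flat = df.to_numpy().flatten()
print(flat.tolist())  # [1, 3, 2, 4] (row-major order)
```

`df.values.flatten()` works the same way on older pandas versions.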
0
votes
0 answers

Simulating walk around a square

I am trying to simulate a walk around a square such that the probability of walking to an adjacent vertex is p/2 going left and p/2 going right, and 1-p going diagonally. I've written some code to simulate this and made a function to…
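One possible sketch of such a walk in Python (vertices numbered 0-3 around the square; p is a free parameter, and the numbering convention is an assumption):

```python
import random

def walk(p, n_steps, seed=0):
    """Walk on the 4 vertices of a square, numbered 0-3 clockwise.
    From each vertex: left neighbor with prob p/2, right neighbor
    with prob p/2, opposite (diagonal) vertex with prob 1 - p."""
    rng = random.Random(seed)
    v, path = 0, [0]
    for _ in range(n_steps):
        move = rng.choices(["left", "right", "diag"],
                           weights=[p / 2, p / 2, 1 - p], k=1)[0]
        if move == "left":
            v = (v - 1) % 4
        elif move == "right":
            v = (v + 1) % 4
        else:
            v = (v + 2) % 4
        path.append(v)
    return path

print(walk(0.6, 10))
```

The modular arithmetic keeps the walker on the square, so no boundary cases are needed.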
0
votes
0 answers

HMMLearn GaussianHMM not getting trained correctly

I have sequences of 3 observations each, and I have three such sequences. There are three hidden states. I am using GaussianHMM from the hmmlearn library. state_machine = GaussianHMM(n_components=3, covariance_type="full", n_iter=1000) …
0
votes
2 answers

Simulating a Markov Chain in R and Sequence Search

So I am working on simulating a Markov chain in R in which the states are sunny (S), cloudy (C), and rainy (R), and I am looking to figure out the probability that a sunny day is followed by two consecutive cloudy days. Here is what I have so far: …
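For a chain like this, the answer is just the product of two one-step probabilities, P(S→C) · P(C→C). A sketch in Python with an invented transition matrix (the question's own numbers are truncated), checked against simulation:

```python
import random

# Hypothetical transition matrix; the question's actual numbers aren't shown.
P = {
    "S": {"S": 0.6, "C": 0.3, "R": 0.1},
    "C": {"S": 0.3, "C": 0.4, "R": 0.3},
    "R": {"S": 0.2, "C": 0.4, "R": 0.4},
}

# Analytically: P(a sunny day is followed by two cloudy days)
# = P(S -> C) * P(C -> C) = 0.3 * 0.4 = 0.12.
exact = P["S"]["C"] * P["C"]["C"]

# Check by simulating many two-day continuations from a sunny day.
rng = random.Random(1)

def next_state(s):
    names, weights = zip(*P[s].items())
    return rng.choices(names, weights=weights, k=1)[0]

trials = 100_000
hits = 0
for _ in range(trials):
    d1 = next_state("S")
    d2 = next_state(d1)
    hits += (d1 == "C" and d2 == "C")

print(exact, hits / trials)
```

The simulated frequency should land close to the analytic 0.12 for this matrix; the same two-factor product applies with the question's real probabilities.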
0
votes
1 answer

How can I use nltk to get the chance of the next word being something?

Problem: I have one word and certain restrictions on what the second might be (for example "I _o__"). What I want is a list of words like "rode", "love", and "most", telling me how common each one is following "I". I want to…
Riley Martine
  • 193
  • 2
  • 9
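Without committing to a specific nltk API, the underlying idea is a conditional bigram count; a plain-Python sketch on a toy corpus (nltk's ConditionalFreqDist builds the same kind of table from a real corpus):

```python
from collections import Counter

# Toy tokenized corpus (an assumption; any token list works the same way).
tokens = "i rode home then i love it and i rode again".split()

# Count bigrams whose first word is "i".
after_i = Counter(b for a, b in zip(tokens, tokens[1:]) if a == "i")
total = sum(after_i.values())

# Relative frequency of each word that follows "i".
freqs = {w: c / total for w, c in after_i.items()}
print(freqs)
```

Filtering `freqs` by the pattern restriction (e.g. second letter "o", four letters) then gives the ranked candidate list the question asks for.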
0
votes
1 answer

Ocean flows model in R/Excel (millions of data)

I am building a stochastic model to predict the movement of objects floating in the ocean. I have thousands of data points from drifter buoys all around the world, in the format below: index month year lat long 72615 10 2010 35,278 129,629 72615…
0
votes
1 answer

Histogram of MC simulation

I am trying to compare histograms of a Markov chain (MC) simulation and actual data. I have tried to run the simulation using the code below, which I don't fully understand. R seems to have accepted the code, but I don't know how to run the…
Christian
  • 3
  • 4
0
votes
2 answers

How to find the Markov Chain Probability?

I am trying to find the probability that the chain jumps from state k-1 to state 1 before it hits state k. Can anyone spot my mistake? I tried to simulate the Markov chain, but I want to write code that allows me to find the probability for k = {1, 2, 3,…
Ahmed Jyad
  • 19
  • 3
0
votes
1 answer

Markov Chain Adjusting Initial State Vector to Solve For Desired Vector Element

I'm trying to increase the initial state vector in a discrete Markov chain at each step in order to solve for a state vector element at some future point in time, and it seems to be quite cumbersome. For a simple example, assume a corporation has an…