Questions tagged [markov-chains]

Markov chains are systems which transition from one state to another based only upon their current state. They are used widely in various statistical domains to generate sequences based upon probabilities.

Markov chains (named after their creator, Andrey Markov) are systems which transition from one state to another based only upon their current state. They are memoryless, semi-random processes, i.e. each state change has an associated probability.

Due to their statistical nature, Markov chains are suitable for simulating complex real-life processes where the probabilities are well known. They are used in a wide variety of fields, with uses too in-depth to list here; an exhaustive list can be found on the associated Wikipedia page.

In programming, they are especially popular for manipulating human languages - Markov text generators are a common application of Markov chains.
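A minimal sketch of that idea in Python, assuming a whitespace-tokenised corpus (the names build_chain and generate are illustrative, not from any particular library):

```python
import random
from collections import defaultdict

def build_chain(words):
    """Map each word to the list of words observed immediately after it."""
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=20):
    """Walk the chain: each next word depends only on the current word."""
    word = start
    output = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:          # dead end: no observed successor
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

corpus = "the cat sat on the mat and the dog sat on the rug".split()
chain = build_chain(corpus)
print(generate(chain, start="the"))
```

Each next word is drawn only from the successors observed for the current word, which is exactly the memoryless property described above.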

577 questions
0
votes
0 answers

R package R2BayesX multi-state model

Hello, I would like to know how to fit a multi-state model with the R package R2BayesX. I believe you have to use the bayesx function. How do I correctly do so? My data frame consists of variables such as remission, time from diagnosis, height, age, of…
0
votes
0 answers

How to subset a data frame with consecutive conditions on a binary factor column (vector) in R

I have a sequence of 1/0's indicating whether a patient is in remission or not. Assume the records of remission were taken at discrete times. How can I check the Markov property for each patient and then summarize the findings, that is the…
0
votes
0 answers

Subsetting a data frame with conditions on a binary factor column (vector) in R

I have a sequence of 1/0's indicating whether a patient is in remission or not. Assume the records of remission were taken at discrete times. How can I check the Markov property for each patient and then summarize the findings, that is the…
0
votes
1 answer

Fit and evaluate a second order transition matrix (Markov Process) in R?

I am trying to build a second-order Markov chain model, and now I am trying to find the transition matrix from the following data. dat<-data.frame(replicate(20,sample(c("A", "B", "C","D"), size = 100, replace=TRUE))) Now I know how to fit the first order…
uared1776
  • 67
  • 10
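Fitting a second-order chain amounts to counting how often each pair of consecutive states is followed by each state. The question above is about R; a hedged Python sketch of the same counting step (second_order_matrix and the toy sequence are illustrative only, not the poster's data):

```python
from collections import Counter
from itertools import product

def second_order_matrix(seq, states):
    """Estimate P(next | previous two) by counting consecutive triples."""
    counts = Counter(zip(seq, seq[1:], seq[2:]))
    matrix = {}
    for pair in product(states, repeat=2):
        total = sum(counts[(pair[0], pair[1], s)] for s in states)
        matrix[pair] = {
            s: (counts[(pair[0], pair[1], s)] / total if total else 0.0)
            for s in states
        }
    return matrix

seq = list("ABCDABCA")  # toy sequence over states A-D
probs = second_order_matrix(seq, states=list("ABCD"))
print(probs[("A", "B")])  # distribution of the state that follows the pair (A, B)
```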
0
votes
1 answer

Random permutation of numbers in MATLAB with weights

How to randomize the numbers in a vector a, with weights assigned in such a way that I can control what numbers 'follow' other numbers? Let's say: a = [ 1 2 3 4] I would like to obtain something like this: 1 2 1 3 4 2 1 4 3 4 1 3 4 1 .... My aim…
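Controlling which numbers 'follow' other numbers is equivalent to sampling a sequence from a first-order transition matrix. A hedged sketch in Python rather than MATLAB (the follow_weights values below are made up purely for illustration):

```python
import random

# Hypothetical transition weights: the row for value i gives the weights
# for whichever value comes next after i.
follow_weights = {
    1: {1: 0, 2: 3, 3: 2, 4: 1},   # after a 1, a 2 is most likely, etc.
    2: {1: 4, 2: 0, 3: 1, 4: 1},
    3: {1: 1, 2: 1, 3: 0, 4: 4},
    4: {1: 3, 2: 2, 3: 1, 4: 0},
}

def weighted_sequence(start, n):
    """Draw each value with probability proportional to its weight given the previous value."""
    value, out = start, [start]
    for _ in range(n - 1):
        candidates = list(follow_weights[value])
        weights = [follow_weights[value][c] for c in candidates]
        value = random.choices(candidates, weights=weights, k=1)[0]
        out.append(value)
    return out

print(weighted_sequence(start=1, n=15))
```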
0
votes
1 answer

Generating a Markov transition matrix with known, but stochastic, state times

I have looked for an answer for a while, but with no luck. I am trying to develop a discrete time Markov model. Presently, I have 5 states, with the 5th state being the absorbing state. I also know the variable time durations that each state…
Simon Bush
  • 407
  • 2
  • 9
0
votes
1 answer

How do I program bigram as a table in python?

I'm doing this homework, and I am stuck at this point. I can't program bigram frequency in the English language, i.e. 'conditional probability', in Python. That is, the probability of a token given the preceding token is equal to the probability of…
py.codan
  • 89
  • 1
  • 11
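A bigram table is just the pair counts divided by the counts of the preceding token. A small Python sketch, assuming the text is already tokenised (bigram_table and the sample tokens are illustrative):

```python
from collections import Counter, defaultdict

def bigram_table(tokens):
    """P(next | previous): table[prev][next] = count(prev, next) / count(prev)."""
    pair_counts = Counter(zip(tokens, tokens[1:]))
    unigram_counts = Counter(tokens[:-1])   # only tokens that have a successor
    table = defaultdict(dict)
    for (prev, nxt), c in pair_counts.items():
        table[prev][nxt] = c / unigram_counts[prev]
    return table

tokens = "i am sam sam i am i like ham".split()
table = bigram_table(tokens)
print(table["i"])   # e.g. {'am': 0.666..., 'like': 0.333...}
```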
0
votes
1 answer

Text file probability calculation (Markov Chain) - Python

I'm in a bad situation. I need to program something that's a level over my capacity. I have been given a text with 10k words; the file is called (test_file.txt). My question to you guys is: how do I get my program to count every single word, and…
Toni
  • 75
  • 1
  • 1
  • 7
0
votes
0 answers

R msm totlos producing zeros

I asked this on Cross Validated and someone suggested it may be a software question. I'm working on modeling an 8-state multi-state Markov model using the msm package for R. I was able to generate a fitted model for the data; however, I am running into…
Mark C
  • 427
  • 2
  • 7
  • 16
0
votes
1 answer

R msm package freezing

I've been trying to use the msm package to model an 8-state, multi-state Markov chain. My data set contains about 11,000 subjects, with slightly over 100k observations in total. I try to run the msm function on several subsets of the data,…
Mark C
  • 427
  • 2
  • 7
  • 16
0
votes
1 answer

What is the state space of this Markov chain?

Consider a system where two persons sit at a table and share three books. At any point in time both are reading a book, and one book is left on the table. When a person finishes reading his/her current book, he/she swaps it with the book on the…
Undisputed007
  • 639
  • 1
  • 10
  • 31
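One hedged way to read the question above: a state records which book each person is reading and which book lies on the table, so the state space is the set of assignments of the three books to those three slots. A short Python enumeration of that reading (the slot names are illustrative):

```python
from itertools import permutations

books = ("A", "B", "C")
# One plausible reading: a state records which book person 1 reads,
# which book person 2 reads, and which book lies on the table.
states = [
    {"person1": p[0], "person2": p[1], "table": p[2]}
    for p in permutations(books)
]
for s in states:
    print(s)
print(len(states), "states")   # 3! = 6 under this reading
```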
0
votes
0 answers

For loop issues for a Markov chain Monte Carlo

So here is my next problem. I am trying to loop through and find out how many of the entries in State_Space have a 1 as their 25th entry, yet it keeps telling me that the answer is 0. Here is the code. import random import…
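Without the full code it is hard to say what goes wrong; below is a hedged sketch of the counting step in Python, with a made-up State_Space, plus a reminder that under 0-based indexing the 25th entry is index 24 (a common off-by-one surprise, though not necessarily the issue here):

```python
import random

# Hypothetical stand-in for State_Space: a list of 0/1 sequences.
random.seed(0)
State_Space = [[random.randint(0, 1) for _ in range(30)] for _ in range(1000)]

# The 25th entry is index 24 in Python's 0-based indexing --
# checking entry[25] would silently look at the 26th element instead.
count = 0
for entry in State_Space:
    if entry[24] == 1:
        count += 1

print(count)
# Equivalent one-liner: sum(1 for entry in State_Space if entry[24] == 1)
```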
0
votes
1 answer

Select one or multiple random SQL rows with a WHERE condition on a large table

I've read http://explainextended.com/2009/03/01/selecting-random-rows/ that was also suggested as an answer to other questions about selecting a random row from a large table. However, I now wonder how this technique can be combined with selecting only…
Qqwy
  • 5,214
  • 5
  • 42
  • 83
0
votes
2 answers

How can I grab a specified number of words with special characters in them with RegExp?

I'm currently working with a Markov chain text generator application in Ruby that takes in a body ("corpus") of text and then generates new text based on it. The problem I currently need to solve is writing a Regexp that will return arrays…
Way Spurr-Chen
  • 405
  • 2
  • 9
0
votes
0 answers

(Sequential) weighted sampling

I need to figure out the total path (A to Z) followed by an agent through a grid of square elements. Each grid element has a probability density function Delta assigned to it that represents the probable direction (0-360) that the agent might take when…
Oliver Amundsen
  • 1,491
  • 2
  • 21
  • 40
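A hedged sketch of that kind of walk in Python, with the continuous 0-360 density coarsened to four compass directions and made-up per-cell weights standing in for Delta:

```python
import random

random.seed(1)
N = 5                                   # hypothetical N x N grid
moves = {"N": (-1, 0), "S": (1, 0), "E": (0, 1), "W": (0, -1)}

# Stand-in for the per-cell density Delta: each cell gets its own weights
# over the four compass directions (a coarse discretisation of 0-360).
delta = {
    (r, c): [random.uniform(0.1, 1.0) for _ in moves]
    for r in range(N) for c in range(N)
}

def walk(start, goal, max_steps=200):
    """Sample a direction from the current cell's weights at every step and record the path."""
    pos, path = start, [start]
    for _ in range(max_steps):
        if pos == goal:
            break
        direction = random.choices(list(moves), weights=delta[pos], k=1)[0]
        dr, dc = moves[direction]
        # Clamp to the grid, so a step into a wall leaves the agent in place.
        pos = (min(max(pos[0] + dr, 0), N - 1), min(max(pos[1] + dc, 0), N - 1))
        path.append(pos)
    return path

print(walk(start=(0, 0), goal=(N - 1, N - 1)))
```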