Questions tagged [markov-chains]

Markov chains are systems which transition from one state to another based only upon their current state. They are used widely in various statistical domains to generate sequences based upon probabilities.

Markov chains (named after the mathematician Andrey Markov) are systems which transition from one state to another based only upon their current state. They are memoryless, semi-random processes: each state change has an associated probability.

Due to their statistical nature, Markov chains are suitable for simulating complex real-life processes whose transition probabilities are well known. They are used in a wide variety of fields, with uses too numerous to list here; an exhaustive list can be found on the associated Wikipedia page.

In programming, they are especially popular for working with human languages; Markov text generators are among the most common applications of Markov chains.
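A minimal sketch of such a text generator (the corpus, function names, and order are illustrative, not taken from any particular question below):

```python
import random

def build_chain(text, order=1):
    """Map each tuple of `order` consecutive words to the words that follow it."""
    words = text.split()
    chain = {}
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain.setdefault(key, []).append(words[i + order])
    return chain

def generate(chain, length=10, seed=0):
    """Walk the chain: each next word depends only on the current key."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length):
        choices = chain.get(tuple(out[-len(key):]))
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

chain = build_chain("the cat sat on the mat and the cat ran")
print(generate(chain, length=5))
```

With a larger corpus and `order=2` the output starts to resemble the source text's style, which is the basis of most Markov text generators.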

577 questions
5
votes
2 answers

Markov Chains and decimal points in r?

I have plotted a markov chain from a matrix in r. However, I have numerous probabilities under 0.01, and thus my probability plot looks something like this: I've been searching for hours and I can't seem to find something that allows me to display…
Megan
  • 61
  • 1
5
votes
1 answer

Rewriting a pymc script for parameter estimation in dynamical systems in pymc3

I'd like to use pymc3 to estimate unknown parameters and states in a Hodgkin Huxley neuron model. My code in pymc is based off of…
5
votes
1 answer

Creating a smart text generator

I'm doing this for fun (or as 4chan says "for teh lolz") and if I learn something on the way all the better. I took an AI course almost 2 years ago now and I really enjoyed it but I managed to forget everything so this is a way to refresh…
encee
  • 4,544
  • 4
  • 33
  • 35
5
votes
4 answers

Markov chain stationary distributions with scipy.sparse?

I have a Markov chain given as a large sparse scipy matrix A. (I've constructed the matrix in scipy.sparse.dok_matrix format, but converting to other ones or constructing it as csc_matrix are fine.) I'd like to know any stationary distribution p of…
Anaphory
  • 6,045
  • 4
  • 37
  • 68
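One common approach to the stationary-distribution question above is to solve the balance equations p A = p directly, replacing one (redundant) equation with the normalization constraint sum(p) = 1. A hedged sketch on a toy 3-state chain (the matrix here is illustrative; the question's matrix A is much larger):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Toy row-stochastic transition matrix (each row sums to 1).
A = sp.csr_matrix(np.array([[0.9, 0.1, 0.0],
                            [0.4, 0.4, 0.2],
                            [0.1, 0.3, 0.6]]))

n = A.shape[0]
# Stationary p satisfies (A.T - I) p = 0; the rows of that system are
# linearly dependent, so replace the last row with the constraint sum(p) = 1.
M = (A.T - sp.identity(n)).tolil()
M[-1, :] = 1.0
b = np.zeros(n)
b[-1] = 1.0

p = spla.spsolve(M.tocsr(), b)
print(p)  # stationary distribution; p @ A ≈ p and p.sum() == 1
```

This keeps everything sparse and avoids the convergence caveats of iterative eigensolvers, at the cost of one sparse factorization.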
5
votes
2 answers

How do Markov Chains work and what is memorylessness?

How do Markov Chains work? I have read wikipedia for Markov Chain, But the thing I don't get is memorylessness. Memorylessness states that: The next state depends only on the current state and not on the sequence of events that preceded it. If…
unknown
  • 4,859
  • 10
  • 44
  • 62
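The memorylessness property this question asks about can be shown in a few lines: the sampler below consults only the current state, never the accumulated history (the weather states and probabilities are made up for illustration):

```python
import random

# Transition probabilities: for each state, a list of (next_state, probability).
P = {"sunny": [("sunny", 0.8), ("rainy", 0.2)],
     "rainy": [("sunny", 0.4), ("rainy", 0.6)]}

def step(state, rng):
    """Sample the next state; note the function sees only `state`."""
    r = rng.random()
    acc = 0.0
    for nxt, prob in P[state]:
        acc += prob
        if r < acc:
            return nxt
    return nxt  # guard against floating-point rounding

rng = random.Random(42)
state, history = "sunny", []
for _ in range(5):
    history.append(state)
    state = step(state, rng)  # depends on `state` alone, not on `history`
print(history)
```

However long the walk gets, the distribution of the next state is fully determined by the current one, which is exactly the memorylessness the excerpt quotes.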
5
votes
1 answer

Difference Between J48 and Markov Chains

I am attempting to do some evaluation of the relative rates of different algorithms in the C# and F# realms using WekaSharp and one of the algorithms I was interested in was Markov Chains. I know Weka has an HMM application but I have not been able…
5
votes
2 answers

Estimating confidence intervals of a Markov transition matrix

I have a series of n=400 sequences of varying length containing the letters ACGTE. For example, the probability of having C after A is: and which can be calculated from the set of empirical sequences, thus Assuming: Then I get a transition…
HCAI
  • 2,213
  • 8
  • 33
  • 65
5
votes
3 answers

Convert text prediction script [Markov Chain] from javascript to python

I've been trying for the last couple of days to convert this JS script to Python code. My implementation (mostly a blind copy, some minor fixes here and there) so far: import random class markov: memory = {} separator = ' ' order = 2 def…
Lopofsky
  • 518
  • 6
  • 15
4
votes
2 answers

Markov chain on letter scale and random text

I would like to generate a random text using letter frequencies from a book in a .txt file, so that each new character (string.lowercase + ' ') depends on the previous one. How do I use Markov chains to do so? Or is it simpler to use 27 arrays with…
Julia
  • 1,369
  • 4
  • 18
  • 38
4
votes
2 answers

Markov Chain: Find the most probable path from point A to point B

I have a transition matrix using dictionary {'hex1': {'hex2': 1.0}, 'hex2': {'hex4': 0.4, 'hex7': 0.2, 'hex6': 0.2, 'hex1': 0.2}, 'hex4': {'hex3': 1.0}, 'hex3': {'hex6': 0.3333333333333333, 'hex2': 0.6666666666666666}, 'hex6': {'hex1':…
Regalia9363
  • 342
  • 2
  • 14
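A standard way to answer this most-probable-path question: the path maximizing the product of transition probabilities is the shortest path under edge weights -log(p), so Dijkstra's algorithm applies (weights are non-negative since p <= 1). A sketch on an illustrative graph, not the question's truncated dictionary:

```python
import heapq
import math

# Illustrative transition dictionary in the same shape as the question's.
T = {"A": {"B": 0.5, "C": 0.5},
     "B": {"D": 0.9, "C": 0.1},
     "C": {"D": 0.4, "B": 0.6},
     "D": {}}

def most_probable_path(T, start, goal):
    """Dijkstra with weight -log(p); total weight is -log(path probability)."""
    heap = [(0.0, start, [start])]
    best = {}
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == goal:
            return math.exp(-cost), path  # first pop of goal is optimal
        if best.get(node, float("inf")) <= cost:
            continue
        best[node] = cost
        for nxt, p in T[node].items():
            if p > 0:
                heapq.heappush(heap, (cost - math.log(p), nxt, path + [nxt]))
    return 0.0, None

prob, path = most_probable_path(T, "A", "D")
print(path, prob)
```

Here A→B→D wins with probability 0.5 × 0.9 = 0.45, beating both paths through C.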
4
votes
3 answers

Multiplication of many matrices in R

I want to multiply several matrices of the same size with an inital vector. In the example below p.state is vector of m elements and tran.mat is list where each member is an m x m matrix. for (i in 1:length(tran.mat)){ p.state <- p.state %*%…
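The R loop in this question is a left fold of matrix multiplication over the list; the same idea in Python/NumPy (illustrative matrices, and `reduce` playing the role of the `for` loop over `tran.mat`):

```python
import numpy as np
from functools import reduce

# Initial state distribution and a list of per-step transition matrices.
p0 = np.array([0.2, 0.8])
mats = [np.array([[0.9, 0.1], [0.3, 0.7]]),
        np.array([[0.5, 0.5], [0.2, 0.8]])]

# Equivalent of the R loop `p.state <- p.state %*% tran.mat[[i]]`:
p = reduce(lambda v, M: v @ M, mats, p0)
print(p)
```

Folding the vector through the list is O(m²) per step; multiplying the matrices together first would be O(m³) per step, so keeping the vector on the left is usually the faster order.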
4
votes
1 answer

Using Markov Chains for time series data

Here is how my data frame is currently structured (first 6 rows). The data I used is available here. ID date sps time pp datetime km 1 2012-06-19 MICRO 2:19 0 2012-06-19 02:19 80 2 2012-06-21 MUXX 23:23 …
Blundering Ecologist
  • 1,199
  • 2
  • 14
  • 38
4
votes
3 answers

creating a transitions matrix for markov Model

I need help on a topic related to markov chains and preprocessing of data. Suppose I have the following matrix relating individuals to states over time: ID Time1 Time2 1 14021 A A 2 15031 B A 3 16452 A C I would like to…
Arrebimbomalho
  • 176
  • 1
  • 12
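The preprocessing this question describes, turning per-individual (Time1, Time2) state pairs into a row-normalized transition matrix, can be sketched directly from the example rows (column names and zero-row handling are assumptions):

```python
from collections import Counter

# (state at Time1, state at Time2) per individual, as in the question's table.
rows = [("A", "A"), ("B", "A"), ("A", "C")]

counts = Counter(rows)
states = sorted({s for pair in rows for s in pair})

# Row-normalize counts into transition probabilities; states never observed
# at Time1 get an empty row here (another convention: a uniform row).
matrix = {}
for s in states:
    total = sum(counts[(s, t)] for t in states)
    matrix[s] = {t: counts[(s, t)] / total for t in states} if total else {}
print(matrix)
```

From the three example rows: A goes to A half the time and to C half the time, B always goes to A, and C has no observed outgoing transitions.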
4
votes
0 answers

How to evaluate Markov Model accuracy

I have created the following Markov chain Model. And I am struggling to prove mathematically that my model works correctly, or doesn't work. Sequence: Start, state1, state2, state3, state3, state2, state1, state2, state1, end States: start, state1,…
4
votes
0 answers

Markov chain graphics in igraph

I don't understand why the transition state labels are incorrect when I plot my markov chain as an igraph object. The labels are correct when plotted through the markovchain package. Probably something simple I am doing wrong. Example…
mdsailing
  • 123
  • 3