Questions tagged [markov-chains]

Markov chains are systems which transition from one state to another based only upon their current state. They are used widely in various statistical domains to generate sequences based upon probabilities.

Markov chains (named after Andrey Markov, who first described them) are systems which transition from one state to another based only upon their current state. They are memoryless, semi-random processes: each state change has an associated probability.

Due to their statistical nature, Markov chains are well suited to simulating complex real-life processes whose transition probabilities are well known. They are used in a wide variety of fields, too many to cover in depth here; an exhaustive list can be found on the associated Wikipedia page.

In programming, they are especially popular for modeling human language - Markov text generators are among the best-known applications of Markov chains.
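As a minimal sketch of the Markov text generators mentioned above (the function names `build_chain` and `generate` are illustrative, not taken from any question below), a first-order word-level generator can look like this:

```python
import random

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = {}
    for current, following in zip(words, words[1:]):
        chain.setdefault(current, []).append(following)
    return chain

def generate(chain, start, length=10, seed=None):
    """Walk the chain from `start`, picking each next word at random.

    Because followers are stored with repetition, random.choice picks
    each successor in proportion to how often it followed the word.
    """
    rng = random.Random(seed)
    word = start
    output = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:  # dead end: this word never had a successor
            break
        word = rng.choice(followers)
        output.append(word)
    return " ".join(output)
```

For example, `build_chain("the cat sat on the mat")` maps `"the"` to `["cat", "mat"]`, so generation from `"the"` continues with either word with equal probability.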

577 questions
0
votes
1 answer

Markov chain: Python to Processing

I made a python project based on markov chains to create sentences. Now I have to make the same thing but in Processing. Here is the python code I need help with: def createProbabilityHash(words): numWords =…
0
votes
1 answer

DTMC Markov Chain - How to get the stationary vector

For a Discrete Time Markov Chain problem, I have the following: 1) Transition matrix:
0.6 0.4 0.0 0.0
0.0 0.4 0.6 0.0
0.0 0.0 0.8 0.2
1.0 0.0 0.0 0.0
2) Initial probability vector: 1.0 0.0 0.0 0.0 So, I wrote the following SciLab…
0
votes
2 answers

Markov chain stationary distribution's dependence on the initial state

As a property of a Markov chain, the stationary distribution is widely used in many fields, such as PageRank. However, since the distribution is just a property of the transition matrix and has nothing to do with the initial state of the…
Red Lv
  • 1,369
  • 4
  • 13
  • 12
0
votes
1 answer

Matlab: PDF from a Markov Chain

I have generated the Markov Chain using Matlab. From the generated Markov Chain, I need to calculate the probability density function (PDF). How should I do it? Should I use the generated Markov Chain directly in any of the PDF…
Ram Sundar
  • 13
  • 1
  • 6
0
votes
0 answers

Invert Markov Chain

A Markov Chain's sequence of states is fully characterized by its birth rate l(t) (for lambda) and its death rate m(t) (for mu), given an initial probability distribution P0 of the initial state. There is also a maximum number of states c (for…
Vladimir S.
  • 450
  • 2
  • 10
  • 23
0
votes
1 answer

Markov chain transition matrix from vector of probabilities

The complete data.frame overview: 'data.frame': 29 obs. of 3 variables: $ FirmDatum : Date, format: "1982-12-31" "1983-03-31" "1983-06-30" ... $ fittedSurv: num 0.884 0.839 0.779 0.746 0.817 ... $ Rating : chr "Aa" "Aaa" "B" "Bb" ... The…
Maximilian
  • 4,177
  • 7
  • 46
  • 85
0
votes
2 answers

Track text cursor, to display menu above

I want to build a predictive sentence advisor, like (Onion News) Apple Introduces Revolutionary New .... Sentences can probably be generated by a stylized pseudo-random text-generating algorithm like a Markov chain. I imagine this as: while typing,…
Margus
  • 19,694
  • 14
  • 55
  • 103
0
votes
1 answer

Advanced video analysis - how to approach it?

I need to build an application that captures video data from a camcorder, does some processing (Monte Carlo methods, Markov fields and chains, etc.), and saves the data as a video file, enriched with an overlay containing information about the processing…
0
votes
2 answers

Count the number of times a string appears in a sequence

I have a matrix X which comprises some sequences I have from a Markov Chain. I have 5 states 1,2,3,4,5. So, for example, row 1 is a sequence and row 2 a separate, independent sequence. 4 4 4 4 4 5 3 0 0 0 1 4 2 2 2…
HCAI
  • 2,213
  • 8
  • 33
  • 65
0
votes
1 answer

Comparing and visualising groups of sequences

I have two groups A and B of strings of the letters "AGTE", and I'd like to find some way of comparing them to see whether they are statistically similar. The first group, A, contains real-world observations; B contains predictions. There are 400 or so in each…
HCAI
  • 2,213
  • 8
  • 33
  • 65
0
votes
2 answers

Generating a Markov model from a matrix

The definition may be wrong, so please correct me if that is so. I need to generate a Markov model from a matrix of the following kind: four two "e" four two "e" three three "e" one three "e" zero …
Stpn
  • 6,202
  • 7
  • 47
  • 94
-1
votes
1 answer

Is it possible to have a variable number of elifs in an if/elif/else chain?

I am playing around with Markov chains, and depending on the number of states I need the same number of ifs. So if I happen to have a chain with 4 states (from 0 to 3), my logic is as follows: startingstate = if state == 0: …
Stefan 44
  • 157
  • 1
  • 9
-1
votes
1 answer

How to execute one part of the code many times (~400) and output the results to a file, with one item per line?

I am working on a system that generates a language for use in fantasy storytelling and need a Markov generator for it. I was able to find an open-source Markov generator in Python and modify it to work one word at a time. The problem is I don't…
-1
votes
1 answer

Compute the equilibrium probabilities of the Markov chain using Jacobi Iteration in Python

I am trying to compute a function that calculates the equilibrium probabilities of a Markov chain. For this problem, I have already got my transition matrix. Now I am trying to define a function called Jacobi, but am confused about the most effective…
-1
votes
2 answers

Checking to see if an object already exists and executing a bit of code if it does

This is in Python. I want to write a simple Markov-Chain thingy in about 75 lines of code on my calculator. The only modules I'm able to import are "random" and "math". Here's a quick copy I made; it should work in the Python terminal, where it is supposed…