Questions tagged [markov-chains]

Markov chains are systems which transition from one state to another based only upon their current state. They are used widely in various statistical domains to generate sequences based upon probabilities.

Markov chains (named after their creator, Andrey Markov) are systems which transition from one state to another based only upon their current state. They are memoryless, semi-random processes: each state change has an associated probability.

Due to their stochastic nature, Markov chains are suitable for simulating complex real-life processes whose transition probabilities are well known. They are used in a wide variety of fields, with applications too numerous to list here; an exhaustive list can be found on the associated Wikipedia page.

In programming, they are especially popular for manipulating human languages; Markov text generators are a common application of Markov chains.
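As a concrete illustration of the text-generation use case, a minimal word-level generator might look like this in Python (the corpus and function names are invented for the example):

```python
import random
from collections import defaultdict

def build_chain(words):
    """Map each word to the list of words observed immediately after it."""
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length):
    """Walk the chain, picking each next word uniformly among observed successors."""
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: the current word was never followed by anything
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran".split()
chain = build_chain(corpus)
print(generate(chain, "the", 6))
```

Because each next word depends only on the current one, this is exactly the memoryless property described above, applied to text.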

577 questions
0
votes
1 answer

Two problems writing a script to compute a Markov joint distribution (in Python)

I'm new to Python, and recently I've been working on a project to compute the joint distribution of a Markov process. An example of a stochastic kernel is the one used in a recent study by Hamilton (2005), who investigates a nonlinear…
zlqs1985
  • 509
  • 2
  • 8
  • 25
0
votes
1 answer

Markov Chain Transition Matrix: MATLAB function sparse - index exceeds matrix dimensions

I'm referring to this question: Estimate Markov Chain Transition Matrix in MATLAB With Different State Sequence Lengths. The described procedure worked perfectly for me, but I'm not able to adapt the last MATLAB command to create the transition…
LarsVegas
  • 65
  • 6
0
votes
2 answers

VBA coding - Generating Random Variable

I want to write code in Excel VBA. I have prepared the algorithm shown below, but I don't know exactly how to write it in VBA. Can someone help me with this? 1) Assign initial values: Current is 1, Year is 0, Result is empty…
Jonathan
  • 37
  • 1
  • 1
  • 7
0
votes
1 answer

Negative eigenvectors from transition matrix of a Markov chain

I have the following snippet to calculate the steady state of a transition matrix: import numpy as np import scipy.linalg as la if __name__ == "__main__": P = np.array([[0.5, 0.2 , 0.3, 0], [0.5, 0 , 0.1 , 0.4], …
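The snippet above is cut off, but the symptom it describes has a standard explanation: `eig` returns eigenvectors only up to an arbitrary scale and sign, so a "negative" stationary vector usually just needs rescaling. A sketch using an illustrative matrix (not the truncated one from the question):

```python
import numpy as np

# Illustrative 2-state row-stochastic matrix (the matrix in the question is truncated).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# The stationary distribution is the left eigenvector of P for eigenvalue 1,
# i.e. an eigenvector of P.T.
vals, vecs = np.linalg.eig(P.T)

# eig returns eigenvectors only up to scale and sign, which is why entries can
# come out negative: pick the eigenvector for eigenvalue 1 and rescale it so it
# sums to 1 (dividing by the sum also flips the sign when needed).
idx = np.argmin(np.abs(vals - 1.0))
pi = np.real(vecs[:, idx])
pi = pi / pi.sum()
print(pi)  # non-negative entries that sum to 1
```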
0
votes
1 answer

Python Markov chain, user input string to transition matrix

I'm making a weather prediction program in Python using a Markov chain. The program asks the user for input on past days' weather and predicts the next seven days. I was wondering how I can change my transition matrix so it uses the percentages of…
Joe
  • 21
  • 1
  • 4
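For questions like the one above, the usual way to build a transition matrix from an observed sequence is to count consecutive pairs of states and normalize each row. A sketch (the state names and observation list are invented for the example):

```python
import numpy as np

def transition_matrix(observations, states=("sunny", "rainy")):
    """Estimate transition probabilities from an observed state sequence.

    Rows are normalized counts: P[i, j] = count(i -> j) / count(i -> anything).
    """
    index = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(observations, observations[1:]):
        counts[index[a], index[b]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1  # avoid dividing by zero for unseen states
    return counts / row_sums

obs = ["sunny", "sunny", "rainy", "sunny", "rainy", "rainy"]
P = transition_matrix(obs)
print(P)
```

Each new observation the user enters just adds one more pair to the counts, so the matrix updates naturally as more history is supplied.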
0
votes
1 answer

Proof that each row of the self-product of a transition matrix sums to 1

I am unable to prove that the sum of each row of the self-product of a transition matrix is 1. Let A be a transition probability matrix, which means that each row of A sums to 1, and let P = A*A. I want to prove that P is also a valid transition…
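For the record, the claim in this question has a two-line proof. If every row of A sums to 1, then for P = A·A:

    sum_j P(i,j) = sum_j sum_k A(i,k)·A(k,j)
                 = sum_k A(i,k) · sum_j A(k,j)
                 = sum_k A(i,k) · 1
                 = 1

so each row of P sums to 1 as well, and since P's entries are sums of products of non-negative numbers, P is a valid transition matrix. By induction the same holds for any power A^n.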
0
votes
1 answer

multiple transitions of Markov model

I'm trying to run a data frame through multiple transitions of a Markov model for a class. The data frame looks like this: df = pd.DataFrame({'Bull Market': [.9, .8, .5], 'Bear Market': [.25, .05, .25], …
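For "multiple transitions" questions like this one, the key fact is that n steps of a Markov chain correspond to the n-th power of the transition matrix. A sketch with an invented 2-state matrix, since the data frame in the question is truncated:

```python
import numpy as np

# Illustrative 2-state matrix (the data frame in the question is truncated).
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

start = np.array([1.0, 0.0])  # begin fully in state 0

# Running a distribution through n transitions is a matrix power:
n = 3
after_n = start @ np.linalg.matrix_power(P, n)
print(after_n)  # distribution over the two states after three steps
```

`matrix_power` avoids the accumulation of a hand-written loop, though `start @ P` repeated n times gives the same result.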
0
votes
0 answers

unified estimation of discrete Markov Model

Background I have a multivariate dataset, say M x N, where M is the number of variables and N is the number of samples. Now, the pattern of dependencies between the M variables changes across the N samples i.e. the pattern of dependencies is…
0
votes
1 answer

C++: efficient stack for a multicore application

I am trying to code a multicore Markov chain in C++, and while I am trying to take advantage of the many CPUs (up to 24) to run a different chain on each one, I have a problem picking the right container to gather the results of the numerical…
Learning is a mess
  • 7,479
  • 7
  • 35
  • 71
0
votes
1 answer

Markov chain - Likelihood of sample with "unseen" observations (probability 0)

I have a large Markov chain and a sample for which I want to calculate the likelihood. The problem is that some observations or transitions in the sample don't occur in the Markov chain, which makes the total likelihood 0 (or the log-likelihood -…
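One common remedy for the zero-probability problem described above (an assumption about what the asker wants, since the question is truncated) is additive smoothing — giving every transition a small pseudo-count:

```python
import numpy as np

def smoothed_transition_matrix(counts, alpha=1.0):
    """Add-alpha (Laplace) smoothing: every transition gets a small pseudo-count,
    so unseen transitions receive a nonzero probability and the log-likelihood
    of a sample stays finite."""
    counts = np.asarray(counts, dtype=float) + alpha
    return counts / counts.sum(axis=1, keepdims=True)

counts = np.array([[5, 0],
                   [2, 3]])
P = smoothed_transition_matrix(counts)
print(P)  # no zero entries, and each row still sums to 1
```

Smaller values of `alpha` distort the estimated probabilities less but leave unseen transitions closer to zero; the right trade-off depends on the sample size.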
0
votes
1 answer

Get previous word in a bigram model

I am trying to implement Markov chains and need to compute the probability of the previous word. I have created a data frame and tried both a mutate and a for loop. In both cases for some reason it is always returning only the 1st element's…
Tinniam V. Ganesh
  • 1,979
  • 6
  • 26
  • 51
0
votes
1 answer

MATLAB code for using a Markov chain for evaluating an entropy noise source

I am trying to translate page 68 of this PDF into MATLAB code: http://csrc.nist.gov/publications/drafts/800-90/draft-sp800-90b.pdf#page=68 I have included these instructions as an image here: As I know little or nothing about Markov chains, I am…
user3788941
0
votes
2 answers

Very large, very sparse Markov Transition Matrices

I have 10 variables, each with 10 individual states (deciles), and I am trying to create a 2D Markov transition matrix. This would imply a matrix of 10^10 rows and 10^10 columns, which would be very sparse. This is far too large to work with, but I am…
lukehawk
  • 1,423
  • 3
  • 22
  • 48
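For matrices this sparse, one workable approach (a sketch, not the only option; `scipy.sparse` is another) is to store only the observed transitions in a dictionary keyed by (row, column):

```python
from collections import defaultdict

class SparseTransitions:
    """Dictionary-of-keys storage for a huge, mostly-empty transition matrix:
    only observed (from_state, to_state) pairs take any memory."""

    def __init__(self):
        self.counts = defaultdict(float)
        self.row_totals = defaultdict(float)

    def add(self, i, j, weight=1.0):
        self.counts[(i, j)] += weight
        self.row_totals[i] += weight

    def prob(self, i, j):
        total = self.row_totals.get(i, 0.0)
        return self.counts.get((i, j), 0.0) / total if total else 0.0

T = SparseTransitions()
T.add(3, 7)
T.add(3, 9)
print(T.prob(3, 7))  # 0.5
```

Memory scales with the number of observed transitions rather than with 10^10 × 10^10, and unobserved pairs simply read back as probability 0.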
0
votes
1 answer

How to solve 2D Markov Chains with Infinite State Space

I have a 2-dimensional Markov chain and I want to calculate steady-state probabilities and then basic performance measurements such as the expected number of customers, expected waiting time, etc. You can check the transition rate diagram link…
alamaranka
  • 103
  • 6
0
votes
1 answer

For loop error in R

I am trying to simulate a Markov chain using code that wasn't made for it. My for loop is not correct and I continue to get error messages; I kept track of the brackets and syntax but I cannot quite put my finger on the issue. The matrix for…
William Bernard
  • 359
  • 4
  • 15