Questions tagged [markov]

The Markov property refers to the memoryless property of a stochastic process.

Overview

From Wikipedia,

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.

A Markov process (Y_t) satisfies:

P(Y_{t+1} = y | Y_t = y_t, Y_{t-1} = y_{t-1}, …, Y_0 = y_0) = P(Y_{t+1} = y | Y_t = y_t)
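As a sanity check on the memoryless property, one can simulate a small chain and verify that the empirical next-state frequencies depend only on the current state. A minimal Python sketch; the two-state matrix below is hypothetical:

```python
import random

# Hypothetical two-state chain: P[i][j] = probability of moving from state i to j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Draw the next state given only the current state (the Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n_steps, start=0, seed=42):
    rng = random.Random(seed)
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1], rng))
    return states

path = simulate(10_000)
# The empirical P(next = 0 | current = 0) should approach P[0][0] = 0.9,
# regardless of what happened before the current state.
from_zero = [nxt for cur, nxt in zip(path, path[1:]) if cur == 0]
print(sum(1 for s in from_zero if s == 0) / len(from_zero))
```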

Tag usage

Please consider asking questions about statistics and data analysis on the Cross Validated Stack Exchange site.

255 questions
0
votes
0 answers

My Markov-chain text generator routine is incomplete; how should I approach a pos/word/freq data structure?

I want to create a simple text generator with Markov chains. I don't understand how the Java 'random' routines are used and what data structures to use. For example, let's say I have a routine to load a document and then a markov routine to generate…
Berlin Brown
  • 11,504
  • 37
  • 135
  • 203
0
votes
1 answer

Markov Chains and equilibrium probability

The fire danger during the summer in Mount Baker National Forest is classified into one of three danger levels. These are 1 =low, 2 =moderate, 3 =high. The probability of daily transitions between these states is given by the following flow…
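The equilibrium probability asked about here is the stationary distribution π satisfying πP = π. Since the excerpt's flow diagram is truncated, the 3×3 matrix below is a hypothetical stand-in; a sketch using the left eigenvector of P for eigenvalue 1:

```python
import numpy as np

# Hypothetical transition matrix for danger levels 1=low, 2=moderate, 3=high
# (the question's actual matrix is truncated in the excerpt).
P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# The stationary distribution pi solves pi @ P = pi, i.e. it is a left
# eigenvector of P with eigenvalue 1. Normalize it to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()
print(pi)  # long-run fraction of days spent at each danger level
```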
0
votes
1 answer

Can I make an unchordal MRF equivalent to a chordal MRF?

Here, by equivalence I mean: will the distribution (the entire table) be equal in both cases?
0
votes
1 answer

Framework of Cart Pole w/ Reinforcement Learning

I am working on a side project that is modelling the inverted pendulum problem and solving it with a reinforcement learning algorithm, most notably Q-Learning. I have already engineered a simple MDP solver for a grid world - easy stuff. However, I…
0
votes
0 answers

GMRF model for images

Can anyone explain how the parameters of the GMRF model can be estimated for an image using MATLAB? I have tried toolboxes like UGM (http://www.di.ens.fr/~mschmidt/Software/UGM/trainMRF.html).
Abhishek Thakur
  • 16,337
  • 15
  • 66
  • 97
0
votes
1 answer

How to randomly change the transition matrix used by a Markov chain?

I'm using a first-step transition matrix to generate DNA sequences. Now I need to give the transition matrix a probability of changing every 1000 steps. Let's say, every 1000 steps, there is a 40% probability the transition matrix will…
Frank
  • 19
  • 1
  • 5
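A hedged sketch of the setup this question describes: generate a base sequence from a row-stochastic 4×4 matrix over A/C/G/T, and every 1000 steps redraw the matrix with 40% probability. The helper names and the matrices themselves are hypothetical:

```python
import random

rng = random.Random(0)
BASES = "ACGT"

def random_matrix():
    """Hypothetical helper: a random 4x4 row-stochastic matrix over A, C, G, T."""
    rows = []
    for _ in range(4):
        w = [rng.random() for _ in range(4)]
        total = sum(w)
        rows.append([x / total for x in w])
    return rows

def generate(n, switch_every=1000, switch_prob=0.4):
    P = random_matrix()
    seq = [rng.randrange(4)]
    for t in range(1, n):
        # Every `switch_every` steps, replace the matrix with 40% probability.
        if t % switch_every == 0 and rng.random() < switch_prob:
            P = random_matrix()
        seq.append(rng.choices(range(4), weights=P[seq[-1]])[0])
    return "".join(BASES[i] for i in seq)

dna = generate(5000)
print(dna[:60])
```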
0
votes
1 answer

Why am I getting an IndexError from a random choice function?

I am trying to run this code from the beginner's Python book Think Python to do Markov analysis on a text file. When I run the code provided as a solution, I get an IndexError: list index out of range from the random.py module. What do I need to…
Cass
  • 870
  • 8
  • 21
0
votes
2 answers

Markov entropy when probabilities are uneven

I've been thinking about information entropy in terms of the Markov equation: H = -SUM(p(i) lg(p(i))), where lg is the base 2 logarithm. This assumes that all selections i have equal probability. But what if the probability in the given set of choices…
InvalidBrainException
  • 2,312
  • 8
  • 32
  • 41
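Worth noting: H = -Σ p(i) lg p(i) does not assume equal probabilities; it handles uneven distributions directly and equals lg(n) only in the uniform case. A small illustration:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; works for uneven probabilities directly."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform: lg(4) = 2.0 bits
print(entropy([0.7, 0.1, 0.1, 0.1]))      # uneven: strictly less than 2.0 bits
```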
-1
votes
1 answer

What does it mean that P(λ) is the prior probability in a Hidden Markov Model?

Given the following parameters: λ = (A,B,π). A = the state transition matrix A = { a[i][j] } = { P(state q[i] at t | state q[j] at t+1) }, B = the observation matrix and π = the initial distribution. Is the sentence below correct? (making…
JMFS
  • 297
  • 1
  • 4
  • 11
-1
votes
1 answer

How to convert a pandas dataframe into a transition matrix

I want to convert my pandas dataframe into a Markov chain transition matrix. import pandas as pd dict1={'state_num_x': {0: 0, 1: 1, 2: 1,3: 1,4: 2,5: 2,6: 2,7: 3,8: 3,9: 4,10: 5,11: 5, 12: 5,13: 5,14: 5,15: 5,16: 6,17: 6,18: 6,19:…
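One common approach (a sketch with made-up data, since the question's dict is truncated) is to pair each state with its successor and use pd.crosstab with normalize="index" to get a row-stochastic transition matrix:

```python
import pandas as pd

# Hypothetical state sequence standing in for the truncated dict in the question.
df = pd.DataFrame({"state": [0, 1, 1, 1, 2, 2, 2, 3, 3, 4, 5, 5]})

# Pair each state with its successor, count transitions, and normalize
# each row so it sums to 1 (a row-stochastic transition matrix).
cur = df["state"][:-1].reset_index(drop=True).rename("from")
nxt = df["state"][1:].reset_index(drop=True).rename("to")
T = pd.crosstab(cur, nxt, normalize="index")
print(T)
```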
-1
votes
1 answer

Question about constructing transition matrix for a scenario

I am facing a problem constructing a transition probability matrix while studying, and the following is the scenario of the question: **Assuming a phone has had i faults (for i = 0,1,2,3 the probability of having another fault is p, independently…
Ben
  • 1
  • 1
-1
votes
1 answer

How do I perform a spatial clustering (lat/lng) analysis with the Markov Cluster Algorithm?

I have a database where GPS points are saved. They represent POIs for customers. Now I want to group them by proximity to construct unified communications for the resulting groups. It's for a marketing purpose! I think the MCL approach can help me because…
soprath
  • 1
  • 1
-1
votes
2 answers

Finding conditional and joint probabilities from a simulation

Consider the Markov chain with state space S = {1, 2} and transition matrix and initial distribution α = (1/2, 1/2). Suppose, the source code for simulation is the following: alpha <- c(1, 1) / 2 mat <- matrix(c(1 / 2, 0, 1 / 2, 1), nrow = 2, ncol…
user366312
  • 16,949
  • 65
  • 235
  • 452
-1
votes
2 answers

Markov chain generator

The generator should take a starting point (an integer). With each pass of the resultant generator object to next, a random step from the last point returned (or the starting point if no point has yet been returned) should be performed. The result…
roadrunner
  • 45
  • 6
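A minimal sketch of the generator described above; the ±1 step distribution is an assumption, since the excerpt truncates before specifying the step rule:

```python
import random

def markov_walk(start, rng=None):
    """Generator: each call to next() takes a random +1/-1 step from the
    last point returned (or from `start` before any point is returned)."""
    rng = rng or random.Random()
    point = start
    while True:
        point += rng.choice([-1, 1])
        yield point

# Usage: a seeded walk starting at 10.
walk = markov_walk(10, random.Random(0))
print([next(walk) for _ in range(5)])
```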
-1
votes
1 answer

Placing the values on the column header and row header

This code works perfectly fine. I just need help placing the tuple-value combinations of the matrix on the columns as well as the rows: from __future__ import division import seaborn as sns; sns.set() def transition_matrix(transitions): states = 1+…
lpt
  • 931
  • 16
  • 35