Questions tagged [markov]

The Markov property refers to the memoryless property of a stochastic process.

Overview

From Wikipedia:

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.

A Markov process (Yt) can be expressed as:

P(Y_{t+1} = y | Y_t = y_t, Y_{t-1} = y_{t-1}, …, Y_0 = y_0) = P(Y_{t+1} = y | Y_t = y_t)
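For intuition, the memoryless property can be sketched with a small simulation. The two-state chain and its transition probabilities below are made up for illustration; the point is that each step is sampled using only the current state, never the earlier path.

```python
import random

# Hypothetical two-state chain over "A" and "B" (illustrative numbers).
# Each entry maps the current state to (next_state, probability) pairs.
TRANSITIONS = {
    "A": [("A", 0.9), ("B", 0.1)],
    "B": [("A", 0.5), ("B", 0.5)],
}

def step(state):
    """Sample the next state; it depends only on `state`, not on the history."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n_steps):
    """Return a path of n_steps + 1 states, beginning at `start`."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("A", 10))
```

Note that `simulate` never inspects anything but the last element of the path, which is exactly the Markov property in code form.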

Tag usage

Please consider asking questions about statistics and data analysis on Cross Validated, the statistics Stack Exchange site.

255 questions
0 votes, 0 answers

How can I set the states (names) after using markovchainFit for plotting?

Here is my contrived dataset/logic to plot markovchain sequences <- list( c("Opened", "2709342", "END"), c("Opened", "3067630", "END") ) sequencesMarkovchainFit <- function(sequences){ # Create a continuous sequence string…
Susol
0 votes, 1 answer

MSGARCH package in R

In a "rugarch" package garch specification looks like this: ugarchspec(variance.model = list(model = "sGARCH", garchOrder = c(1, 1), submodel = NULL, external.regressors = NULL, variance.targeting = FALSE), mean.model = list(armaOrder = c(1, 1),…
0 votes, 0 answers

How to make an output of a function into a matrix

I'm quite new to R, so this is a problem I am encountering. I am currently using a Markov chain from the 1st iteration to the 45th. I need to use the subsequent n and n+1 iterations in a function like this:…
0 votes, 1 answer

How to create a transition probability matrix inside the transition probability matrix?

I am a beginner in Matlab. I have to create a 3D matrix using a Markov chain based approach. In order to understand my question, I request you to see the picture first. This approach has a table of the big 3D matrix with velocity and acceleration (i.e…
0 votes, 1 answer

How to use policy iteration to solve a general environment agent in Java?

I know how to solve grid world using the policy iteration method, but how can I solve a general environment? My data is like this: This is part of my data; it describes the transition model. Please note that the source and destination types are String. I…
Qing Li
0 votes, 1 answer

Markov Sentence Generator; How to input sample text

I found a Markov sentence generator for Python on GitHub and want to feed Dr. Seuss into it, but I cannot figure out how. On the GitHub page ( https://github.com/hrs/markov-sentence-generator ) all it says is "$ ./sentence-generator.py filename…
0 votes, 0 answers

Simulate transitions between states in R

So I have a data table containing about 100 rows of data, each row representing a different person. For each person, the transition probabilities to four different states are listed in columns. I've got a snippet of the table. What is the…
0 votes, 1 answer

Adding seasonal variations to wind speed time series

Following up on an R blog post that is interesting and quite useful for simulating the time series of an unknown area using its Weibull parameters. Although this method gives a reasonably good estimate of the time series as a whole, it suffers a great deal…
SamAct
0 votes, 2 answers

Build Markov Chains

I want to compute the transition matrix for a Markov chain model in order to build a recommender system. My data is in the form: Date StudentID Subjectid 201601 123 1 201601 234 4 …
abhi
0 votes, 0 answers

Markov generation and probabilities

I have been working on various implementations of Markov chains for a while, and I just want to clarify a generalisation of the chains. Generation: If I want to generate a sequence of length n, we simply sample from the initial probabilities, then take…
urema
0 votes, 0 answers

Why does my Markov cluster algorithm (MCL) produce NaN as a result in Matlab?

I have tried the Markov cluster algorithm (MCL) in Matlab, but sadly I got a matrix of size 67*67 in which all elements are NaN. Could anyone tell me what's wrong? function adjacency_matrixmod2= mcl(adjacency_matrixmod2) % test the explanations…
F.caren
0 votes, 0 answers

Markov Models nltk

I have to run Markov order-0 and order-1 models on a corpus and calculate the two sentences with the highest probability. My code runs, but the result is always 0.0, and it works on tokens, not sentences. Here is the code: def FrasiAccettabili(frasi,…
Irene S
0 votes, 1 answer

Why use a hidden Markov model vs. a Markov model in the Baum-Welch algorithm

So I am trying to build the Baum-Welch algorithm to do part-of-speech tagging for practice. However, I am confused about using a hidden Markov model vs. a Markov model, since it seems that you are losing context moving from state to state. Since…
I. Cantrell
0 votes, 1 answer

Hidden Markov model for classification

I am using a hidden Markov model for classification problems. I need to examine the impact of the number of hidden states and the number of visible states on the performance of the classifier. Based on my results, increasing the number of hidden or visible…
Ali
0 votes, 2 answers

Reading into a hashmap in C++

I'm working on a Markov chain and have created a 2D hashmap that calculates the weighted probabilities. The output of this works fine. I'm looking for the best way to output the next value. The way I have it at the moment isn't working…