Questions tagged [markov]

The Markov property refers to the memoryless property of a stochastic process.

Overview

From Wikipedia,

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.

A Markov process (Y_t) can be expressed as follows:

P(Y_{t+1} = y | Y_t = y_t, Y_{t-1} = y_{t-1}, ..., Y_0 = y_0) = P(Y_{t+1} = y | Y_t = y_t)
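The memoryless property can be illustrated with a small simulation; the two-state transition matrix below is a minimal illustrative sketch, not taken from any question on this page:

```python
import numpy as np

# Illustrative two-state Markov chain: P[i, j] = P(Y_{t+1} = j | Y_t = i).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

rng = np.random.default_rng(0)

def simulate(P, steps, state=0):
    """Sample a path: each next state depends only on the current state."""
    path = [state]
    for _ in range(steps):
        state = rng.choice(len(P), p=P[state])
        path.append(state)
    return path

# The stationary distribution pi satisfies pi = pi P; for an ergodic chain,
# the rows of P^n converge to pi, so iterating the matrix recovers it.
pi = np.linalg.matrix_power(P, 50)[0]
print(pi)  # approximately [0.8333, 0.1667], i.e. [5/6, 1/6]
```

For this matrix the stationary distribution can be checked by hand: pi = pi P gives 0.1 pi_0 = 0.5 pi_1, hence pi = [5/6, 1/6].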

Tag usage

Please consider asking questions about statistics or data analysis on the Cross Validated Stack Exchange site.

255 questions
1
vote
0 answers

Markov Chain does not converge

I am trying to model urban car mobility using Markov chains. I am trying to figure out why my Markov chain model will not converge to a steady-state distribution, assuming that it satisfies the assumptions of aperiodicity, ergodicity, etc. I tried…
Fillip
  • 31
  • 2
1
vote
1 answer

list[-1] yields "list index out of range" error in python program

I'm trying to build a Markov generator that takes an arbitrary length for the chain of words as a programming exercise, but I've found a bug I just can't quite seem to fix. When I run the markov function, I get "list index out of range". I get the…
TVarmy
  • 75
  • 1
  • 5
1
vote
0 answers

Rcpp and the %*% operator, revisited

I'm trying to decide if it makes sense to implement R's %*% operator in Rcpp if my dataset is huge. But I am really having trouble getting an Rcpp implementation to work. Here is my example R code: # remove everything in the global environment rm(list =…
SmittyBoy
  • 289
  • 3
  • 12
1
vote
1 answer

Log probability in the Viterbi algorithm (handling zero probabilities)

I am coding a probabilistic part of speech tagger in Python using the Viterbi algorithm. In this context, the Viterbi probability at time t is the product of the Viterbi path probability from the previous time step t-1, the transition probability…
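The excerpt above describes the standard Viterbi recurrence; a common way to handle the zero probabilities the title mentions is to work in log space, mapping zero to -inf so impossible paths drop out of the max. A minimal sketch (the toy two-tag model and all names are illustrative, not from the question):

```python
import math

# Illustrative toy model: 2 hidden tags, 2 observation symbols.
start = [0.6, 0.4]                     # start[i] = P(tag i at t = 0)
trans = [[0.7, 0.3], [0.4, 0.6]]       # trans[i][j] = P(tag j | tag i)
emit  = [[0.9, 0.1], [0.0, 1.0]]       # emit[i][o] = P(obs o | tag i); note the zero

def log(p):
    """log(0) becomes -inf, so zero-probability paths lose every max()."""
    return math.log(p) if p > 0 else float("-inf")

def viterbi(obs):
    # v[j] = log-probability of the best path ending in tag j;
    # products of probabilities become sums of logs, avoiding underflow.
    v = [log(start[j]) + log(emit[j][obs[0]]) for j in range(2)]
    for o in obs[1:]:
        v = [max(v[i] + log(trans[i][j]) for i in range(2)) + log(emit[j][o])
             for j in range(2)]
    return v

print(viterbi([0, 1, 1]))
```

Because emit[1][0] is exactly zero, any path that puts tag 1 under observation 0 ends up at -inf and can never be selected, which is the behavior the log transform is meant to preserve.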
1
vote
0 answers

Counting the frequency of n-grams in a sample text file

So, I'm implementing a Markov random text generator in Java, and I've gotten as far as plucking out the n-grams in the text file, but now I'm struggling to write a class that gives the number of occurrences of the n-grams in the text (and eventually…
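Counting n-gram occurrences, as the excerpt describes, reduces to sliding a window of length n over the token list and tallying the windows. The question uses Java; this is just an illustrative Python sketch of the same idea:

```python
from collections import Counter

def ngram_counts(words, n):
    """Count occurrences of each n-gram, keyed by a tuple of n words."""
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

text = "the cat sat on the cat".split()
print(ngram_counts(text, 2))
# Counter({('the', 'cat'): 2, ('cat', 'sat'): 1, ('sat', 'on'): 1, ('on', 'the'): 1})
```

For a Markov text generator, the same pass usually also records which word follows each n-gram, so the counts double as the transition table.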
1
vote
1 answer

Getting node membership in each cluster

I am using Markov clustering to cluster a graph of 878 nodes. The implementation is based on the work mentioned here https://github.com/guyallard/markov_clustering adj_matrix = nx.to_numpy_matrix(G) res = mcl.run_mcl(adj_matrix) clusters =…
Taie
  • 1,021
  • 16
  • 29
1
vote
1 answer

How to write a recursive call in a Markov chain without a runtime error

So here is the method I call from main. The first part just multiplies the matrices together. It's the return statement, at line 121, that gets the errors. I emphasized the lines that get the errors. I want to multiply the matrix that was created with the…
1
vote
1 answer

Text file as input in C++ program will not work unless the text is copy and pasted

I have a very strange bug in my code that is a little hard to explain. Let me begin with what the program does: basically, the C++ program takes input text (from a file named "input.txt" in the same directory) and uses Markov Chains to generate some…
LukeZ1986
  • 25
  • 5
1
vote
0 answers

Anomaly detection using Markov chains

I'm trying to detect anomalies using Markov Chains. I have a training dataset with a sequence of events that I used to create a probability transition matrix. Then, I create another matrix using a test dataset. I'm looking for a way to compare these…
1
vote
1 answer

How To Calculate the values for an HMM?

I need help understanding and solving two questions about an HMM. Please see these matrices, where the hidden states are H = happy and S = sad. Pi is the initial probability table, P(x[t] | x[t-1]) is the transition table, and p(y[t] | x[t]) is the…
1
vote
1 answer

Is it possible to use the Markov Blanket to determine whether two nodes are conditionally independent?

A target node is independent of all other nodes in a Bayesian network given its Markov Blanket. I am confused how this can be applied. Can I for example target any node in the graph to determine its independence from another node? Consider this…
1
vote
0 answers

Higher-order or semi-Markov process

I would like to build a Markov chain with which I can simulate the daily routine of people (activity patterns). Each simulation day is divided into 144 time steps, and the person can carry out one of fourteen activities. I have already built the…
Rfanatic
  • 2,224
  • 1
  • 5
  • 21
1
vote
0 answers

How to fit my continuous data with the hmmlearn GMMHMM library in Python?

I am trying to implement HMM learning/training with a continuous data set which is in sequential form. I have tried to construct a new HMM training with a Gaussian mixture/EM algorithm, but I have been facing some issues, so I switched to the hmmlearn library…
Mari
  • 698
  • 1
  • 8
  • 27
1
vote
0 answers

Markov decision process: clarification of which states have the Markov property

I seem to consistently encounter counter-examples in different texts as to what states constitute having the Markov property. It seems some presentations assume an MDP to be one in which the current state/observation relays absolutely all necessary…
user4779
  • 645
  • 5
  • 14
1
vote
0 answers

Cannot fit Markov model in R

I am working on a Markov process for my assignment using R. I already designed my matrix, but the problem is I cannot put my matrix into a markovchain object. The error says: Error in validObject(.Object) : invalid class “markovchain” object: 1: Error!…