Questions tagged [markov-chains]

Markov chains are systems which transition from one state to another based only upon their current state. They are used widely in various statistical domains to generate sequences based upon probabilities.

Markov chains (named after their creator, Andrey Markov) are systems which transition from one state to another based only upon their current state. They are memoryless, semi-random processes: each state change has an associated probability.

Due to their statistical nature, Markov chains are suitable for simulating complex real-life processes where the probabilities are well known. They are used in a wide variety of fields, with uses too in-depth to list here; an exhaustive list can be found on the associated Wikipedia page.

In programming, they are especially popular for manipulating human languages: Markov text generators are a particularly common application of Markov chains.
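For a concrete feel, here is a minimal base-R sketch of simulating such a chain; the states, transition probabilities, and chain length are invented purely for illustration.

# Minimal sketch of simulating a two-state Markov chain in base R.
# States and probabilities are made up for the example.
set.seed(42)

states <- c("Sunny", "Rainy")
P <- matrix(c(0.8, 0.2,    # P(Sunny -> Sunny), P(Sunny -> Rainy)
              0.4, 0.6),   # P(Rainy -> Sunny), P(Rainy -> Rainy)
            nrow = 2, byrow = TRUE,
            dimnames = list(states, states))

n_steps <- 20
chain <- character(n_steps)
chain[1] <- "Sunny"                       # arbitrary starting state
for (t in 2:n_steps) {
  # The next state depends only on the current one (memorylessness).
  chain[t] <- sample(states, size = 1, prob = P[chain[t - 1], ])
}
print(chain)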

577 questions
0
votes
1 answer

Making a matrix that counts the number of companies that were in rating i in 1996 and moved to rating j in 1997 and rating k in 1998

I'm trying to make a matrix using a function that counts the number of "ID"s that were in "RATING" i in "YEAR" 1996, then moved to "RATING" j in "YEAR" 1997, and then moved to "RATING" k in "YEAR" 1998. I believe the row labels of the matrix would…
gm007
  • 547
  • 4
  • 11
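For the question above, a hedged base-R sketch of one way to cross-tabulate ratings across the three years; the column names (ID, YEAR, RATING) follow the question, but the toy data and the reshaping step are assumptions.

# Toy data in the long format described in the question.
df <- data.frame(
  ID     = rep(1:3, each = 3),
  YEAR   = rep(c(1996, 1997, 1998), times = 3),
  RATING = c("A", "B", "B",  "A", "A", "C",  "B", "B", "B")
)

# Reshape to one row per ID with one RATING column per year.
wide <- reshape(df, idvar = "ID", timevar = "YEAR", direction = "wide")

# Three-dimensional count table: rating in 1996 x 1997 x 1998.
counts <- table(r1996 = wide$RATING.1996,
                r1997 = wide$RATING.1997,
                r1998 = wide$RATING.1998)
counts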
0
votes
1 answer

Markov chain simulation using R

I want to plot a graph of the probability distributions of the states at different times t. I have the transition matrix P1 <- matrix(c(0, 1, 0, 0, 0, 0, 2/3, 1/3, 0, 1, 0, 0, 0, 0, 0, 1), 4, 4, byrow=TRUE). To plot a graph from p1 to p50, I use the…
Berry
  • 47
  • 5
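For the question above, a minimal base-R sketch of propagating the state distribution from t = 1 to t = 50 with the P1 from the excerpt; the initial distribution (all mass on state 1) is an assumption.

P1 <- matrix(c(0, 1, 0,   0,
               0, 0, 2/3, 1/3,
               0, 1, 0,   0,
               0, 0, 0,   1), 4, 4, byrow = TRUE)

p <- c(1, 0, 0, 0)              # assumed initial distribution: start in state 1
dist <- matrix(NA, nrow = 50, ncol = 4)
dist[1, ] <- p
for (t in 2:50) {
  p <- p %*% P1                 # one-step update of the row vector
  dist[t, ] <- p
}

# One probability curve per state over time.
matplot(dist, type = "l", lty = 1, xlab = "t", ylab = "P(state)")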
0
votes
1 answer

Markov Clustering in Python

As the title says, I'm trying to get a Markov Clustering Algorithm to work in Python, namely Python 3.7. Unfortunately, it's not doing much of anything, and it's driving me up the wall trying to fix it. EDIT: First, I've made the adjustments to the…
Leafsw0rd
  • 157
  • 8
0
votes
1 answer

Exporting the Transition Matrix from a markovchain object

I had a sequence of states organized as a data frame that looks like this:
Year1 Year2 Year3 ...
1     2     5     ...
3     9     4     ...
I used markovchain's markovchainListFit function because I'd like to construct the transition matrix for…
Joseph
  • 65
  • 1
  • 8
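For the question above, a hedged sketch of pulling a transition matrix out of a fitted chain with the markovchain package; only the single-chain markovchainFit case is shown (markovchainListFit returns one chain per year), the toy sequence is made up, and the @transitionMatrix slot access reflects how the package stores the estimate.

library(markovchain)

# Toy sequence of states; the real data come from the question's data frame.
seq_states <- c("1", "2", "5", "3", "9", "4", "1", "2")

fit <- markovchainFit(data = seq_states)

# The fitted chain is an S4 "markovchain" object; its transition matrix
# can be extracted as a plain matrix and written out.
tm <- fit$estimate@transitionMatrix
write.csv(tm, "transition_matrix.csv")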
0
votes
1 answer

Confusion in understanding Q(s,a) formula for Reinforcement Learning MDP?

I was trying to understand the proof of why the policy improvement theorem can be applied to an epsilon-greedy policy. The proof starts with the mathematical definition - I am confused by the very first line of the proof. This equation is the Bellman…
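Since the excerpt cuts off at "the Bellman…", here for context is the usual first step of that proof in Sutton and Barto's notation; the question's own equation is not shown, so this is only the standard form, not necessarily the exact line being asked about:

q_\pi(s, \pi'(s)) = \sum_{a} \pi'(a \mid s)\, q_\pi(s, a)
                  = \frac{\varepsilon}{|\mathcal{A}(s)|} \sum_{a} q_\pi(s, a) + (1 - \varepsilon)\, \max_{a} q_\pi(s, a)

i.e. the expected action value under the new epsilon-greedy policy \pi' is just the probability-weighted sum of q_\pi over actions, split into the exploratory uniform part and the greedy part.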
0
votes
1 answer

Extract coefficients after msmFit

I'm trying to extract the parameter estimates (including the matrix of transition probabilities) after running a msmFit model. However, I get the following error message: summary(msmIre <- msmFit(modIre, 2, sw=rep(TRUE,2))) coef(msmIre) Error: $…
0
votes
0 answers

Using lists as VALUE in a dictionary

UPDATE: I figured out my issue: gram = notesNumbers[i:(i+1)]. When I print gram, it prints as an array with only 1 value; for example, it would print [55]. Later in my code I'm referencing gram, however it's using the whole array instead of the one…
Adilp
  • 429
  • 1
  • 7
  • 21
0
votes
1 answer

What do 'random jumps' in Google's PageRank really mean?

I read somewhere that the added S matrix of 1/n elements, together with the fudge factor 0.15 which Google uses, is just not accurate and is only there to solve another problem. On the other hand, I have read somewhere else that it does have a meaning. …
bilanush
  • 139
  • 8
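For the question above, a small base-R sketch of what the 1/n "random jump" term does: it mixes the link-following matrix with a uniform jump so the chain can reach every page, which is what makes power iteration converge to a unique ranking. The tiny link graph here is made up.

n <- 4
# Column-stochastic link matrix of a made-up 4-page web graph:
# column j holds the probabilities of following a link out of page j.
A <- matrix(c(0,   1/2, 1/2, 0,
              1/3, 0,   0,   1,
              1/3, 1/2, 0,   0,
              1/3, 0,   1/2, 0), n, n, byrow = TRUE)

d <- 0.85                       # damping factor (1 - d = 0.15 is the "fudge factor")
S <- matrix(1 / n, n, n)        # uniform "random jump" matrix
G <- d * A + (1 - d) * S        # follow a link w.p. d, jump anywhere w.p. 1 - d

# Power iteration: the PageRank vector is the stationary distribution of G.
r <- rep(1 / n, n)
for (i in 1:100) r <- as.vector(G %*% r)
round(r, 3)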
0
votes
0 answers

Algorithm For Determining Sentence Subject Similarity

I'm looking to come up with an algorithm that can determine the similarity of a series of sentences. Specifically, given a starter sentence, I want to determine whether the following sentence is a suitable addition. For example, take the following: My dog…
Bryant Makes Programs
  • 1,493
  • 2
  • 17
  • 39
0
votes
0 answers

How can I set the state names after using markovchainFit for plotting?

Here is my contrived dataset/logic to plot a markovchain: sequences <- list( c("Opened", "2709342", "END"), c("Opened", "3067630", "END") ) sequencesMarkovchainFit <- function(sequences){ # Create a continuous sequence string…
Susol
  • 1
  • 2
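One way to get readable state names in the plot, sketched under the assumption that it is acceptable to relabel the raw codes before fitting; the lookup values ("Clicked", "Bounced") are invented for illustration.

library(markovchain)

sequences <- list(
  c("Opened", "2709342", "END"),
  c("Opened", "3067630", "END")
)

# Hypothetical lookup from raw codes to display names.
labels <- c("2709342" = "Clicked", "3067630" = "Bounced")

relabel <- function(x) ifelse(x %in% names(labels), labels[x], x)
sequences_named <- lapply(sequences, relabel)

# Concatenating the sequences mirrors the question's "continuous sequence
# string" approach; note it adds an artificial END -> Opened transition
# between sequences.
fit <- markovchainFit(data = unlist(sequences_named))
plot(fit$estimate)   # the fitted chain now carries the readable state names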
0
votes
0 answers

Transition probabilities in Markov chain do not sum to 1

I'm having a bit of an issue with transitions within a Markov chain when the conditional probabilities that describe the transitions have more than one significant digit. For example, if the conditional probabilities are: eps_a <- 0.3 # Pr(a…
mfidino
  • 3,030
  • 1
  • 9
  • 13
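For the question above, the usual culprit is floating-point rounding rather than the model itself; a base-R sketch of checking row sums with a tolerance and renormalizing if a downstream function insists on exact sums (the example matrix and parameter use are made up).

# Probabilities built from parameters can pick up tiny floating-point error,
# so an exact test like rowSums(P) == 1 may report FALSE even though the
# rows are correct to within machine precision.
eps_a <- 0.3   # parameter as in the question's excerpt
P <- matrix(c(eps_a, 1 - eps_a, 0,
              0.25,  0.25,      0.5,
              1/3,   1/3,       1/3), 3, 3, byrow = TRUE)

rowSums(P)                                # may print 1 yet differ in the last bits
isTRUE(all.equal(rowSums(P), rep(1, 3)))  # tolerance-based check

# If a sampler demands exact row sums, renormalizing is a common fix:
P <- P / rowSums(P)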
0
votes
0 answers

Log likelihood of a new sequence based on a transition probability in a Markov chain

Suppose I have a transition matrix like this:
    A   B   C
A 0.2 0.3 0.5
B 0.8 0.1 0.1
C 0.3 0.3 0.4
I can get the log likelihood of this by the following formula: $L_{mc}=\sum_{i=1}^M\sum_{j=1}^M N_{ij}\,\ln(T_{ij})$ If I have a…
Lzz0
  • 423
  • 1
  • 4
  • 13
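A base-R sketch of evaluating that formula for the matrix in the excerpt; the count matrix N (transitions observed in the new sequence) is invented for illustration.

# Transition matrix T from the question.
T_mat <- matrix(c(0.2, 0.3, 0.5,
                  0.8, 0.1, 0.1,
                  0.3, 0.3, 0.4),
                3, 3, byrow = TRUE,
                dimnames = list(c("A", "B", "C"), c("A", "B", "C")))

# Hypothetical transition counts N_ij observed in the new sequence.
N <- matrix(c(2, 1, 3,
              4, 0, 1,
              1, 2, 2), 3, 3, byrow = TRUE)

# L_mc = sum_i sum_j N_ij * ln(T_ij); guard the N_ij = 0 cells so a zero
# probability there does not produce 0 * log(0) = NaN.
logT <- ifelse(N > 0, log(T_mat), 0)
loglik <- sum(N * logT)
loglik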
0
votes
1 answer

Text generation using Markov chains in R

I'm trying to apply the Markov chain algorithm for simple text generation. I found code on the internet and made changes to fit my data as follows: library(markovchain) library(tidyverse) library(tidytext) library(stringr) #use readLines to read…
hareen tej
  • 89
  • 1
  • 3
  • 9
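For the question above, a hedged sketch of the core idea: fit word-to-word transition probabilities with markovchainFit and then sample a chain of words from the fitted matrix. The toy text is made up; in the question the words would come from readLines on a real file.

library(markovchain)

# Toy corpus, chosen so every word has at least one observed successor.
text  <- "the cat sat on the mat the rat sat on the cat the"
words <- unlist(strsplit(text, " "))

fit <- markovchainFit(data = words)$estimate
tm  <- fit@transitionMatrix          # row i = P(next word | current word i)

# Generate 10 more words by repeatedly sampling from the current word's row.
set.seed(1)
out <- "the"
for (i in 1:10) {
  current <- out[length(out)]
  out <- c(out, sample(colnames(tm), 1, prob = tm[current, ]))
}
paste(out, collapse = " ")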
0
votes
0 answers

Transition Matrix for Markov chain

The data frame looks like this:
ID path  conversion
1  A      1
2  B      0
3  A      0
4  B > C  0
5  A      0
6  A > A  0
I have created the Markov model using the function: ma <-…
deez
  • 50
  • 8
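For the question above, a base-R sketch of building the transition matrix by hand from the path column; the data frame mirrors the excerpt, and single-state paths simply contribute no transitions.

df <- data.frame(
  ID = 1:6,
  path = c("A", "B", "A", "B > C", "A", "A > A"),
  conversion = c(1, 0, 0, 0, 0, 0),
  stringsAsFactors = FALSE
)

# Split each path on " > " and collect consecutive (from, to) pairs.
steps <- strsplit(df$path, " > ")
pairs <- do.call(rbind, lapply(steps, function(s) {
  if (length(s) < 2) return(NULL)            # single-state path: no transition
  cbind(from = s[-length(s)], to = s[-1])
}))

counts <- table(from = pairs[, "from"], to = pairs[, "to"])
trans  <- prop.table(counts, margin = 1)     # row-normalize to probabilities
trans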
0
votes
2 answers

Error during moveHMM for location data

I am using the moveHMM package (https://cran.r-project.org/web/packages/moveHMM/vignettes/moveHMM-guide.pdf) for HMM analysis but I am getting the error below when I plot: Error in if (max(stepDensities[[state]][, 2]) > maxdens) maxdens <-…
Saara
  • 105
  • 12