markov-chains

Decoding sequences in a GaussianHMM

Submitted by 随声附和 on 2021-02-06 15:18:58
Question: I'm playing around with Hidden Markov Models for a stock-market prediction problem. My data matrix contains various features for a particular security:

    01-01-2001,  .025,  .012, .01
    01-02-2001, -.005, -.023, .02

I fit a simple GaussianHMM:

    from hmmlearn.hmm import GaussianHMM

    mdl = GaussianHMM(n_components=3, covariance_type='diag', n_iter=1000)
    mdl.fit(train[:, 1:])

With the model (λ), I can decode an observation vector to find the most likely hidden-state sequence corresponding to the observations …
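In hmmlearn, decoding is `mdl.decode(X)` (Viterbi by default, returning the log probability and the state sequence) or `mdl.predict(X)` for just the states. As a sketch of what that decoding computes, here is a minimal log-space Viterbi in NumPy; the toy probabilities below are made up for illustration, not taken from the question:

```python
import numpy as np

def viterbi(log_start, log_trans, log_emit):
    """Most likely state path. log_start: (K,) initial log probs,
    log_trans: (K, K) log transition matrix, log_emit: (T, K)
    per-step log emission likelihoods of the observations."""
    T, K = log_emit.shape
    delta = log_start + log_emit[0]          # best log prob ending in each state
    back = np.zeros((T, K), dtype=int)       # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_trans  # scores[i, j]: come from i, go to j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emit[t]
    path = np.empty(T, dtype=int)
    path[-1] = delta.argmax()
    for t in range(T - 1, 0, -1):            # trace the best path backwards
        path[t - 1] = back[t, path[t]]
    return delta.max(), path

log_prob, states = viterbi(
    np.log([0.5, 0.5]),                      # uniform start
    np.log(np.full((2, 2), 0.5)),            # uniform transitions
    np.log([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9]]),  # toy emissions
)
```

With a fitted GaussianHMM the equivalent call is `log_prob, states = mdl.decode(test[:, 1:])`, which uses the learned start, transition, and Gaussian emission parameters.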

Obtaining the stationary distribution for a Markov Chain using eigenvectors from large matrix in MATLAB

Submitted by ぐ巨炮叔叔 on 2021-01-28 04:04:45
Question: I am trying to find the stationary distribution of a Markov chain. I have a transition probability matrix (TPM). Here is the code:

    [V, D] = eigs(double(TPM'), 1);
    Py = abs(V) / sum(V);

My problem is that sum(V) < 0, so some values in the vector Py come out negative. I've tested the code with another probability matrix and there sum(V) > 0. I don't know where the problem is: in the TPM or in the code I am using? EDIT: Here is a more "elaborated" version of the code (including the answer of …
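The negative entries come from normalizing abs(V) by a *signed* sum: an eigenvector is only defined up to sign, and when the solver happens to return the all-negative copy, abs(V)/sum(V) flips every entry negative. Dividing the raw eigenvector by its own signed sum (in MATLAB, `Py = V / sum(V)`) makes the sign cancel. A minimal NumPy sketch, assuming the TPM is row-stochastic so the stationary distribution is the eigenvector of its transpose for eigenvalue 1:

```python
import numpy as np

def stationary(P):
    """Stationary distribution pi with pi @ P = pi, taken as the
    eigenvector of P.T whose eigenvalue is closest to 1."""
    vals, vecs = np.linalg.eig(P.T)
    v = vecs[:, np.argmin(np.abs(vals - 1.0))].real
    # Normalize by the *signed* sum: if the solver returned -v,
    # the signs cancel.  (abs(v) / v.sum() breaks when v.sum() < 0.)
    return v / v.sum()

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = stationary(P)
```

For this toy chain the result is [5/6, 1/6], and it stays correct whichever sign the eigensolver picks.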

Directed probability graph - algorithm to reduce cycles?

Submitted by 早过忘川 on 2020-08-22 05:00:19
Question: Consider a directed graph that is traversed from a first node 1 to some final nodes (nodes with no outgoing edges). Each edge in the graph has a probability associated with it, and summing the probabilities over all possible paths to all possible final nodes gives 1. (That is, we are guaranteed to eventually arrive at one of the final nodes.) The problem would be simple if the graph had no loops. Unfortunately, rather convoluted loops can arise in the graph, which …
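Cycles don't have to be reduced explicitly: the standard absorbing-Markov-chain result sums over every path, loops included, in closed form. Order the states as transient then final, split the transition matrix into Q (transient→transient) and R (transient→final); the matrix of absorption probabilities is B = (I − Q)⁻¹R. A sketch on a hypothetical 4-node graph (the numbers are made up):

```python
import numpy as np

# Hypothetical graph: nodes 0 and 1 are transient (with a loop
# between them), nodes 2 and 3 are final (absorbing).
Q = np.array([[0.0, 0.5],       # transient -> transient
              [0.3, 0.0]])
R = np.array([[0.5, 0.0],       # transient -> absorbing
              [0.2, 0.5]])      # (each full row of Q|R sums to 1)

# B[i, j]: probability that a walk starting at transient node i
# is eventually absorbed at final node j.  Solving the linear
# system is cheaper and more stable than forming the inverse.
B = np.linalg.solve(np.eye(2) - Q, R)
```

The rows of B sum to 1, confirming that the walk reaches some final node with probability 1 even though the 0↔1 loop can be taken arbitrarily many times.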

Hidden test cases not passing for Google Foobar Challenge Doomsday Fuel [closed]

Submitted by 大兔子大兔子 on 2020-08-08 06:06:22
Question: I'm working my way through the Google Foobar challenge and am now at the level-3 challenge Doomsday Fuel. The instructions are as follows: Doomsday Fuel. Making fuel for the LAMBCHOP's reactor core is a tricky process because of the exotic matter involved. It starts as raw ore, …
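A common approach to Doomsday Fuel treats the counts matrix as an absorbing Markov chain and solves (I − Q)B = R for the absorption probabilities. Because the challenge expects exact numerators over a common denominator, floating point is risky; Fraction arithmetic with a small Gauss-Jordan elimination stays exact. A sketch under that reading of the task (function name and structure are mine, and the final numerator/denominator formatting step is omitted):

```python
from fractions import Fraction

def absorption_probs(m):
    """Absorption probabilities from state 0 for an absorbing Markov
    chain given as a matrix of nonnegative transition *counts*;
    all-zero rows are terminal (absorbing) states."""
    n = len(m)
    absorbing = [i for i in range(n) if sum(m[i]) == 0]
    transient = [i for i in range(n) if sum(m[i]) != 0]
    t = len(transient)
    # Row-normalize counts into exact-fraction blocks Q (transient ->
    # transient) and R (transient -> absorbing).
    Q = [[Fraction(m[i][j], sum(m[i])) for j in transient] for i in transient]
    R = [[Fraction(m[i][j], sum(m[i])) for j in absorbing] for i in transient]
    # Augmented system (I - Q | R), solved by Gauss-Jordan elimination.
    A = [[(1 if r == c else 0) - Q[r][c] for c in range(t)] + R[r]
         for r in range(t)]
    for col in range(t):
        piv = next(r for r in range(col, t) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        pv = A[col][col]
        A[col] = [x / pv for x in A[col]]
        for r in range(t):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return A[0][t:]
```

On the widely circulated sample input `[[0,2,1,0,0],[0,0,0,3,4],[0,0,0,0,0],[0,0,0,0,0],[0,0,0,0,0]]` this yields 1/3, 2/7, 8/21, matching the expected [7, 6, 8, 21] once put over the common denominator 21.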

Fit and evaluate a second order transition matrix (Markov Process) in R?

Submitted by 喜夏-厌秋 on 2020-03-26 08:33:47
Question: I am trying to build a second-order Markov chain model, and I am now trying to find the transition matrix for the following data:

    dat <- data.frame(replicate(20, sample(c("A", "B", "C", "D"), size = 100, replace = TRUE)))

I know how to fit a first-order Markov transition matrix with the function markovchainFit(dat) from the markovchain package. Is there a way to fit a second-order transition matrix? And how do I evaluate the Markov chain models, i.e. should I choose the first-order model or the second-order …
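One portable way to fit a second-order chain is to lift it to a first-order chain on *pairs* of states: count overlapping (s[t-2], s[t-1]) → s[t] triples and row-normalize. For model choice, compare the two models' log-likelihoods penalized by parameter count (AIC/BIC) or their held-out predictive accuracy, since the second-order model has many more free parameters. A sketch of the counting step in Python on a toy sequence (names are mine):

```python
from collections import Counter, defaultdict

def second_order_matrix(seq):
    """P(next | (prev2, prev1)) estimated from one sequence by
    counting overlapping triples and row-normalizing."""
    counts = defaultdict(Counter)
    for a, b, c in zip(seq, seq[1:], seq[2:]):   # sliding triples
        counts[(a, b)][c] += 1
    return {pair: {s: k / sum(ctr.values()) for s, k in ctr.items()}
            for pair, ctr in counts.items()}

tm = second_order_matrix("AABACABAAB")
```

Here the compound state ("A", "A") is always followed by "B", while ("B", "A") splits evenly between "A" and "C"; a first-order fit would blur these two contexts together.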

Creating three-state Markov chain plot

Submitted by 。_饼干妹妹 on 2020-01-24 09:46:06
Question: I have the following dataframe with three states: angry, calm, and tired. The dataframe below provides individual cases of transition from one state to another:

    pre  <- cbind(c(rep("tired", 100), rep("angry", 100), rep("calm", 100)))
    post <- cbind(c(rep("tired", 50), rep("angry", 70), rep("calm", 100), rep("tired", 80)))
    df   <- cbind(pre, post)
    df   <- as.data.frame(df)
    colnames(df) <- c("pre", "post")

What I would like to achieve is building a Markov chain plot for the three states, of the kind that is also called a "playground" and looks …
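The plot needs a transition matrix first: count the (pre, post) pairs and row-normalize. In R that matrix can then be handed to the markovchain package, whose plot method for markovchain objects draws exactly this kind of state diagram. The counting step, sketched in Python with the same data as the R snippet (the drawing itself is left to R):

```python
from collections import Counter

# Same data as the R snippet: 300 pre -> post transitions,
# exactly 100 starting in each state.
pre  = ["tired"] * 100 + ["angry"] * 100 + ["calm"] * 100
post = ["tired"] * 50 + ["angry"] * 70 + ["calm"] * 100 + ["tired"] * 80

counts = Counter(zip(pre, post))
states = ["angry", "calm", "tired"]

# Row-normalized transition matrix; each pre state occurs 100 times,
# so dividing by 100 is the row normalization here.
P = {s: {t: counts[(s, t)] / 100 for t in states} for s in states}
```

With this data, "tired" splits evenly between staying tired and becoming angry, while "angry" and "calm" each move on with probability 0.8; those edge weights are what the playground-style plot would display.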