How can I make a discrete state Markov model with pymc?

Submitted on 2019-12-03 02:10:28

As far as I know you have to encode the distribution of each time step as a deterministic function of the previous time step, because that's what it is: there's no randomness in the parameters, since you defined them in the problem set-up. However, I think your question may have been aimed more at finding a more intuitive way to represent the model. One alternative is to encode the time-step transitions directly as a function of the previous time step.

import numpy as np
from pymc import Bernoulli, MCMC

def generate_timesteps(N, p_init, p_trans):
    timesteps = np.empty(N, dtype=object)
    # A success denotes being in state 2, a failure being in state 1
    timesteps[0] = Bernoulli('T0', p_init)
    for i in range(1, N):
        # success probability (i.e. being in state 2) at step `i` given step `i-1`:
        # p_trans[1] if the previous step was a success, p_trans[0] otherwise
        p_i = p_trans[1]*timesteps[i-1] + p_trans[0]*(1 - timesteps[i-1])
        timesteps[i] = Bernoulli('T%d' % i, p_i)
    return timesteps

timesteps = generate_timesteps(10, 0.8, [0.001, 0.5])
model = MCMC(timesteps)
model.sample(10000)  # no burn-in necessary since we're sampling directly from the distribution
[np.mean(model.trace(t).gettrace()) for t in timesteps]
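As a sanity check on the sampled means, the same two-state chain can be forward-simulated directly with numpy (a sketch independent of pymc; `simulate_chain` is a hypothetical helper reusing the same parameters as `generate_timesteps` above):

```python
import numpy as np

def simulate_chain(N, p_init, p_trans, n_runs=100000, seed=0):
    """Forward-simulate the two-state chain; 1 = 'success' (state 2 above)."""
    rng = np.random.RandomState(seed)
    states = (rng.rand(n_runs) < p_init).astype(int)  # time step 0
    marginals = [states.mean()]
    for _ in range(1, N):
        # the success probability of each run depends on its current state
        p = np.where(states == 1, p_trans[1], p_trans[0])
        states = (rng.rand(n_runs) < p).astype(int)
        marginals.append(states.mean())
    return marginals

marginals = simulate_chain(10, 0.8, [0.001, 0.5])
# marginals[0] should be close to p_init = 0.8, and marginals[1] close to
# the exact value 0.5*0.8 + 0.001*0.2 = 0.4002
```

These empirical marginals should agree with the MCMC trace means above to within Monte Carlo error.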

In case you want to look at the long-run behaviour of your Markov chain, the discreteMarkovChain package may be useful. Its examples show how to build up a discrete-state Markov chain by defining a transition function that, for each state, gives the reachable next states and their probabilities. You could use that same function to simulate the process.
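Independent of that package, the long-run (stationary) distribution of a small chain can also be computed straight from its transition matrix with numpy. The matrix below is built from the same p_trans = [0.001, 0.5] used earlier (an assumption for illustration):

```python
import numpy as np

# Row-stochastic transition matrix for the two-state chain:
# row i is the distribution over next states, given current state i
# (state 0 = "failure"/state 1 above, state 1 = "success"/state 2 above).
p_trans = [0.001, 0.5]
P = np.array([[1 - p_trans[0], p_trans[0]],
              [1 - p_trans[1], p_trans[1]]])

# The stationary distribution pi satisfies pi = pi @ P;
# repeated multiplication (power iteration) converges to it.
pi = np.array([0.5, 0.5])
for _ in range(1000):
    pi = pi @ P
pi /= pi.sum()  # guard against accumulated rounding
```

With these numbers the chain almost never leaves state 1 once it falls back into it, so pi ends up heavily concentrated on state 0 (about 0.998 vs 0.002).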
