Inserting jagam code into runjags (JAGS) model

Submitted by 时光毁灭记忆、已成空白 on 2021-02-10 18:55:19

Question


I’ve been trying to incorporate smoothing into a runjags model that I’ve created to model seabird burrow numbers and distribution across an island. I’ve managed to generate some smoothing code by extracting the count data and the x and y coordinates from the model output and following the JAGAM tutorial on this page: http://www.petrkeil.com/?p=2385

I think I might be able to improve model performance by incorporating the smoothing into the JAGS model, but I’m at a loss as to how to do this. Can you offer me any pointers on how to achieve this?

I’ve attached a section of the runjags code and the JAGAM output below.

runjags code:

for(i in 1:K) {
  S1[i] ~ dpois(lambda1[i])
  SS1[i] ~ dpois(lambda1[i])
  lambda1[i] <- exp(a0 +
    a1*Tussac[i] +
    a2*normalise_DEM_aspect[i] +
    a3*normalise_DEM_slope[i] +
    a4*Tussac[i]*normalise_DEM_aspect[i] +
    a5*Tussac[i]*normalise_DEM_slope[i] +
    a6*normalise_sentinel1[i] +
    a7*normalise_sentinel3[i] +
    a8*normalise_sentinel4[i] +
    a9*normalise_sentinel5[i] +
    a10*normalise_sentinel8[i] +
    a11*normalise_sentinel10[i] +
    a12*S2[i])
}

JAGAM output:

readLines("jagam.bug")
"model {"                                                        
"  eta <- X %*% b ## linear predictor"                           
"  for (i in 1:n) { mu[i] <-  exp(eta[i]) } ## expected response"
"  for (i in 1:n) { y[i] ~ dpois(mu[i]) } ## response "          
"  ## Parametric effect priors CHECK tau=1/35^2 is appropriate!" 
"  for (i in 1:1) { b[i] ~ dnorm(0,0.00083) }"                   
"  ## prior for s(x,y)... "                                      
"  K1 <- S1[1:29,1:29] * lambda[1]  + S1[1:29,30:58] * lambda[2]"
"  b[2:30] ~ dmnorm(zero[2:30],K1) "                             
"  ## smoothing parameter priors CHECK..."                       
"  for (i in 1:2) {"                                             
"    lambda[i] ~ dgamma(.05,.005)"                               
"    rho[i] <- log(lambda[i])"                                   
"  }"                                                            
"}" 

Sample data:

S1 Logit_tussac soil_moisture DEM_slope DEM_aspect DEM_elevation sentinel1 sentinel2 sentinel3 sentinel4 sentinel5 sentinel6 sentinel7 sentinel8 sentinel9 sentinel10
NA          NA            NA 14.917334   256.1612      12.24432    0.0513    0.0588    0.0541    0.1145    0.1676    0.1988    0.1977    0.1658    0.1566     0.0770
0    -9.210240             1 23.803741   225.1231      16.88028    0.1058    0.1370    0.2139    0.2387    0.2654    0.2933    0.3235    0.2928    0.3093     0.1601
NA          NA            NA 20.789165   306.0945      18.52480    0.0287    0.0279    0.0271    0.0276    0.0290    0.0321    0.0346    0.0452    0.0475     0.0219
NA   -9.210240             1  6.689442   287.9641      36.08975    0.0462    0.0679    0.1274    0.1535    0.1797    0.2201    0.2982    0.2545    0.4170     0.2252
0    -9.210240             1 25.476444   203.0659      23.59964    0.0758    0.1041    0.1326    0.1571    0.2143    0.2486    0.2939    0.2536    0.3336     0.1937
1    -1.385919             3  1.672511   270.0000      39.55215    0.0466    0.0716    0.1227    0.1482    0.2215    0.2715    0.3334    0.2903    0.3577     0.1957

Answer 1:


This is a very good question, and it's a good idea to use the (extremely useful) output of jagam to add a GAM term to your model. What I would recommend in your case is to use jagam to generate only the GAM term(s) and nothing else (not even an intercept), then copy/paste the relevant sections of the jagam model output into your existing model code, and take the X data variable from jagam to use as your data. This is easiest to demonstrate with an example:

First simulate some data with a single linear term X1 and a single non-linear term X2 (in this case polynomial but that doesn't matter):

library('runjags')
library('mgcv')

set.seed(2018-09-06)

N <- 100
dataset <- data.frame(X1 = runif(N,-1,1), X2 = runif(N,-1,1))
dataset$ll <- with(dataset, 1 + 0.15*X1 + 0.25*X2 - 0.2*X2^2 + 0.15*X2^3 + rnorm(N,0,0.1))
dataset$Y <- rpois(N, exp(dataset$ll))

# Non-linear relationship with log lambda:
with(dataset, plot(X2, ll))

Then run jagam BUT make sure to exclude the intercept term by specifying 0 + in the right hand side:

# Get the JAGAM stuff excluding intercept:
jd <- jagam(Y ~ 0 + s(X2), data=dataset, file='jagam.txt',
    sp.prior="gamma",diagonalize=TRUE,family='poisson')

Alternatively you could leave the intercept term here and remove it from your model. This gives us a jagam.txt file that looks like:

model {
  eta <- X %*% b ## linear predictor
  for (i in 1:n) { mu[i] <-  exp(eta[i]) } ## expected response
  for (i in 1:n) { y[i] ~ dpois(mu[i]) } ## response 
  ## prior for s(X2)... 
  for (i in 1:8) { b[i] ~ dnorm(0, lambda[1]) }
  for (i in 9:9) { b[i] ~ dnorm(0, lambda[2]) }
  ## smoothing parameter priors CHECK...
  for (i in 1:2) {
    lambda[i] ~ dgamma(.05,.005)
    rho[i] <- log(lambda[i])
  }
}

You can remove the first and last lines, as well as the two lines starting with for (i in 1:n), since we will replicate those ourselves. Now copy the entire remaining contents of the file, and go to your (non-GAM) model with only linear predictors (and/or random effects, or whatever else), for example:

model <- 'model{

    for(i in 1:N){      
        log(mean[i]) <- intercept + coef*X1[i]
        Y[i] ~ dpois(mean[i])
    }

    # Our priors:
    intercept ~ dnorm(0, 10^-6)
    coef ~ dnorm(0, 10^-6)

    #data# N, X1, Y
    #monitor# intercept, coef
}'

Then paste the GAM bit you copied into the end, so you get:

model <- 'model{

    for(i in 1:N){      
        log(mean[i]) <- intercept + coef*X1[i] + eta[i]
        Y[i] ~ dpois(mean[i])
    }

    # Our priors:
    intercept ~ dnorm(0, 10^-6)
    coef ~ dnorm(0, 10^-6)

    #data# N, X1, Y, X
    #monitor# intercept, coef, b, rho

    ## JAGAM
    eta <- X %*% b ## linear predictor
    ## prior for s(X2)... 
    for (i in 1:8) { b[i] ~ dnorm(0, lambda[1]) }
    for (i in 9:9) { b[i] ~ dnorm(0, lambda[2]) }
    ## smoothing parameter priors CHECK...
    for (i in 1:2) {
      lambda[i] ~ dgamma(.05,.005)
      rho[i] <- log(lambda[i])
    }
    ## END JAGAM    
}'

Notice the addition of + eta[i] to the GLM line to account for the GAM term(s), as well as the addition of b and rho to the monitors. That should be all you need to do for the model (except to check the priors for the smoothing parameters etc., as suggested in the jagam output).

Then we need to extract the new X data variable for use with JAGS:

X <- jd$jags.data$X
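As a quick optional sanity check, X should have one row per observation and one column per b coefficient (nine here, matching b[1:9] in the pasted GAM code):

# Optional check: one row per observation, one column per b coefficient
dim(X)                                  # expect 100 rows and 9 columns in this example
stopifnot(nrow(X) == N, ncol(X) == 9)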

You could also extract the initial values for b and lambda if needed (a sketch follows the next code block). Finally, we can run the model using runjags:

results <- run.jags(model, n.chains=2, data=dataset)
results
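If you do want to use jagam's suggested starting values, a minimal sketch (assuming two chains, and that jd$jags.ini contains b and lambda, as it does for the call above) would be:

# Optional: reuse jagam's suggested starting values for b and lambda,
# replicated once per chain (two chains here)
inits <- list(jd$jags.ini, jd$jags.ini)
results <- run.jags(model, n.chains=2, data=dataset, inits=inits)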

Of course, this silly example gains nothing by putting the jagam code inside the simpler model - jagam could have created the entire model (including the intercept and linear predictor) for us. But the approach is valuable when adding a relatively small GAM component to a larger, pre-existing model that has been written to use some of the features in runjags.

If we want to use sim2jam so that the relevant diagnostic/helper functions from mgcv can be applied to the fitted model, it is currently necessary to call rjags directly to obtain more samples:

library('rjags')
sam <- jags.samples(as.jags(results), c('b','rho'), n.iter=10000)
jam <- sim2jam(sam,jd$pregam)
plot(jam)
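The resulting jam object is a reduced gam object, so (per mgcv's sim2jam documentation) it can also be used for prediction; newgrid below is just a hypothetical data frame of X2 values:

# Hypothetical prediction grid over the range of X2
newgrid <- data.frame(X2 = seq(-1, 1, length.out = 50))
pred <- predict(jam, newdata = newgrid)   # predictions on the linear predictor scale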

There are two things missing here:

1) The capability to use sim2jam without needing to do more samples in rjags. This needs some additions to the mcarray class within the rjags package, which I am currently working on.

2) The capability for template.jags() to do all of this automagically for you - this is on my list of things to implement in the future.

Hope that helps - I'd be interested to hear how you get on.

Matt



Source: https://stackoverflow.com/questions/51830657/inserting-jagam-code-into-runjags-jags-model
