Transpilation of pure WinBUGS code when reimplementing Prior and Posterior Prediction #2148
I think the issue here is that the three predictive variables are discrete, so NUTS cannot sample them properly. The model can instead be written as

using Turing, MCMCChains, StatsPlots

@model function priorPosteriorPredictive(n; k=missing)
    #----------------------------------------------------
    # prior on rate θ
    θ ~ Beta(1, 1)
    #----------------------------------------------------
    # likelihood of observed data
    k ~ Binomial(n, θ)
    #----------------------------------------------------
    # prior predictive
    # θPriorPred ~ Beta(1, 1)
    # kPriorPred ~ Binomial(n, θPriorPred)
    #----------------------------------------------------
    # posterior predictive
    # return kPostPred ~ Binomial(n, θ)
    #----------------------------------------------------
end # function priorPosteriorPredictive

modelPosteriorPredictive = let k = 1
    datum = k
    n = 15
    # priorPosteriorPredictive(n),        # prior predictive without datum
    priorPosteriorPredictive(n; k=datum)  # posterior predictive including datum
end # let

chainPosteriorPredictive =  # completely misleading
    let iterations = 3000
        nBurnIn = 1000
        δ = 0.65
        init_ϵ = 0.3
        sampler = NUTS(nBurnIn, δ; init_ϵ=init_ϵ)
        sample(modelPosteriorPredictive, sampler, iterations)
    end # let
plot(chainPosteriorPredictive)

(where I commented out the three predictive variables). Alternatively, importance-sampling based samplers will likely perform better here:

@model function priorPosteriorPredictive(n)
    #----------------------------------------------------
    # prior on rate θ
    θ ~ Beta(1, 1)
    #----------------------------------------------------
    # likelihood of observed data
    k ~ Binomial(n, θ)
    #----------------------------------------------------
    # prior predictive
    θPriorPred ~ Beta(1, 1)
    kPriorPred ~ Binomial(n, θPriorPred)
    #----------------------------------------------------
    # posterior predictive
    return kPostPred ~ Binomial(n, θ)
    #----------------------------------------------------
end # function priorPosteriorPredictive

modelPosteriorPredictive = let k = 1
    datum = k
    n = 15
    # priorPosteriorPredictive(n),              # prior predictive without datum
    priorPosteriorPredictive(n) | (; k=datum)   # posterior predictive including datum
end # let

chainPosteriorPredictive = sample(modelPosteriorPredictive, PG(10), 3000)
plot(chainPosteriorPredictive)
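A minimal sketch of how the prior-predictive draws could then be inspected from the chain above, assuming the PG run shown; `histogram` comes from the already-loaded StatsPlots, and the variable name `kPriorDraws` and the binning are just illustrative choices:

# Pull the sampled prior-predictive counts out of the chain as a plain vector.
kPriorDraws = vec(Array(chainPosteriorPredictive[:kPriorPred]))

# Distribution of the predicted number of successes out of n = 15 under the prior.
histogram(kPriorDraws; bins=-0.5:1:15.5, label="prior predictive k")

The posterior-predictive draws can be inspected the same way once `kPostPred` is recorded in the chain (for example by writing it as a plain `kPostPred ~ Binomial(n, θ)` statement rather than as part of the `return`).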
Hi sunxd3,
You can specify which samplers are in charge of which variable(s), like:

@model function priorPosteriorPredictive(n)
    θ ~ Beta(1, 1)
    k ~ Binomial(n, θ)
    θPriorPred ~ Beta(1, 1)
    kPriorPred ~ Binomial(n, θPriorPred)
    kPostPred ~ Binomial(n, θ)
    return θPriorPred, kPriorPred, kPostPred
end

# creating the model
model = priorPosteriorPredictive(15) | (; k=1)

# use HMC for `θ`, NUTS for `θPriorPred`, and PG for the rest
chn = sample(
    model,
    Gibbs(HMC(0.05, 10, :θ), NUTS(-1, 0.65, :θPriorPred), PG(100, :k, :kPriorPred, :kPostPred)),
    1000,
)

which gives:

[plot of the resulting chains]
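Since the model also returns the three predictive variables, the return values can be recovered for every sample with Turing's `generated_quantities`. A minimal sketch, assuming the `model` and `chn` objects from the snippet above (the names `quantities` and `kPostPredDraws` are just illustrative; here the predictive draws are also stored directly in `chn`, so this mainly shows the mechanism):

# Re-execute the model with the sampled values from `chn` plugged in and
# collect whatever the model returns (one tuple per sample and chain).
quantities = generated_quantities(model, chn)

# Extract, e.g., the posterior-predictive counts from the returned tuples.
kPostPredDraws = [q[3] for q in vec(quantities)]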
@CMoebus we also have a package within the Turing ecosystem that supports the BUGS language directly, https://github.com/TuringLang/JuliaBUGS.jl, but it is currently in development and not feature complete. We would appreciate it if you gave it a try and reported issues; there are definitely a lot, but we'll try to fix them ASAP.
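For a flavour of what that could look like for this example, here is a rough sketch using JuliaBUGS' `@bugs` macro and `compile` function. The exact syntax and call signatures may differ between JuliaBUGS versions, and the model body below is only my approximation of the ch. 3.4 WinBUGS model, so treat it as an illustration rather than a recipe:

using JuliaBUGS

# Rough sketch of the ch. 3.4 model in BUGS-style notation
# (dbeta/dbin are the BUGS distribution names).
model_def = @bugs begin
    # prior and likelihood
    theta ~ dbeta(1, 1)
    k ~ dbin(theta, n)
    # prior predictive
    thetaprior ~ dbeta(1, 1)
    priorpredk ~ dbin(thetaprior, n)
    # posterior predictive
    postpredk ~ dbin(theta, n)
end

data = (k = 1, n = 15)
# Older versions may require initial values as a third argument.
model = compile(model_def, data)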
@sunxd3: Thank you again. Thank you for inviting me to become a JuliaBUGS.jl tester. Just a few weeks ago I started transpiling WinBUGS scripts into pure Turing.jl. I liked the declarative, math-oriented style of BUGS, but at the same time it is tedious if you need some calculations outside the BUGS language. A few years ago I switched to WebPPL; I liked its functional style. But Turing.jl and its embedding in Julia seem more promising.
@CMoebus sorry for the late reply.
One of the goals of the JuliaBUGS project is to make this much easier and give users access to other Julia packages.
As a maintainer and user of Turing, thanks for the support!
Hi,
I am a newbie to Turing.jl, so I am trying to reimplement the WinBUGS scripts of Lee & Wagenmakers' book BAYESIAN COGNITIVE MODELING. At the moment I am stuck with the problem 'Prior and Posterior Prediction' in ch. 3.4. I want to stay close to the WinBUGS code, which has only 5 code lines, and I tried not to export any values out of the model macro. But my chain and the plots produce information that is completely misleading. I looked around and found two people, semihcanaktepe and quangtiencs, doing similar reimplementations, but they circumvented the pure WinBUGS equivalent and wrote more verbose code.
For documentation purposes I attached the code; it is quoted (with modifications) in the replies above.