Sampler remembering values from previous runs #2135

Closed
athulsudheesh opened this issue Nov 25, 2023 · 1 comment

Comments

@athulsudheesh

Below I am trying to build a Bayesian network model. As part of model checking, I compared the prior distribution to the posterior distribution. For my model the posterior is the same as the prior, with no change. What is weird is that if I change the prior to a new value (say, the mean of the Normal distribution for parameter m from 5 to 0.5), in the new run the posterior mean for parameter m is still centered around 5 (which was the prior and posterior mean from the previous run). I am attaching a plot that illustrates the problem.

using Pkg
Pkg.activate(".")
Pkg.add(["Turing", "LazyArrays", "Plots", "StatsPlots", "DataFrames"])
using Turing, LazyArrays, Plots, StatsPlots, DataFrames

# Generating Fake Data 
nstudents = 40 
nitems = 5
m = [1.2, 0.5]
b = [-0.5, 0.25, 0.125, 1.45, 1.00]
eff_theta = Matrix{Float64}(undef, nstudents, nitems)  # typed array instead of Matrix{Any}
lambda_OA = [0.8, 0.2]
AF = rand(Categorical([0.6, 0.4]), nstudents)
OA = rand(Categorical(lambda_OA), nstudents)
for i in 1:nstudents 
    for j in 1:nitems
        eff_theta[i,j] = min(m[1]*OA[i], m[2]*AF[i]) - b[j]
    end
end

X = rand.(BernoulliLogit.(eff_theta))

# Model Definition 
@model function bn(X,AF)
    nstudents, n_items = size(X)
    θ = Matrix{Real}(undef, nstudents, n_items)  # eltype Real so parameter-dependent values can be stored

    m ~ filldist(truncated(Normal(5,0.1), 0, Inf), 2)
    b ~ filldist(Normal(0,1), n_items)

    λ_OA ~ Dirichlet([2,2])                            
    OA ~ filldist(Categorical(λ_OA), nstudents)  

    for i in 1:nstudents 
        for j in 1:n_items
            θ[i,j] = min(m[1]*OA[i], m[2]*AF[i]) - b[j]
        end
    end
    X .~ BernoulliLogit.(θ)
end

prior_chain = sample(bn(X, AF), Prior(), 1000)

post_chain = sample(bn(X, AF), SMC(), MCMCThreads(), 2_000, 4)

begin
    p1 = density(prior_chain[Symbol("m[1]")], color=:black, linewidth=1.8, label="prior")
    density!(p1, post_chain[Symbol("m[1]")], label="posterior")
end
[Screenshot 2023-11-25 12:44:39 PM: density plot comparing the prior and posterior of m[1]]

athulsudheesh commented Dec 9, 2023

I figured out that there are some issues with the model definition, so I am closing this issue.
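(Editor's note, a hypothetical illustration only — the author did not say which model-definition issue it was.) One generic pitfall that can make a sampler appear to "remember" values across runs in an interactive session is conditioning on a stale snapshot of data or parameters: if the object the model was built from is not rebuilt after the prior or data changes, re-running the sampler still uses the old values. A minimal Base-Julia sketch of that stale-capture effect, with no Turing involved:

```julia
# Hypothetical illustration: a function built from a snapshot of the data
# keeps the old values even after the "source" variable is reassigned.
data = [5.0, 5.0, 5.0]
snapshot = copy(data)                 # "model" was conditioned on this snapshot
posterior_mean() = sum(snapshot) / length(snapshot)

data = [0.5, 0.5, 0.5]                # new values assigned later...
println(posterior_mean())             # ...but the snapshot was never rebuilt: prints 5.0
```

Restarting the session, or re-running every cell from data generation through `sample`, rules this class of problem out.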
