Commit f5a2a3c: gibbs sampling
efmanu committed Feb 28, 2021 (1 parent: 6823736)
Showing 8 changed files with 627 additions and 175 deletions.

docs/src/compare.md (231 changes: 161 additions, 70 deletions)

### Comparison 1

This example compares the mean of samples generated using the **GibbsSampler** and **AdvancedMH** sampling methods. A single parameter group with one variable is considered. The proposal distribution is `MvNormal(zeros(2), 1.0)`, and the variable is sampled independently using the `MH` sampler.

```julia
#use packages
using GibbsSampler, Distributions
using AdvancedMH, MCMCChains

#define MCMC samplers
alg = [MH(), adHMC()]

#define variable groups and sampling methods
sample_alg = Dict(
    :n_grp => 1,
    1 => Dict(
        :type => :ind,
        :n_vars => 1,
        1 => Dict(
            :proposal => MvNormal(zeros(2),1.0),
            :n_eles => 2,
            :alg => 1
        )
    )
)
prior = [MvNormal([2.0,3.0],1.0)]
logJoint(params) = sum(logpdf.(prior, params))

#sample using the Gibbs sampler
chn = gibbs(alg, sample_alg, logJoint, itr = 10000, chain_type = :mcmcchain)

#sample using AdvancedMH for comparison
proposalmh = [MvNormal(zeros(2),1.0)]
model = DensityModel(logJoint)
spl = RWMH(proposalmh)
chain = sample(model, spl, 10000; chain_type = Vector{NamedTuple})

#show the mean of each element
p1 = [chain[i][:param_1][1] for i in 1:length(chain)]
@show mean(p1) mean(chn["param[1][1]"])

p2 = [chain[i][:param_1][2] for i in 1:length(chain)]
@show mean(p2) mean(chn["param[1][2]"])
```

The results are as below:
```
mean(p1) = 2.0386819762387196
mean(chn["param[1][1]"]) = 1.9957525645926026
mean(p2) = 3.003012417905884
mean(chn["param[1][2]"]) = 3.0387702707202373
```
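Because `logJoint` contains no likelihood term, both samplers here simply explore the prior. A minimal random-walk Metropolis sketch in Python (the function names, step size, and iteration count are illustrative choices, not part of either package) shows why the sample mean settles near the prior mean:

```python
import math
import random

random.seed(0)

# Log-density of Normal(2.0, 3.0): the prior alone, since no likelihood is used.
def log_target(x):
    return -0.5 * ((x - 2.0) / 3.0) ** 2 - math.log(3.0 * math.sqrt(2.0 * math.pi))

# Random-walk Metropolis: propose x' = x + eps, accept with prob min(1, p(x')/p(x)).
def rwmh(log_p, x0, n_iter, step=2.0):
    x, samples = x0, []
    for _ in range(n_iter):
        prop = x + random.gauss(0.0, step)
        if math.log(random.random()) < log_p(prop) - log_p(x):
            x = prop
        samples.append(x)
    return samples

samples = rwmh(log_target, 0.0, 20000)
est = sum(samples) / len(samples)
print(round(est, 1))  # approximately 2.0, the prior mean
```

The same mechanism underlies the `RWMH` run above; more iterations or a better-tuned step size tighten the estimate further.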

### Comparison 2

This example compares the mean of samples generated using the **GibbsSampler** and **AdvancedMH** sampling methods. A single group with two variables is considered. The proposal distributions are `MvNormal(zeros(2), 1.0)` and `Normal(0.0, 1.0)` respectively, and both variables are sampled independently using the `MH` sampler.

```julia
#reuses `alg` from Comparison 1
sample_alg = Dict(
    :n_grp => 1,
    1 => Dict(
        :type => :ind,
        :n_vars => 2,
        1 => Dict(
            :proposal => MvNormal(zeros(2),1.0),
            :n_eles => 2,
            :alg => 1
        ),
        2 => Dict(
            :proposal => Normal(0.0,1.0),
            :n_eles => 1,
            :alg => 1
        )
    )
)
prior = [MvNormal([2.0,3.0],1.0), Normal(4.0, 1.0)]
logJoint(params) = sum(logpdf.(prior, params))

#sample using the Gibbs sampler
chn = gibbs(alg, sample_alg, logJoint, itr = 10000, chain_type = :mcmcchain)

#sample using AdvancedMH for comparison
proposalmh = [MvNormal(zeros(2),1.0), Normal(0.0,1.0)]
model = DensityModel(logJoint)
spl = RWMH(proposalmh)
chain = sample(model, spl, 10000; chain_type = Vector{NamedTuple})

p1 = [chain[i][:param_1][1] for i in 1:length(chain)]
@show mean(p1) mean(chn["param[1][1]"])

p2 = [chain[i][:param_1][2] for i in 1:length(chain)]
@show mean(p2) mean(chn["param[1][2]"])

p3 = [chain[i][:param_2] for i in 1:length(chain)]
@show mean(p3) mean(chn["param[2][1]"])
```

### Comparison 3

This example compares the mean of samples generated using the **GibbsSampler** and **AdvancedMH** sampling methods. A single group with two variables is considered, with proposal distributions `MvNormal(zeros(2), 1.0)` and `Normal(0.0, 1.0)` respectively. The group is sampled together as a dependent block (`:type => :dep`), using the second sampler in `alg` (`:alg => 2`).

```julia
sample_alg = Dict(
:n_grp => 1,
1 => Dict(
:type => :dep,
:n_vars => 2,
:alg => 2,
1 => Dict(
:proposal => MvNormal(zeros(2),1.0),
:n_eles => 2
),
2 => Dict(
:proposal => Normal(0.0,1.0),
:n_eles => 1
)
)
)
prior = [MvNormal([2.0,3.0],1.0), Normal(4.0, 1.0)]
logJoint(params) = sum(logpdf.(prior, params))
#sample using gibbs sampler
chn = gibbs(alg, sample_alg, logJoint, itr = 10000, chain_type = :mcmcchain)

proposalmh = [MvNormal(zeros(2),1.0), Normal(0.0,1.0)]
model = DensityModel(logJoint)
spl = RWMH(proposalmh)
chain = sample(model, spl, 10000; chain_type=Vector{NamedTuple})

p1 = [chain[i][:param_1][1] for i in 1:length(chain)]
@show mean(p1) mean(chn["param[1][1]"])

p2 = [chain[i][:param_1][2] for i in 1:length(chain)]
@show mean(p2) mean(chn["param[1][2]"])

p3 = [chain[i][:param_2] for i in 1:length(chain)]
@show mean(p3) mean(chn["param[2][1]"])
```
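The independent versus dependent grouping can be sketched as a toy Metropolis-within-Gibbs loop in Python (the target, block sizes, and helper names are invented for illustration and are not the GibbsSampler.jl internals): an `:ind`-style sweep updates each block in turn, while a `:dep`-style sweep proposes new values for all blocks of the group jointly.

```python
import math
import random

random.seed(1)

# Separable toy target: block 1 ~ (Normal(2,1), Normal(3,1)), block 2 ~ (Normal(4,1),).
MEANS = [[2.0, 3.0], [4.0]]

def log_target(blocks):
    return sum(
        -0.5 * (x - m) ** 2
        for blk, mus in zip(blocks, MEANS)
        for x, m in zip(blk, mus)
    )

def mh_step(blocks, idx_set):
    """Propose new values for the blocks in idx_set, keeping the others fixed."""
    prop = [list(b) for b in blocks]
    for i in idx_set:
        prop[i] = [x + random.gauss(0.0, 1.0) for x in prop[i]]
    if math.log(random.random()) < log_target(prop) - log_target(blocks):
        return prop
    return blocks

def gibbs_sweeps(n_iter, dependent):
    blocks = [[0.0, 0.0], [0.0]]
    trace = []
    for _ in range(n_iter):
        if dependent:          # :dep style, one joint proposal for every block
            blocks = mh_step(blocks, [0, 1])
        else:                  # :ind style, each block updated in turn
            blocks = mh_step(blocks, [0])
            blocks = mh_step(blocks, [1])
        trace.append([x for b in blocks for x in b])
    return trace

for dep in (False, True):
    trace = gibbs_sweeps(20000, dep)
    means = [sum(col) / len(trace) for col in zip(*trace)]
    print([round(m, 1) for m in means])  # both runs approach [2.0, 3.0, 4.0]
```

Both strategies target the same distribution; they differ in how the proposal is structured, which is exactly what `:type => :ind` versus `:type => :dep` controls above.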

### Comparison 4

This example compares the mean of samples generated using the **GibbsSampler** and **AdvancedMH** sampling methods. Two groups are considered, each with one variable. The proposal distributions are `MvNormal(zeros(2), 1.0)` and `MvNormal(zeros(3), 1.0)` respectively, and each group is sampled independently using the `MH` sampler.

```julia
sample_alg = Dict(
:n_grp => 2,
1 => Dict(
:type => :ind,
:n_vars => 1,
1 => Dict(
:proposal => MvNormal(zeros(2),1.0),
:n_eles => 2,
:alg => 1
)
),
2 => Dict(
:type => :ind,
:n_vars => 1,
1 => Dict(
:proposal => MvNormal(zeros(3),1.0),
:n_eles => 3,
:alg => 1
)
)
)
prior = [MvNormal([2.0,3.0],1.0), MvNormal([4.0,3.0, 5.0],1.0)]
logJoint(params) = sum(logpdf.(prior, params))
#sample using gibbs sampler
chn = gibbs(alg, sample_alg, logJoint, itr = 10000, chain_type = :mcmcchain)

proposalmh = [MvNormal(zeros(2),1.0), MvNormal(zeros(3),1.0)]
model = DensityModel(logJoint)
spl = RWMH(proposalmh)
chain = sample(model, spl, 10000; chain_type=Vector{NamedTuple})

p1 = [chain[i][:param_1][1] for i in 1:length(chain)]
@show mean(p1) mean(chn["param[1][1]"])

p2 = [chain[i][:param_1][2] for i in 1:length(chain)]
@show mean(p2) mean(chn["param[1][2]"])

p3 = [chain[i][:param_2][1] for i in 1:length(chain)]
@show mean(p3) mean(chn["param[2][1]"])

p4 = [chain[i][:param_2][2] for i in 1:length(chain)]
@show mean(p4) mean(chn["param[2][2]"])

p5 = [chain[i][:param_2][3] for i in 1:length(chain)]
@show mean(p5) mean(chn["param[2][3]"])
```
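The `:n_eles` entries determine how a flat sample vector maps back onto groups. A small Python sketch (the `layout` list and `split_groups` helper are hypothetical, mirroring Comparison 4's groups of 2 and 3 elements) of recovering per-element values:

```python
# Hypothetical layout mirroring Comparison 4: group 1 has 2 elements, group 2 has 3.
layout = [2, 3]

def split_groups(flat, layout):
    """Split a flat parameter vector into groups of the given sizes."""
    groups, pos = [], 0
    for n in layout:
        groups.append(flat[pos:pos + n])
        pos += n
    return groups

# A tiny "chain" of flat samples; per-element means are what the @show lines compare.
chain = [[2.1, 2.9, 4.2, 3.1, 5.0], [1.9, 3.1, 3.8, 2.9, 5.2]]
grouped = [split_groups(s, layout) for s in chain]
mean_g2_e3 = sum(s[1][2] for s in grouped) / len(grouped)
print(mean_g2_e3)  # mean of group 2, element 3
```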
docs/src/examples.md (95 changes: 41 additions, 54 deletions)

The [GibbsSampler.jl](https://github.com/efmanu/GibbsSampler.jl) package allows the use of different MCMC samplers, such as `MH()`, `adHMC()`, and `adNUTS()`, for each parameter group in Gibbs sampling.

### Use of AdvancedMH as MCMC sampler
The `MH()` struct defined in the [GibbsSampler.jl](https://github.com/efmanu/GibbsSampler.jl) package is used to select the MCMC sampler for each parameter in Gibbs sampling.


#### Example
```julia
#use packages
using GibbsSampler
using Distributions

#define MCMC samplers
alg = [MH(), adHMC()]

#define sample_alg parameter
sample_alg = Dict(
    :n_grp => 2,
    1 => Dict(
        :type => :ind,
        :n_vars => 2,
        1 => Dict(
            :proposal => MvNormal(zeros(2),1.0),
            :n_eles => 2,
            :alg => 1
        ),
        2 => Dict(
            :proposal => Normal(0.0,1.0),
            :n_eles => 1,
            :alg => 1
        )
    ),
    2 => Dict(
        :type => :dep,
        :n_vars => 2,
        :alg => 2,
        1 => Dict(
            :proposal => MvNormal(zeros(3),1.0),
            :n_eles => 3
        ),
        2 => Dict(
            :proposal => Normal(0.0,1.0),
            :n_eles => 1
        )
    )
)
#define prior distribution
prior = [MvNormal([1.0,2.0],1.0), Normal(2.0,1.0), MvNormal([2.0,4.0,3.0],1.0), Normal(-1.0,1.0)]

#define logjoint function
logJoint(params) = sum(logpdf.(prior, params))

#sample
chn = gibbs(alg, sample_alg, logJoint, itr = 10000, chain_type = :mcmcchain)
```
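To make the nested `sample_alg` layout concrete, here is a hedged Python sketch (string keys stand in for the Julia symbols, and `sampler_plan` is an invented helper, not part of GibbsSampler.jl) that resolves which sampler index handles each variable:

```python
# Config mirroring the example above; string keys replace the Julia symbols.
sample_alg = {
    "n_grp": 2,
    1: {"type": "ind", "n_vars": 2,
        1: {"proposal": "MvNormal(zeros(2),1.0)", "n_eles": 2, "alg": 1},
        2: {"proposal": "Normal(0.0,1.0)", "n_eles": 1, "alg": 1}},
    2: {"type": "dep", "n_vars": 2, "alg": 2,
        1: {"proposal": "MvNormal(zeros(3),1.0)", "n_eles": 3},
        2: {"proposal": "Normal(0.0,1.0)", "n_eles": 1}},
}

def sampler_plan(cfg):
    """List (group, var, n_eles, alg index) for every variable.

    For "dep" groups the alg is set once at group level; for "ind" it is per variable.
    """
    plan = []
    for g in range(1, cfg["n_grp"] + 1):
        grp = cfg[g]
        for v in range(1, grp["n_vars"] + 1):
            var = grp[v]
            alg = grp["alg"] if grp["type"] == "dep" else var["alg"]
            plan.append((g, v, var["n_eles"], alg))
    return plan

print(sampler_plan(sample_alg))
# [(1, 1, 2, 1), (1, 2, 1, 1), (2, 1, 3, 2), (2, 2, 1, 2)]
```

Group 1's variables each name `alg` entry 1 (`MH()`), while group 2 names entry 2 (`adHMC()`) once for the whole dependent block.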
