Line: Block-Specific Sampling with AMWG and Slice
This example is based on the linear regression model defined in the Tutorial section. The program below illustrates use of the stand-alone amwg!() and slice!() samplers to draw different parameter blocks within the same MCMC algorithm.
Analysis Program
################################################################################
## Linear Regression
## y ~ N(b0 + b1 * x, s2)
## b0, b1 ~ N(0, 1000)
## s2 ~ invgamma(0.001, 0.001)
################################################################################
using Mamba
## Data
data = Dict{Symbol, Any}(
  :x => [1, 2, 3, 4, 5],
  :y => [1, 3, 3, 3, 5]
)
## Log-transformed unnormalized joint posterior for b0, b1, and log(s2)
logf = function(x::DenseVector)
  b0 = x[1]
  b1 = x[2]
  logs2 = x[3]
  (-0.5 * length(data[:y]) - 0.001) * logs2 -
    (0.5 * sum(abs2, data[:y] .- b0 .- b1 .* data[:x]) + 0.001) / exp(logs2) -
    0.5 * b0^2 / 1000 - 0.5 * b1^2 / 1000
end
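As a quick sanity check (a sketch, not part of the original program), the log-density can be evaluated directly at the prior mean. With b0 = b1 = 0 and log(s2) = 0, the expression above reduces to -(0.5 * sum(abs2, y) + 0.001) = -26.501. The `data` and `logf` definitions are restated so the snippet stands alone.

```julia
## Stand-alone restatement of the data and log-posterior from the example
data = Dict{Symbol, Any}(
  :x => [1, 2, 3, 4, 5],
  :y => [1, 3, 3, 3, 5]
)

logf = function(x::DenseVector)
  b0 = x[1]
  b1 = x[2]
  logs2 = x[3]
  (-0.5 * length(data[:y]) - 0.001) * logs2 -
    (0.5 * sum(abs2, data[:y] .- b0 .- b1 .* data[:x]) + 0.001) / exp(logs2) -
    0.5 * b0^2 / 1000 - 0.5 * b1^2 / 1000
end

## At the origin only the residual-sum-of-squares term survives
println(logf([0.0, 0.0, 0.0]))  # ≈ -26.501
```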
## Log-transformed unnormalized full conditional densities for the model
## parameters beta and log(s2) defined below in the MCMC simulation
logf_beta(x) = logf([x; logs2])
logf_logs2(x) = logf([beta; x])
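Note that logf_beta and logf_logs2 reference the globals logs2 and beta before those variables are defined below: Julia resolves global names when a function is called, not when it is defined, so each wrapper sees the current value of the other parameter block at sampling time. A minimal sketch of that behavior (g and offset are illustrative names, not part of the example):

```julia
## 'offset' is looked up in global scope each time g is called, so g can be
## defined before offset exists -- the same mechanism the full conditionals
## above rely on.
g(x) = x + offset

offset = 1.0
println(g(2.0))   # 3.0

offset = 10.0
println(g(2.0))   # 12.0
```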
## MCMC simulation
n = 10000
burnin = 1000
sim = Chains(n, 3, names = ["b0", "b1", "s2"])
beta = AMWGVariate([0.0, 0.0], 1.0, logf_beta)
logs2 = SliceMultivariate([0.0], 5.0, logf_logs2)
for i in 1:n
  sample!(beta, adapt = (i <= burnin))
  sample!(logs2)
  sim[i, :, 1] = [beta; exp.(logs2)]
end
describe(sim)
Results
Iterations = 1:10000
Thinning interval = 1
Chains = 1
Samples per chain = 10000
Empirical Posterior Estimates:
          Mean          SD       Naive SE          MCSE        ESS
b0  0.64401798  0.99315634   0.0099315634   0.060725564   267.4805
b1  0.78985612  0.29790444   0.0029790444   0.017888106   277.3481
s2  1.20785292  2.96033511   0.0296033511   0.062566344  2238.7222
Quantiles:
           2.5%        25.0%      50.0%      75.0%      97.5%
b0  -1.33127385  0.075527035  0.6403226  1.1902679  2.7665517
b1   0.16055439  0.625740352  0.7923275  0.9527029  1.3903861
s2   0.16673189  0.381185645  0.6538295  1.2373814  5.6065938
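For reference, the posterior means of b0 and b1 can be compared against closed-form ordinary least-squares estimates for the same data (a plain-Julia sketch, not part of the original program; the `Statistics` standard library is assumed for `mean`):

```julia
using Statistics

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.0, 3.0, 3.0, 3.0, 5.0]

## Closed-form least-squares slope and intercept
b1 = sum((x .- mean(x)) .* (y .- mean(y))) / sum(abs2, x .- mean(x))  # 0.8
b0 = mean(y) - b1 * mean(x)                                           # 0.6

## Unbiased residual variance estimate
s2 = sum(abs2, y .- b0 .- b1 .* x) / (length(y) - 2)                  # ≈ 0.533
```

These agree with the empirical posterior means above (0.644 for b0 and 0.790 for b1) up to Monte Carlo error and the influence of the diffuse priors.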