# Line: Approximate Bayesian Computation

A simple example demonstrating the Approximate Bayesian Computation (ABC) sampler within the MCMC framework, based on the linear regression model defined in the Tutorial section. ABC sampling is applied separately to the `:beta` and `:s2` parameter blocks. Different summary statistics are specified for the two blocks to illustrate the range of functions that can be used; in practice, it is more common to use the same data summaries for all ABC-sampled parameters.

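Before the full program, the core idea can be sketched in a few lines. This is a hypothetical helper, not Mamba's internal implementation: a candidate parameter value is kept only when summaries of data simulated under it fall within a tolerance `eps` of the observed-data summaries (the `abc_accepts`, `simulate`, and `summarize` names are illustrative assumptions).

```julia
using LinearAlgebra

# Minimal sketch of the ABC accept/reject idea (hypothetical helper,
# not Mamba's internal code): a candidate is accepted only if the
# distance between simulated-data and observed-data summaries is
# within the tolerance `eps`.
function abc_accepts(observed, simulate, summarize, eps)
    simdata = simulate()                                 # draw a pseudo-dataset
    d = norm(summarize(simdata) - summarize(observed))   # summary distance
    return d <= eps
end
```

Mamba's `ABC` sampler additionally smooths this hard cutoff with a kernel (e.g. `Normal` or `Epanechnikov` below), so near-misses receive partial weight rather than outright rejection.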
## Analysis Program

```julia
using Mamba

## Data
line = Dict{Symbol, Any}(
  :x => [1, 2, 3, 4, 5],
  :y => [1, 3, 3, 3, 5]
)

line[:xmat] = [ones(5) line[:x]]

## Model Specification
model = Model(
  y = Stochastic(1,
    (xmat, beta, s2) -> MvNormal(xmat * beta, sqrt(s2)),
    false
  ),
  beta = Stochastic(1, () -> MvNormal(2, sqrt(100))),
  s2 = Stochastic(() -> InverseGamma(0.01, 0.01))
)

## Initial Values
inits = [
  Dict{Symbol, Any}(
    :y => line[:y],
    :beta => rand(Normal(0, 1), 2),
    :s2 => rand(Gamma(1, 1))
  )
  for i in 1:3
]

## Tuning Parameters
scale1 = [0.5, 0.25]
summary1 = identity
eps1 = 0.5

scale2 = 0.5
summary2 = x -> [mean(x); sqrt(var(x))]
eps2 = 0.1

## User-Defined Sampling Scheme
scheme = [
  ABC(:beta, scale1, summary1, eps1, kernel=Normal, maxdraw=100, nsim=3),
  ABC(:s2,   scale2, summary2, eps2, kernel=Epanechnikov, maxdraw=100, nsim=3)
]
setsamplers!(model, scheme)

## MCMC Simulation with Approximate Bayesian Computation
sim = mcmc(model, line, inits, 10000, burnin=1000, chains=3)
describe(sim)
```
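As a quick check on the tuning choices above, `summary2` maps a data vector to its mean and standard deviation, so the `:s2` block compares simulated and observed responses only through those two statistics. Applied to the observed `y` from the example:

```julia
using Statistics

# summary2 from the tuning block above: mean and standard deviation
summary2 = x -> [mean(x); sqrt(var(x))]

y = [1, 3, 3, 3, 5]    # observed responses from the example data
s = summary2(y)        # [3.0, sqrt(2.0)], i.e. mean 3.0 and sample sd ≈ 1.414
```

By contrast, `summary1 = identity` for the `:beta` block compares the full simulated and observed data vectors element-wise, with no reduction to summary statistics.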

## Results

```
Iterations = 1001:10000
Thinning interval = 1
Chains = 1,2,3
Samples per chain = 9000

Empirical Posterior Estimates:
             Mean        SD        Naive SE       MCSE        ESS
     s2  1.30743333 1.99877929 0.0121641834 0.083739029 569.73624
beta[1]  0.72349922 1.03842764 0.0063196694 0.039413390 694.16848
beta[2]  0.77469344 0.31702542 0.0019293553 0.011392989 774.30630

Quantiles:
             2.5%        25.0%      50.0%      75.0%     97.5%
     s2  0.048095084 0.23351203 0.57947788 1.45858829 7.7639321
beta[1] -1.309713807 0.12616636 0.67263204 1.27579373 3.1735176
beta[2]  0.107216316 0.59367961 0.77867235 0.95156086 1.4043715
```