What will I learn here?
The complete Bayes cycle for linear regression: Prior → Likelihood → Posterior. You control the true parameters and the prior assumptions, and you immediately see how both shape the posterior.
The Model
α ~ N(μ_α, σ_α) · β ~ N(μ_β, σ_β) · σ ~ HalfNormal(s_σ)
y ~ N(α + β·x, σ) — the likelihood
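The generative model above can be sketched directly in NumPy. The hyperparameter values below are hypothetical placeholders; in the app they are set via the sidebar:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical prior hyperparameters (in the app: sidebar settings)
mu_a, s_a = 0.0, 1.0   # alpha ~ N(mu_a, s_a)
mu_b, s_b = 0.0, 1.0   # beta  ~ N(mu_b, s_b)
s_sigma = 1.0          # sigma ~ HalfNormal(s_sigma)

# One draw from the full generative model: priors -> likelihood -> data
alpha = rng.normal(mu_a, s_a)
beta = rng.normal(mu_b, s_b)
sigma = abs(rng.normal(0.0, s_sigma))   # half-normal via |N(0, s)|

x = np.linspace(-3, 3, 50)
y = rng.normal(alpha + beta * x, sigma)  # y ~ N(alpha + beta*x, sigma)
```

Repeating this draw many times (fresh α, β, σ each time) is exactly the prior predictive simulation shown in the panels below.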
Left sidebar: true parameters (simulate the data) · priors (your assumptions about α, β, σ)
The Five Panels
- Kruschke Diagram — generative model structure: Hyperpriors → Priors → Likelihood → Data
- Prior Predictive — what do regression lines look like before we see data? (McElreath Ch. 4)
- Prior vs. Posterior — how do the data shift our beliefs about the regression line?
- Marginal Distributions — α, β, σ: Prior → Posterior in direct comparison
- MCMC Sampler — joint sampling of (β, σ) with Metropolis-Hastings; heatmap + trace plots
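The MCMC panel's joint sampling of (β, σ) can be sketched with a random-walk Metropolis-Hastings sampler. This is a minimal illustration, not the app's implementation: α is fixed at its true value to keep the sketch two-dimensional, matching the panel's (β, σ) view, and step sizes are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data with known true parameters
alpha_true, beta_true, sigma_true = 1.0, 2.0, 0.5
x = np.linspace(-2, 2, 40)
y = rng.normal(alpha_true + beta_true * x, sigma_true)

def log_post(beta, sigma):
    """Unnormalized log posterior: N(0,1) prior on beta, HalfNormal(1) on sigma."""
    if sigma <= 0:
        return -np.inf
    loglik = (-0.5 * np.sum(((y - alpha_true - beta * x) / sigma) ** 2)
              - len(y) * np.log(sigma))
    logprior = -0.5 * beta**2 - 0.5 * sigma**2
    return loglik + logprior

# Random-walk Metropolis-Hastings over (beta, sigma)
beta, sigma = 0.0, 1.0
chain = []
for _ in range(5000):
    b_prop = beta + rng.normal(0, 0.1)
    s_prop = sigma + rng.normal(0, 0.1)
    # Accept with probability min(1, posterior ratio)
    if np.log(rng.uniform()) < log_post(b_prop, s_prop) - log_post(beta, sigma):
        beta, sigma = b_prop, s_prop
    chain.append((beta, sigma))

chain = np.array(chain)[1000:]  # drop burn-in
```

The retained `chain` is what the heatmap and trace plots visualize: a 2-D histogram of (β, σ) draws and their per-iteration paths.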
CI vs. Prediction Interval
95% CI Mean (green, narrow): uncertainty about the location of the regression line — contains only parameter uncertainty.
95% PPI New Observation (dashed, wider): where will a new data point fall? Also includes the residual scatter σ.
With small σ the CI and PPI are close; with large σ the PPI is much wider.
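The distinction can be computed directly from posterior draws. For a self-contained sketch, the draws below are illustrative synthetic samples (in the app they come from the sampler); the CI summarizes the mean line α + β·x₀ alone, while the PPI additionally adds noise of scale σ:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative posterior draws (hypothetical; in the app: sampler output)
n = 4000
alpha_s = rng.normal(1.0, 0.05, n)           # posterior draws of alpha
beta_s = rng.normal(2.0, 0.05, n)            # posterior draws of beta
sigma_s = np.abs(rng.normal(0.5, 0.02, n))   # posterior draws of sigma

x0 = 1.5
mu = alpha_s + beta_s * x0       # mean line at x0: parameter uncertainty only
y_new = rng.normal(mu, sigma_s)  # new observation: adds residual scatter sigma

ci = np.percentile(mu, [2.5, 97.5])      # 95% CI for the mean
ppi = np.percentile(y_new, [2.5, 97.5])  # 95% PPI for a new data point
```

Because `y_new` stacks the residual noise σ on top of the parameter uncertainty in `mu`, the PPI is always at least as wide as the CI.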
Gaussian vs. Outlier-robust
Gaussian: y ~ N(α + β·x, σ) — standard
Outliers: y ~ t(ν, α + β·x, σ) — heavier tails, more robust against individual extreme values
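Why the t likelihood is more robust: an extreme residual is far less surprising under heavy tails, so a single outlier exerts much less pull on the fit. A stdlib-only comparison of the log-densities at a 6σ residual (ν = 4 is an illustrative choice):

```python
import math

def norm_logpdf(r):
    """Log density of a standard normal at residual r."""
    return -0.5 * r * r - 0.5 * math.log(2 * math.pi)

def t_logpdf(r, nu):
    """Log density of a standard Student-t with nu degrees of freedom."""
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi)
            - (nu + 1) / 2 * math.log1p(r * r / nu))

r = 6.0  # a residual 6 standard deviations out
lp_gauss = norm_logpdf(r)
lp_t = t_logpdf(r, nu=4)
```

Under the Gaussian, a 6σ point is astronomically unlikely, so the fit bends toward it; under the t with ν = 4 it is merely unusual, so the line stays put.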
Next → Bayesian PP Check:
Posterior Predictive Checks for model diagnostics