Bayesian Regression — Interactive

Prior · Likelihood · Posterior · Kruschke Diagram · Prior Predictive · MCMC Sampler

© Dr. Rainer Düsing · Interactive Tools by Claude

🎓 Sleep Duration → Perceived Stress (PSS, z-score) · Step 1 of 7
MODEL STRUCTURE
Kruschke Diagram
Arrows = stochastic dependence ↓  ·  Hyperpriors   Priors   Likelihood   Data
PRIOR PREDICTIVE CHECK
What does the model say before the data? (McElreath approach)
60 Prior Lines
Each line = one draw of α ~ N(μ_α, σ_α) and β ~ N(μ_β, σ_β).
Large σ_α or σ_β → many plausible worlds.
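The prior predictive idea above can be sketched in a few lines of NumPy: draw intercepts and slopes from the priors, with no data involved, and each draw is one "plausible world." The hyperparameter values and the seed below are illustrative assumptions, not the app's actual settings.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed prior hyperparameters (mu/sd values are placeholders)
mu_a, sd_a = 0.0, 1.0   # prior for intercept alpha
mu_b, sd_b = 0.0, 1.0   # prior for slope beta

# Draw 60 plausible regression lines from the priors alone (no data used)
n_lines = 60
alphas = rng.normal(mu_a, sd_a, n_lines)
betas = rng.normal(mu_b, sd_b, n_lines)

x = np.linspace(-2, 2, 50)                          # z-scored predictor grid
prior_lines = alphas[:, None] + betas[:, None] * x  # one row per "world"
```

Widening sd_a or sd_b makes the bundle of lines fan out, which is exactly the "many plausible worlds" effect described above.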
PRIOR → POSTERIOR UPDATE
How the data update the prior
Prior Pred. · Post. Pred. · Post. Median · True Line
Intercept α Prior Post.
Slope β Prior Post.
Residual SD σ Prior Post.
P(θ|y) ∝ P(y|θ)·P(θ)
Narrow posterior curve = more certainty.
With a tight prior, the posterior shifts toward the prior.
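The update rule P(θ|y) ∝ P(y|θ)·P(θ) can be made concrete with a 1-D grid approximation for the slope β, holding α = 0 and σ = 0.5 fixed for illustration (these fixed values, the seed, and the grid are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data; alpha = 0 and sigma = 0.5 held fixed for a 1-D example
true_beta = -0.8
x = rng.normal(size=30)
y = true_beta * x + rng.normal(0, 0.5, size=30)

beta_grid = np.linspace(-3, 3, 601)
dx = beta_grid[1] - beta_grid[0]
log_prior = -0.5 * (beta_grid / 1.0) ** 2              # beta ~ N(0, 1)
resid = y[None, :] - beta_grid[:, None] * x[None, :]
log_lik = -0.5 * np.sum((resid / 0.5) ** 2, axis=1)    # y ~ N(beta*x, 0.5)

# Posterior on the log scale, then normalize to a density on the grid
log_post = log_prior + log_lik
post = np.exp(log_post - log_post.max())
post /= post.sum() * dx

post_mean = np.sum(beta_grid * post) * dx
```

With n = 30 informative observations the likelihood dominates the N(0, 1) prior and post_mean lands near the true slope; shrinking the prior SD pulls it back toward 0.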
MCMC — METROPOLIS-HASTINGS
How the sampler explores the posterior space
Heatmap = log P(β,σ|y)  ·  ● accept.   ● reject.   ✕ true   ● current
Iter: 0 Acceptance:
Trace Plot β  — true
Histogram β  (post burn-in)
Trace Plot σ  — true
Histogram σ  (post burn-in)
Metropolis Step:
1. Propose β*~N(β,0.15²), log σ*~N(log σ,0.12²)
2. r = P(β*,σ*|y) / P(β,σ|y)
3. Accept if U(0,1) < min(1,r)
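The three Metropolis steps above can be sketched as a minimal sampler in Python. The simulated dataset, the weak prior on β, the flat prior on σ, and the iteration counts are assumptions for illustration; the proposal scales (0.15 for β, 0.12 for log σ) match the listing above. Because σ is proposed on the log scale, the acceptance ratio needs a Jacobian term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (alpha fixed at 0 for a 2-parameter posterior over beta, sigma)
x = rng.normal(size=30)
y = -0.8 * x + rng.normal(0, 0.5, size=30)

def log_post(beta, sigma):
    # log P(beta, sigma | y) up to a constant; flat prior on sigma assumed
    if sigma <= 0:
        return -np.inf
    ll = -len(y) * np.log(sigma) - 0.5 * np.sum((y - beta * x) ** 2) / sigma**2
    lp = -0.5 * beta**2 / 10.0          # weak N(0, sqrt(10)) prior on beta
    return ll + lp

beta, sigma = 0.0, 1.0
chain, accepted = [], 0
for it in range(5000):
    # 1. Propose beta* ~ N(beta, 0.15^2), log sigma* ~ N(log sigma, 0.12^2)
    beta_star = rng.normal(beta, 0.15)
    sigma_star = np.exp(rng.normal(np.log(sigma), 0.12))
    # 2.-3. Accept if U(0,1) < min(1, r), computed on the log scale;
    # the log(sigma*/sigma) term is the Jacobian of the log-scale proposal
    log_r = log_post(beta_star, sigma_star) - log_post(beta, sigma)
    log_r += np.log(sigma_star) - np.log(sigma)
    if np.log(rng.uniform()) < log_r:
        beta, sigma = beta_star, sigma_star
        accepted += 1
    chain.append((beta, sigma))

burn = 1000
betas = np.array([b for b, _ in chain[burn:]])  # post burn-in, as in the histogram
```

The trace of betas should wander around the true slope of −0.8, and the acceptance counter corresponds to the "Acceptance" readout in the panel.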
🎓 Tutorial — Bayesian Regression
The Example
You are analysing data from n = 30 psychology students. Predictor x: average sleep duration (z-scored). Outcome y: perceived stress (PSS — Perceived Stress Scale, Cohen et al. 1983, z-scored).

Hypothesis: more sleep → less stress (β < 0). The true effect is β = −0.8.
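A dataset like this could be simulated as follows; the true slope of −0.8 comes from the text, while the intercept, residual SD, and seed are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2024)

n = 30
true_alpha, true_beta, true_sigma = 0.0, -0.8, 0.6  # sigma value is an assumption

sleep_z = rng.normal(size=n)                 # z-scored sleep duration
stress_z = true_alpha + true_beta * sleep_z + rng.normal(0, true_sigma, size=n)

# More sleep should go with less stress: the sample correlation is negative
r = np.corrcoef(sleep_z, stress_z)[0, 1]
```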
What you will learn
In 7 guided steps you experience the complete Bayes cycle:

① Setup — configure parameters, explore the dataset
② Prior Predictive Check — plausible slopes a priori
③ Prior → Posterior Update — Bayesian learning
④ Effect of sample size n
⑤ Outlier influence under Normal likelihood
⑥ Robustness via Student-t likelihood
⑦ MCMC sampler — joint posterior β × σ
How it works
Each step tells you what to do and what to observe. The active panel is outlined in colour and the relevant step card is shown bottom-right.

Use ⚙ Apply Values to automatically set all sliders to the recommended values. You can also explore freely — the tutorial only provides guidance.
The Steps panel appears as a green box in the bottom-right corner — always visible, no scrolling needed.
ℹ Bayesian Regression — Help
What will I learn here?
The complete Bayes cycle for linear regression: Prior → Likelihood → Posterior. You control the true parameters and the prior assumptions — and immediately see how both shape the posterior.
The Model
α ~ N(μ_α, σ_α) · β ~ N(μ_β, σ_β) · σ ~ HalfNormal(s_σ)
y ~ N(α + β·x, σ) — the likelihood

Left sidebar: true parameters (simulate the data) · priors (your assumptions about α, β, σ)
The Five Panels
CI vs. Prediction Interval
95% CI Mean (green, narrow): Uncertainty about the location of the regression line — contains only parameter uncertainty.

95% PPI New Observation (dashed, wider): where will a new data point fall? Also includes residual scatter σ.

With small σ the CI and PPI are close — with large σ the PPI is much wider.
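Given posterior draws of (α, β, σ), the two intervals differ only in whether residual scatter is added. A sketch with stand-in posterior draws (in practice these would come from the sampler; all numbers below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in posterior draws (placeholders for real sampler output)
n_draw = 4000
alpha_d = rng.normal(0.0, 0.1, n_draw)
beta_d = rng.normal(-0.8, 0.1, n_draw)
sigma_d = np.abs(rng.normal(0.5, 0.05, n_draw))

x_new = 1.0                                      # one predictor value (z units)
mu_d = alpha_d + beta_d * x_new                  # uncertainty about the mean line
y_d = mu_d + rng.normal(0, 1, n_draw) * sigma_d  # plus residual scatter

ci = np.percentile(mu_d, [2.5, 97.5])            # 95% CI for the mean
ppi = np.percentile(y_d, [2.5, 97.5])            # 95% PPI for a new observation
```

Because y_d adds a σ-scaled noise term on top of mu_d, the PPI is always at least as wide as the CI, and the gap grows with σ.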
Gaussian vs. Outlier-robust
Gaussian: y ~ N(α + β·x, σ) — standard
Outliers: y ~ t(ν, α + β·x, σ) — heavier tails, more robust against individual extreme values
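The robustness claim can be checked numerically: compare how much log-density an extreme standardized residual loses under a Normal versus a Student-t(ν = 4). The residual values and ν below are illustrative assumptions.

```python
import numpy as np
from math import lgamma, log, pi

def t_logpdf(z, nu):
    # log density of a standard Student-t with nu degrees of freedom
    return (lgamma((nu + 1) / 2) - lgamma(nu / 2)
            - 0.5 * log(nu * pi)
            - (nu + 1) / 2 * np.log(1 + z**2 / nu))

def normal_logpdf(z):
    return -0.5 * z**2 - 0.5 * log(2 * pi)

# One clean standardized residual and one extreme outlier
clean, outlier = 0.5, 6.0

# Drop in log-density when moving from the clean point to the outlier:
# the Gaussian punishes the outlier far more than the heavy-tailed t does,
# so under a t likelihood the fitted line is dragged less by extreme points
gauss_penalty = normal_logpdf(clean) - normal_logpdf(outlier)
t_penalty = t_logpdf(clean, 4) - t_logpdf(outlier, 4)
```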
Next → Bayesian PP Check: Posterior Predictive Checks for model diagnostics