Conditional Distributions in the GLM


© Dr. Rainer Düsing · Interactive Tools by Claude

[Interactive tool — Scenarios: choose a research scenario or explore parameters freely]
VIEW 1 — DISTRIBUTION: the conditional distribution P(Y | x = 0); the panel shows how the linear predictor η is mapped to the parameter λ.
VIEW 2 — REGRESSION: E[Y|x] as a function of x, with η = β₀ + β₁·x.
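The mapping behind these two views can be sketched numerically. In the minimal Python sketch below, β₀ = 0.5, β₁ = 0.3 and a Poisson family with log link are illustrative assumptions, not values taken from the tool:

```python
import numpy as np
from scipy.stats import poisson

beta0, beta1 = 0.5, 0.3                  # illustrative coefficients (assumption)

# View 1: the conditional distribution P(Y | x = 0)
x = 0.0
eta = beta0 + beta1 * x                  # linear predictor eta
lam = np.exp(eta)                        # log link: lambda = e^eta
pmf = poisson.pmf(np.arange(10), lam)    # P(Y = k | x = 0) for k = 0..9

# View 2: the regression curve E[Y|x] over a grid of x values
xs = np.linspace(-2, 2, 9)
mean_curve = np.exp(beta0 + beta1 * xs)  # E[Y|x] = e^(beta0 + beta1*x)
```

Because β₁ > 0 and the log link is monotone, the mean curve rises exponentially in x rather than linearly.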
ℹ GLM Conditional Distributions — Help
What will I learn here?
The core principle of the GLM: every x-value has its own conditional distribution P(Y|x). The linear predributor η = β₀ + β₁·x is transformed via the link function into the distribution parameter, which in turn determines the location and shape of the distribution.
The linear predictor η
η = β₀ + β₁·x can take any real value. The inverse of the link function maps it into the valid parameter range:

Log link: λ = e^η — always positive (Poisson, Gamma)
Logit link: p = 1/(1+e^−η) — always in (0,1) (Bernoulli, Binomial)
Identity: μ = η — the classical LM (Normal)
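These three inverse mappings are one-liners; a minimal sketch (the η values are arbitrary):

```python
import numpy as np
from scipy.special import expit

eta = np.array([-5.0, 0.0, 5.0])   # the linear predictor can be any real value

lam = np.exp(eta)   # log link: lambda = e^eta, always positive
p = expit(eta)      # logit link: p = 1/(1 + e^-eta), always in (0, 1)
mu = eta            # identity link: mu = eta, unrestricted (classical LM)
```

Note that η = 0 maps to λ = 1 under the log link and to p = 0.5 under the logit link.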
The three views
GLM families overview
Normal (OLS): continuous data, unbounded · σ constant
Poisson / Neg. Binomial: count data (0,1,2,…); NB for overdispersion
Bernoulli / Binomial: 0/1 data, or k successes out of n trials
Gamma: positive continuous data, right-skewed
ZIP / Hurdle-Poisson: count data with many zeros — two processes combined
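Each family's density or PMF is available in scipy.stats. The sketch below evaluates a few of them at a common hypothetical conditional mean μ = 2; the NB size r, the Bernoulli p, and the ZIP zero-inflation weight π are illustrative choices:

```python
from scipy.stats import norm, poisson, nbinom, bernoulli, gamma

mu = 2.0   # hypothetical conditional mean E[Y|x] at some fixed x

normal_d = norm.pdf(2.0, loc=mu, scale=1.0)       # Normal density at y = 2
pois_p = poisson.pmf(2, mu)                       # Poisson P(Y = 2)
r = 5.0                                           # NB size parameter (assumption)
nb_p = nbinom.pmf(2, r, r / (r + mu))             # Neg. Binomial with mean mu
bern_p = bernoulli.pmf(1, 0.7)                    # Bernoulli P(Y = 1)
gamma_d = gamma.pdf(2.0, a=2.0, scale=mu / 2.0)   # Gamma with mean a*scale = mu

# ZIP: a point mass at zero mixed with a Poisson (two processes combined)
pi = 0.3
zip_p0 = pi + (1 - pi) * poisson.pmf(0, mu)       # P(Y = 0) under ZIP
```

The ZIP line makes the "many zeros" idea concrete: its P(Y = 0) is always larger than that of a plain Poisson with the same λ.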
Reflection questions
Reflection questions appear at the bottom to deepen understanding — e.g. why the variance equals the mean for a Poisson distribution, or when to prefer the Negative Binomial over the Poisson.
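These two questions can be checked directly; a short sketch comparing Poisson and Negative Binomial moments at the same mean (μ = 3 and size r = 2 are illustrative values):

```python
from scipy.stats import poisson, nbinom

mu = 3.0
# Poisson: the variance is forced to equal the mean
print(poisson.mean(mu), poisson.var(mu))    # both equal mu

# Negative Binomial with the same mean but an extra dispersion parameter r:
r = 2.0
p = r / (r + mu)
print(nbinom.mean(r, p), nbinom.var(r, p))  # mean mu, variance mu + mu**2/r
```

The NB variance exceeds its mean by μ²/r, which is exactly what makes it the standard choice for overdispersed counts; as r → ∞ it collapses back to the Poisson.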
Next → GLM in 3D: the same distributions visualised as a three-dimensional landscape