
Unlock: Bayesian Linear Regression

Gaussian prior, Gaussian likelihood, Gaussian posterior. The full posterior is derived by completing the square in the exponent: the posterior mean equals the ridge estimator, the predictive distribution decomposes into irreducible plus epistemic variance, and the marginal likelihood gives a closed-form criterion for hyperparameter selection. A worked numeric example with three data points carries the algebra end to end.
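The three-point example itself is not reproduced on this page, but the closed-form quantities it derives can be sketched numerically. The snippet below uses a hypothetical three-point dataset and assumed variances (`sigma2`, `tau2` are illustrative choices, not values from the article) to show the posterior, its equivalence to ridge with lambda = sigma2/tau2, the two-term predictive variance, and the closed-form log marginal likelihood:

```python
import numpy as np

# Hypothetical 3-point dataset (the article's actual numbers are not shown here):
# bias column plus one feature.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.1, 1.1, 1.9])

sigma2 = 0.25   # noise variance (assumed)
tau2 = 1.0      # prior variance (assumed); prior is w ~ N(0, tau2 * I)

# Posterior: w | y ~ N(mu_N, Sigma_N) with
#   Sigma_N = (X^T X / sigma2 + I / tau2)^{-1}
#   mu_N    = Sigma_N X^T y / sigma2
Sigma_N = np.linalg.inv(X.T @ X / sigma2 + np.eye(2) / tau2)
mu_N = Sigma_N @ X.T @ y / sigma2

# Posterior mean equals the ridge estimator with lambda = sigma2 / tau2.
lam = sigma2 / tau2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)
assert np.allclose(mu_N, w_ridge)

# Predictive distribution at a new input: irreducible noise variance
# plus epistemic variance from posterior uncertainty in w.
x_star = np.array([1.0, 1.5])
pred_mean = x_star @ mu_N
pred_var = sigma2 + x_star @ Sigma_N @ x_star

# Log marginal likelihood (evidence): y ~ N(0, sigma2 * I + tau2 * X X^T),
# usable as a closed-form criterion for choosing sigma2 and tau2.
C = sigma2 * np.eye(len(y)) + tau2 * X @ X.T
log_evidence = -0.5 * (y @ np.linalg.solve(C, y)
                       + np.linalg.slogdet(C)[1]
                       + len(y) * np.log(2 * np.pi))

print(pred_mean, pred_var, log_evidence)
```

Maximizing `log_evidence` over `sigma2` and `tau2` (e.g. on a grid) is the hyperparameter-selection procedure the blurb refers to.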

109 prerequisites · 0 mastered · 0 working · 96 gaps
Prerequisite mastery: 12%
Recommended probe

Subgradients and Subdifferentials is your weakest prerequisite with available questions. You haven't been assessed on this topic yet.

Not assessed · 5 questions
- Conjugate Priors (Infrastructure): no quiz · not assessed · 15 questions
- Ridge Regression (Foundations): not assessed · 8 questions
- Bayesian Estimation (Infrastructure): not assessed · 12 questions
