Run a focused checkpoint.
Answer what you know, skip what you do not, and let the run turn weak signals into a short review path.
Scope
Run at a glance
Starts with lower-friction anchors, then checks the concentration-to-learning-theory bridge.
Focused checkpoint
Ready for the checkpoint run?
Answer what you know and skip what you do not. Misses adjust the remaining order, and the result becomes a short review path for this checkpoint.
Linear algebra foundations
A curated 10-question diagnostic spine for the LA-for-ML basics: algebraic identities, transpose and inverse rules, trace, determinant, linear independence, the column-space/null-space picture of Ax=b, the spectral theorem, and rank-nullity.
Learner can recognise standard matrix identities, compute small determinants and 2×2 eigenvalues, reason about Ax=b solvability via the column space, distinguish the zero-, one-, and infinitely-many-solution regimes, apply the spectral theorem to symmetric matrices, and use rank-nullity to bound dimensions.
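The column-space picture of Ax=b above can be sketched numerically: b is in the column space of A exactly when augmenting A with b does not raise the rank, and rank-nullity then separates the unique and infinitely-many-solution cases. A minimal NumPy sketch, with a hypothetical helper name `solvability` (not part of any library):

```python
import numpy as np

def solvability(A, b):
    """Classify Ax = b via ranks: b in col(A) iff rank([A | b]) == rank(A)."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    r = np.linalg.matrix_rank(A)
    r_aug = np.linalg.matrix_rank(np.hstack([A, b]))
    n = A.shape[1]  # number of unknowns
    if r_aug > r:
        return "no solution"  # b lies outside the column space
    # rank-nullity: dim null(A) = n - r, so r == n means a unique solution
    return "unique" if r == n else "infinitely many"

print(solvability([[1, 0], [0, 1]], [1, 2]))  # → unique
print(solvability([[1, 2], [2, 4]], [1, 2]))  # consistent but rank-deficient → infinitely many
print(solvability([[1, 2], [2, 4]], [1, 3]))  # → no solution
```

This is a teaching sketch, not a robust solver; `matrix_rank` uses an SVD tolerance, which is fine for small well-scaled examples like the ones the diagnostic asks about.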
How this run adapts
Review focus
- matrix algebra reverse-order rules
- determinant and trace properties
- column-space and null-space picture of Ax=b
- spectral theorem and rank-nullity
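The first two review-focus items can be checked numerically in a few lines. A minimal NumPy sketch (random matrices chosen by assumption; a random Gaussian matrix is invertible with probability 1) verifying the reverse-order rules and the basic trace and determinant properties:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Reverse-order rules: transpose and inverse of a product flip the factors.
assert np.allclose((A @ B).T, B.T @ A.T)
assert np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A))

# Trace is cyclic; determinant is multiplicative.
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
```

Spot-checking identities this way is a quick sanity test, but the diagnostic questions expect the symbolic rules themselves.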
Could unlock
- Matrix Shape Debugger
- Linear System Solvability Checklist
Optional self-check
Leave blank if you want the neutral ramp. Pick only the areas where you have a strong signal.
Vectors, matrices, derivatives, notation.
Distributions, estimators, likelihood, concentration.
Gradients, convexity, SGD, Adam, proximal ideas.
ERM, VC, PAC, Rademacher, generalization.
Deep nets, attention, transformers, value functions.
Moving ideas across topics and checking assumptions.
This focused run uses the gold questions attached to the checkpoint. Sign in to attach it to your profile.
Browser-only run
This diagnostic will be saved in this browser only; it will not create account-level learning events. Sign in first if you want the run to count toward your profile.