Math Foundations
A focused route through the linear algebra that powers PCA, optimization, embeddings, attention, and neural networks.
Time: ~14 hours
Core loop: object → shape → operation → geometric meaning → ML use
Topics: 8 ordered topics
End state: You can read matrix-heavy ML pages without stopping at every symbol, and you know which algebraic object is doing the work.
Vectors, Matrices, and Linear Maps: treat vectors as objects and matrices as transformations, not just tables of numbers, and know how a matrix acts on a vector.
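As a minimal sketch of "a matrix acts on a vector" (illustrative NumPy, not part of the path's own materials): a 2×2 rotation matrix takes one vector to another, rather than being a passive table of numbers.

```python
import numpy as np

# A 2x2 matrix acting on a vector: rotate 90 degrees counterclockwise.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
v = np.array([1.0, 0.0])

w = A @ v  # the matrix transforms the vector
print(w)   # [0. 1.]
```

The geometric reading: the x-axis unit vector is carried to the y-axis, exactly what a quarter-turn should do.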
Multiply, transpose, invert when valid, and check dimensions before computing.
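A quick sketch of the "check dimensions before computing" habit, assuming NumPy: inner dimensions must match for a product, transposing swaps the shape, and only square full-rank matrices invert.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 4))    # 3x4
B = rng.normal(size=(4, 2))    # 4x2

# Dimension check before multiplying: inner dimensions must agree.
assert A.shape[1] == B.shape[0]
C = A @ B                      # (3,4) @ (4,2) -> (3,2)
assert C.shape == (3, 2)

# Transpose swaps the shape.
assert A.T.shape == (4, 3)

# Invert only when valid: square and full rank.
M = A @ A.T                    # 3x3, full rank for generic random A
M_inv = np.linalg.inv(M)
assert np.allclose(M @ M_inv, np.eye(3))
```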
Explain directions, variance, rank, projections, and low-dimensional structure.
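Rank and projection can be made concrete with a toy example (a NumPy sketch, not from the path itself): a rank-1 matrix whose columns all point one way, and an orthogonal projection onto that direction.

```python
import numpy as np

# A rank-1 matrix: every column is a multiple of one direction u.
u = np.array([[1.0], [2.0], [3.0]])
X = u @ np.array([[1.0, 2.0, 4.0]])    # 3x3, but only rank 1
print(np.linalg.matrix_rank(X))         # 1

# Orthogonal projection onto the span of u.
P = (u @ u.T) / (u.T @ u)
y = np.array([[1.0], [0.0], [0.0]])
y_proj = P @ y                          # component of y along u
assert np.allclose(P @ y_proj, y_proj)  # projecting twice changes nothing
```

The idempotence check at the end is the defining property of a projection: once you are in the subspace, projecting again does nothing.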
Measure vector and matrix size in ways that match ML stability arguments.
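The common size measures that show up in ML stability arguments, sketched in NumPy (illustrative only): Euclidean and L1 norms for vectors, spectral and Frobenius norms for matrices.

```python
import numpy as np

v = np.array([3.0, 4.0])
print(np.linalg.norm(v))         # Euclidean norm: 5.0
print(np.linalg.norm(v, 1))      # L1 norm: 7.0

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
# Spectral norm: the largest factor by which A can stretch a unit vector.
print(np.linalg.norm(A, 2))      # 2.0
# Frobenius norm: treat the matrix as one long vector.
print(np.linalg.norm(A, 'fro'))  # sqrt(5)
```

The spectral norm is the one that bounds how much a linear layer can amplify its input, which is why it appears in Lipschitz and stability arguments.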
Identify invariant directions and why they matter for covariance and dynamics.
Decompose a matrix into rotations, scalings, and low-rank structure.
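The SVD is the standard way to see that decomposition, sketched here in NumPy (illustrative, not the path's own code): rotation, axis-aligned scaling, rotation, plus the best low-rank truncation.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 3))

# SVD: A = U @ diag(s) @ Vt -- a rotation, an axis scaling, a rotation.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)

# Best rank-1 approximation: keep only the largest singular value.
A1 = s[0] * np.outer(U[:, 0], Vt[0])
assert np.linalg.matrix_rank(A1) == 1
```

Truncating the singular values is exactly the "low-rank structure" move: the rank-k sum is the closest rank-k matrix to A in spectral and Frobenius norm.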
See how covariance, eigenvectors, and projection become a learning method.
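The covariance-eigenvector-projection pipeline is short enough to write out. A minimal PCA sketch in NumPy on synthetic data (the data-generating choices here are illustrative, not from the path):

```python
import numpy as np

rng = np.random.default_rng(2)
# 200 points stretched along one direction, with a little noise.
t = rng.normal(size=(200, 1))
X = t @ np.array([[3.0, 1.0]]) + 0.1 * rng.normal(size=(200, 2))

# PCA: center, form the covariance, take its top eigenvector.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
vals, vecs = np.linalg.eigh(cov)
pc1 = vecs[:, -1]                # eigenvector with the largest eigenvalue

# Project onto the first principal component: 2D -> 1D.
Z = Xc @ pc1
print(Z.shape)                   # (200,)
```

Each step is one of the topics above: covariance (a symmetric matrix), its eigenvector (an invariant direction of maximal variance), and a projection onto that direction.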
Connect Jacobians, Hessians, and matrix derivatives to optimization and backprop: track how vector-valued functions change with respect to vector inputs, and use gradients without losing shape discipline.
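Shape discipline for Jacobians can be checked numerically. A sketch with a hypothetical f: R³ → R² (central differences in NumPy, for illustration only): the Jacobian is outputs-by-inputs, here 2×3.

```python
import numpy as np

# f: R^3 -> R^2, so the Jacobian is 2x3 (outputs by inputs).
def f(x):
    return np.array([x[0] * x[1], np.sin(x[2])])

def numerical_jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian, one input coordinate at a time."""
    J = np.zeros((len(f(x)), len(x)))
    for j in range(len(x)):
        d = np.zeros_like(x)
        d[j] = eps
        J[:, j] = (f(x + d) - f(x - d)) / (2 * eps)
    return J

x = np.array([1.0, 2.0, 0.0])
J = numerical_jacobian(f, x)
print(J.shape)  # (2, 3)
# Analytic check: d(x0*x1)/dx0 = x1 = 2, d(sin x2)/dx2 = cos(0) = 1.
assert np.allclose(J, [[2.0, 1.0, 0.0],
                       [0.0, 0.0, 1.0]], atol=1e-4)
```

Predicting that shape before computing is the "shape ledger" habit the closing note asks for: if you can't say the Jacobian is 2×3 in advance, the chain rule will bite during backprop.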
Do not only read the pages. For each step, write the shape ledger, answer the practice prompt, and then run a small quiz or diagnostic. The goal is operational fluency: you should be able to predict what changes before code or algebra tells you.