Unlock: Continuous-Time Gradient Flow (SLT View)
Gradient flow is the step-size-to-zero limit of gradient descent: an ODE on weight space, dw/dt = -∇L(w). On least squares, initialized at zero, it converges to the minimum-norm OLS solution; stopping the flow at finite time acts as implicit ridge regularization; and on overparameterized two-layer networks the mean-field limit yields the global-convergence theorems of Mei-Montanari and Chizat-Bach.
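The minimum-norm claim can be checked numerically: discretizing the gradient-flow ODE with a small Euler step (i.e. plain gradient descent with a tiny learning rate) on an underdetermined least-squares problem, initialized at zero, recovers the pseudoinverse (minimum-norm) solution. This is a minimal sketch; the problem sizes, step size, and iteration count below are illustrative choices, not anything prescribed by the theory beyond "step size small, time long".

```python
import numpy as np

rng = np.random.default_rng(0)

# Underdetermined least squares: more parameters (d) than data points (n),
# so infinitely many interpolating solutions exist.
n, d = 20, 50
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Euler-discretize the gradient-flow ODE  dw/dt = -X^T (X w - y).
# Initializing at zero is what selects the minimum-norm solution:
# the iterates never leave the row space of X.
w = np.zeros(d)
eta = 1e-3  # small step so the discrete iteration tracks the flow
for _ in range(200_000):
    w -= eta * X.T @ (X @ w - y)

# Minimum-norm OLS solution, computed directly via the pseudoinverse.
w_minnorm = np.linalg.pinv(X) @ y

print(np.linalg.norm(w - w_minnorm))  # distance shrinks toward 0
```

Initializing at a nonzero point instead adds that point's component orthogonal to the row space of X to the limit, which is one concrete way to see that gradient flow's implicit bias depends on the initialization, not just the loss.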