Stochastic Gradient Descent Convergence
13 questions · Difficulty 1-7
4 foundation, 7 intermediate, 2 advanced · Adapts to your performance

Question 1 of 13 · foundation (1/10) · conceptual
In gradient descent for minimization, the update rule moves parameters in which direction?
A. A random direction, chosen independently at each iteration for exploration
B. The direction of the Hessian's largest eigenvector, following the steepest curvature
C. The direction of the negative gradient, so the loss decreases at each step
D. The direction of the positive gradient, since gradients always point toward minima
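The behavior the question asks about can be checked numerically. A minimal sketch, assuming the toy objective f(x) = x² (the function, starting point, and learning rate below are illustrative choices, not part of the quiz):

```python
def f(x):
    """Toy objective: f(x) = x^2, minimized at x = 0."""
    return x * x

def grad(x):
    """Gradient of f: f'(x) = 2x."""
    return 2.0 * x

x = 5.0       # arbitrary starting point (assumed)
lr = 0.1      # learning rate / step size (assumed)

losses = [f(x)]
for _ in range(20):
    x = x - lr * grad(x)  # step opposite the gradient
    losses.append(f(x))

# Stepping against the gradient makes the loss strictly decrease here.
assert all(a > b for a, b in zip(losses, losses[1:]))
```

Flipping the sign of the update (stepping along the positive gradient) makes the same loop diverge, which is one way to distinguish the candidate answers empirically.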