Gradient Descent Variants
5 questions · Difficulty 2-6 (2 foundation, 3 intermediate)
foundation (2/10) · conceptual
What is gradient descent trying to do at each step?
A. Randomly perturb the parameters and keep the perturbation if the loss decreases
B. Move the parameters in the direction that decreases the loss function most rapidly (the negative gradient direction)
C. Temporarily increase the loss value in order to escape from local minima and find better solutions
D. Find the exact global minimum of the loss function in a single optimization step using second-order information
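The idea behind the correct answer (B) can be sketched in a few lines: at each step, the parameters move opposite the gradient, which is the direction of steepest local decrease in the loss. The quadratic loss and learning rate below are illustrative assumptions, not part of the quiz.

```python
import numpy as np

def loss(theta):
    # Hypothetical convex loss: L(theta) = ||theta - 3||^2, minimized at theta = 3
    return np.sum((theta - 3.0) ** 2)

def grad(theta):
    # Analytic gradient of the loss above
    return 2.0 * (theta - 3.0)

theta = np.zeros(2)  # initial parameters
lr = 0.1             # learning rate (step size), chosen for illustration

for _ in range(50):
    # One gradient descent step: move in the negative gradient direction
    theta = theta - lr * grad(theta)

print(theta)        # approaches [3., 3.]
print(loss(theta))  # approaches 0
```

Note that each step only uses first-order (gradient) information and decreases the loss locally; nothing guarantees reaching the global minimum in one step, which is why options A, C, and D are wrong as descriptions of the basic update.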