Quasi-Newton Methods
Quasi-Newton methods like BFGS approximate the Hessian using only gradient information. What practical advantage does BFGS have over exact Newton on large-scale problems?
A. BFGS avoids computing the Hessian directly and maintains an approximation using low-cost rank-two updates.
B. BFGS solves a different optimization problem than Newton, converging to a regularized minimizer instead.
C. BFGS converges faster than Newton on general convex problems, achieving cubic instead of quadratic rates.
D. BFGS is provably globally convergent without line search, unlike Newton's method which requires damping.
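The rank-two inverse-Hessian update mentioned in option A can be sketched in a few lines. The following is a minimal, illustrative BFGS loop (assuming numpy is available); the backtracking line search constants and tolerances are illustrative choices, not taken from any particular library, and a production implementation would use a Wolfe-condition line search.

```python
import numpy as np

def bfgs(f, grad, x0, iters=100, tol=1e-8):
    """Minimal BFGS sketch: maintains an inverse-Hessian approximation H
    from gradient differences only -- no Hessian is ever formed."""
    n = x0.size
    H = np.eye(n)                     # initial inverse-Hessian approximation
    x = x0.astype(float)
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                    # quasi-Newton search direction
        # Backtracking (Armijo) line search; constants are illustrative.
        t, fx, slope = 1.0, f(x), g @ p
        while f(x + t * p) > fx + 1e-4 * t * slope:
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = y @ s
        if sy > 1e-12:                # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            # Rank-two BFGS update of the inverse Hessian:
            # H <- (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bfgs(lambda x: 0.5 * x @ A @ x - b @ x,
              lambda x: A @ x - b,
              np.zeros(2))
```

On this quadratic, the minimizer satisfies the linear system A x = b, so `x_star` should match `np.linalg.solve(A, b)` to high precision, even though the loop only ever evaluated gradients.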