Newton's Method
5 questions · Difficulty 4–5

Question 1 of 5 · intermediate (4/10) · conceptual
In optimization, the Hessian matrix ∇²f(x) encodes second-order information about f near x. Which interpretation is correct?
A. The Hessian stores the gradient of the loss with respect to inputs, used for input-space perturbations like adversarial examples
B. The Hessian is always positive definite for convex functions, guaranteeing a unique global minimum in all directions
C. The Hessian counts the number of parameters in the model, with its dimension equal to the number of trainable weights
D. The Hessian encodes local curvature, with eigenvalues giving the rates of change of the gradient along principal directions
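As a concrete illustration of the concept being tested, here is a minimal NumPy sketch using a hypothetical quadratic f(x) = ½ xᵀAx (my example, not from the quiz), whose Hessian is the constant symmetric matrix A. It checks that along each eigenvector, the gradient Ax changes at a rate equal to the corresponding eigenvalue:

```python
import numpy as np

# Hypothetical quadratic f(x) = 0.5 * x^T A x; its Hessian is the
# constant symmetric matrix A, so curvature is fully described by A.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eigh(A)

# Along an eigenvector v with eigenvalue lam, a step of size t changes
# the gradient A x by t * lam * v: the eigenvalue is the rate of change
# of the gradient (the curvature) in that principal direction.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(np.round(eigvals, 4))
```

For this A the eigenvalues are (5 ± √5)/2, two distinct positive curvatures, so the quadratic bends at different rates along its two principal directions.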