Automatic Differentiation
1 question
Intermediate (6/10) · conceptual
Automatic differentiation comes in two modes: forward mode and reverse mode. Deep learning frameworks use reverse mode (backpropagation). Why is reverse mode preferred for neural networks?
A. Reverse mode requires substantially less memory than forward mode because it avoids storing intermediate computation-graph activations
B. Reverse mode provides more numerically stable gradient estimates than forward mode due to reduced floating-point error accumulation
C. Reverse mode computes the gradient of one scalar output with respect to all n parameters in one pass, while forward mode requires O(n) passes
D. Forward mode cannot handle nonlinear functions
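To make the forward/reverse distinction concrete, here is a minimal reverse-mode sketch. The `Var` class and its structure are illustrative assumptions, not any framework's actual API; real systems (e.g. PyTorch autograd) build the same kind of computation graph but process nodes in topological order rather than by naive recursion. The key point matching the question: one backward sweep from a single scalar output fills in the gradient for every input at once.

```python
# Minimal reverse-mode autodiff sketch (hypothetical `Var` class, illustration only).
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent_var, local_partial_derivative)
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        # One reverse sweep: accumulate d(output)/d(this node) along every path.
        # (Real frameworks traverse in topological order instead of recursing.)
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

# f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x.
x, y = Var(3.0), Var(4.0)
out = x * y + x
out.backward()              # a single backward pass from the scalar output
print(x.grad, y.grad)       # gradients for BOTH inputs, no extra passes
```

Forward mode, by contrast, would need one pass per input here (seeding dx=1 then dy=1), which is why reverse mode wins when a network has millions of parameters but a single scalar loss.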