Beta. Content is under active construction and has not been peer-reviewed.
Catastrophic Forgetting
3 questions · Difficulty 4-5
intermediate (4/10) · conceptual
Catastrophic forgetting occurs when a neural network loses prior knowledge while learning new tasks. What causes it mechanically?
A. Gradient updates for the new task overwrite weights important to the old task, because the network has no explicit protection for those weights
B. Batch normalization running statistics get reset between tasks
C. The network runs out of parameters as tasks accumulate, so new tasks can't fit
D. Activation functions saturate on new tasks, freezing the weights