Transfer Learning
3 questions (1 foundation, 2 intermediate) · Difficulty 3-5 · Adapts to your performance
Question 1 of 3 · foundation (3/10) · conceptual
Transfer learning typically freezes earlier layers of a pretrained model and fine-tunes later ones. What is the rationale?
A. Later layers must be re-trained from random initialization for any new task to avoid catastrophic forgetting
B. Earlier layers learn generic features (edges, textures) that transfer; later layers learn task-specific features that need adaptation
C. Earlier layers have more parameters and are more expensive to fine-tune, so freezing them saves memory
D. Earlier layers produce non-differentiable outputs, making backpropagation through them numerically unstable
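The freeze-and-fine-tune setup described in the question can be sketched in PyTorch. The tiny network below is a hypothetical stand-in for a pretrained backbone (not a real pretrained model): the earlier convolutional layers play the role of generic feature extractors and are frozen, while only the final task-specific head stays trainable.

```python
import torch.nn as nn

# Illustrative stand-in for a pretrained model: earlier conv layers
# extract generic features; the final Linear layer is the task head.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),   # earlier layer: edges, textures
    nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1),  # earlier layer: mid-level patterns
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 10),                # later layer: task-specific head
)

# Freeze every module except the head: their generic features transfer as-is,
# so gradients are not computed for them during fine-tuning.
for layer in list(model.children())[:-1]:
    for p in layer.parameters():
        p.requires_grad = False

# Only the head's parameters remain trainable.
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)
```

In a real workflow the frozen portion would come from a pretrained checkpoint, and an optimizer would be constructed over `filter(lambda p: p.requires_grad, model.parameters())` so only the head is updated.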