Attention Mechanisms History
3 questions (2 foundation, 1 intermediate) · Difficulty 2-4 · Adapts to your performance

Question 1 of 3 · foundation (difficulty 2/10) · state theorem
Attention was introduced in neural machine translation by Bahdanau et al. (2014). What problem did it solve in the sequence-to-sequence architecture?
A. The high computational cost of training large-vocabulary softmax outputs
B. The difficulty of handling variable-length inputs in fully connected neural networks
C. The vanishing-gradient problem in stacked RNN layers during backpropagation
D. The bottleneck of compressing an entire source sentence into a single fixed-size encoder state
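For background on the mechanism the question refers to, here is a minimal NumPy sketch of Bahdanau-style additive attention over a sequence of encoder states. The names (`additive_attention`, `W_q`, `W_k`, `v`), the dimensions, and the random stand-ins for learned parameters are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def additive_attention(query, keys):
    # query: current decoder state, shape (d,)
    # keys:  all encoder hidden states, shape (T, d)
    d = query.shape[0]
    rng = np.random.default_rng(0)
    # These would be learned parameters in a real model; random here.
    W_q = rng.normal(size=(d, d))
    W_k = rng.normal(size=(d, d))
    v = rng.normal(size=(d,))

    # Alignment score for each source position t:
    # e_t = v . tanh(W_q q + W_k k_t)
    scores = np.tanh(query @ W_q.T + keys @ W_k.T) @ v  # shape (T,)

    # Softmax over source positions gives the attention weights.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()

    # Context vector: a weighted combination of the encoder states.
    context = weights @ keys  # shape (d,)
    return weights, context

# Toy usage: 5 source positions, hidden size 8.
rng = np.random.default_rng(1)
w, c = additive_attention(rng.normal(size=8), rng.normal(size=(5, 8)))
print(w.round(3), c.shape)
```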