
Attention Mechanisms History

3 questions (2 foundation, 1 intermediate) · difficulty 2-4 · adapts to your performance

Question 1 · foundation (difficulty 2/10) · state the theorem
Attention was introduced in neural machine translation by Bahdanau et al. (2014). What problem did it solve in the sequence-to-sequence architecture?
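To make the question concrete, here is a minimal sketch of the additive (Bahdanau-style) attention score the paper introduced, which lets the decoder form a fresh context vector at every step instead of relying on a single fixed-length encoding of the source sentence. The function and parameter names (`additive_attention`, `W_a`, `U_a`, `v_a`) are illustrative, not taken from any particular library.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of scores.
    e = np.exp(x - x.max())
    return e / e.sum()

def additive_attention(s_prev, h, W_a, U_a, v_a):
    """Illustrative additive attention.

    s_prev : previous decoder state, shape (d,)
    h      : encoder hidden states, shape (T, d)
    W_a    : projection of the decoder state, shape (d_attn, d)
    U_a    : projection of the encoder states, shape (d_attn, d)
    v_a    : scoring vector, shape (d_attn,)
    """
    # Score each encoder step t: e_t = v_a . tanh(W_a s_prev + U_a h_t)
    scores = np.tanh(s_prev @ W_a.T + h @ U_a.T) @ v_a   # shape (T,)
    alpha = softmax(scores)                              # attention weights, sum to 1
    context = alpha @ h                                  # weighted sum of encoder states, shape (d,)
    return context, alpha

# Tiny usage example with random weights (shapes only; values are arbitrary).
rng = np.random.default_rng(0)
T, d, d_attn = 5, 4, 8
context, alpha = additive_attention(
    rng.standard_normal(d),
    rng.standard_normal((T, d)),
    rng.standard_normal((d_attn, d)),
    rng.standard_normal((d_attn, d)),
    rng.standard_normal(d_attn),
)
```

Because `alpha` is recomputed per decoder step, the model can focus on different source positions as it generates, which is exactly the relief for the fixed-length-vector bottleneck the question asks about.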