Beta. Content is under active construction and has not been peer-reviewed. Report errors on GitHub.

Transformers Pulse

Five questions across seven topics. Two minutes.

Question 1 / 5 · Transformers
foundation (2/10) · state theorem
Attention was introduced in neural machine translation by Bahdanau et al. (2014). What problem did it solve in the sequence-to-sequence architecture?
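As background for the question, here is a minimal NumPy sketch of the additive (Bahdanau-style) attention score the question refers to: a decoder query is compared against each encoder hidden state via v^T tanh(W_q q + W_k k_i), and the softmaxed scores weight a context vector. All variable names and dimensions are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def additive_attention(query, keys, W_q, W_k, v):
    """Bahdanau-style additive attention (illustrative sketch).

    query: (d,) decoder hidden state
    keys:  (T, d) encoder hidden states, one per source token
    """
    # Additive score per source position: v^T tanh(W_q q + W_k k_i)
    scores = np.tanh(query @ W_q + keys @ W_k) @ v   # shape (T,)
    # Softmax over source positions (numerically stabilized)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: attention-weighted sum of encoder states
    context = weights @ keys                          # shape (d,)
    return context, weights

rng = np.random.default_rng(0)
d, T = 4, 6
query = rng.standard_normal(d)
keys = rng.standard_normal((T, d))
W_q = rng.standard_normal((d, d))
W_k = rng.standard_normal((d, d))
v = rng.standard_normal(d)
context, weights = additive_attention(query, keys, W_q, W_k, v)
```

The attention weights form a distribution over source positions, letting the decoder consult every encoder state at each step rather than a single fixed summary.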