Attention
Transformers
- [June 2017] Paper : Attention Is All You Need
- [April 2018] Explainer blog post : The Annotated Transformer
- [June 2018] Explainer blog post : The Illustrated Transformer
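The core mechanism the papers and explainers above cover is scaled dot-product attention. A minimal pure-Python sketch (illustrative only, plain lists instead of tensors; real implementations use NumPy/PyTorch and add multi-head projections and masking):

```python
import math

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V,
    as described in "Attention Is All You Need".
    Q, K, V are lists of vectors (lists of floats)."""
    d_k = len(K[0])
    # similarity scores: Q @ K^T, scaled by sqrt(d_k)
    scores = [[sum(q_i * k_i for q_i, k_i in zip(q, k)) / math.sqrt(d_k)
               for k in K] for q in Q]
    # row-wise softmax (numerically stabilized by subtracting the max)
    weights = []
    for row in scores:
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        weights.append([e / z for e in exps])
    # each output is a convex combination of the value vectors
    return [[sum(w * v[j] for w, v in zip(wrow, V))
             for j in range(len(V[0]))] for wrow in weights]

# one query attending over two key/value pairs
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
out = scaled_dot_product_attention(Q, K, V)
```

Because the query aligns with the first key, the output is pulled toward the first value vector but remains a weighted blend of both.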
BERT
- BERT explainer
- The Illustrated BERT, ELMo, and co. : How NLP cracked transfer learning
- Paper
- Multilingual BERT
XLNet
XLM - Enhancing BERT for cross-lingual language modeling
- XLM explainer
- How XLM differs from multilingual BERT
- Paper : Cross-lingual Language Model Pretraining
- Code : GitHub link
XLM-R
Blog posts
- The Annotated GPT-2 : https://amaarora.github.io/2020/02/18/annotatedGPT2.html
- Made with ML transformer tutorials : https://madewithml.com/collections/9875/transformers-tutorials/
- Attention? Attention! (Lilian Weng) : https://lilianweng.github.io/lil-log/2018/06/24/attention-attention.html
- The Illustrated Transformer : http://jalammar.github.io/illustrated-transformer/