- A Gentle Introduction to Positional Encoding in Transformer Models, Part 1
- All you need to know about ‘Attention’ and ‘Transformers’ — In-depth Understanding — Part 2
- A Survey of Transformers
- Syntactic Knowledge-Infused Transformer and BERT Models
- Chain-of-Thought Prompting Elicits Reasoning in Large Language Models
- naab: A ready-to-use plug-and-play corpus for Farsi