GroundZeroAI
Transformers [ Week 1 ]
Videos
• Problems with RNNs, Embeddings and Positional Embeddings
• Why do we need attention? Bahdanau Attention and the Attention Layer
• Multi-head Attention, Layer Norm and Skip Connections
• Masked Attention and Feed Forward Networks
• Build the Transformer
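The attention videos above center on one core operation: scaled dot-product attention, where each query is compared against all keys and the resulting softmax weights mix the values. Below is a minimal pure-Python sketch of that operation (function and variable names are our own, not from the course materials), useful as a warm-up before the notebooks:

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: lists of vectors (lists of floats).
    Returns one attended output vector per query."""
    d_k = len(K[0])  # key dimension, used for the 1/sqrt(d_k) scaling
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Multi-head attention (covered in the third video) runs several of these in parallel on learned projections of Q, K, and V, then concatenates the results; masked attention zeroes out (sets to negative infinity before the softmax) the scores for future positions.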
Colab Notebooks
• Task 1
• Task 2
• Task 3
• Task 4
• Task 5
Quizzes
• Quiz 1
• Quiz 2
• Quiz 3
• Quiz 4
• Quiz 5
Level Up Challenge
• Click This