ML Interview Prep

What is the attention mechanism in transformers?

ML Concept: 5-7 minutes to answer

Tags: nlp, transformers, deep-learning, must-know · Category: nlp_cv
Updated Jan 18, 2026

Question

What is the attention mechanism in transformers?

Expected time to answer: 5-7 minutes

Difficulty: intermediate

Why This Is Asked: Tests understanding of the core mechanism behind modern NLP models (BERT, GPT, etc.)

Tags: nlp transformers deep-learning must-know
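
A strong answer usually anchors on the scaled dot-product attention formula from "Attention Is All You Need" (Vaswani et al., 2017), where Q, K, and V are the query, key, and value matrices and d_k is the key dimension:

```latex
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^\top}{\sqrt{d_k}}\right) V
```

The softmax over query-key similarities produces a weighting over positions, and the output is the corresponding weighted average of the value vectors; the 1/sqrt(d_k) scaling keeps the softmax from saturating at large d_k.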


Your Solution


Try solving the problem first before viewing the solution
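
Once you've attempted an answer of your own, here is a minimal NumPy sketch of scaled dot-product attention to check it against. The function names, shapes, and toy inputs are illustrative assumptions for this card, not taken from any framework's API:

```python
# Illustrative single-head scaled dot-product attention (no masking,
# no learned projections) -- a teaching sketch, not a library API.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q: (seq_q, d_k), K: (seq_k, d_k), V: (seq_k, d_v)
    d_k = Q.shape[-1]
    # Dot-product similarity of every query with every key,
    # scaled by sqrt(d_k) so the softmax does not saturate.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    # Each output vector is a weighted average of the value vectors.
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 query positions
K = rng.normal(size=(5, 4))   # 5 key positions
V = rng.normal(size=(5, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)        # (3, 4): one output vector per query
print(w.sum(axis=-1))   # attention weights sum to 1 per query
```

In an interview, it helps to narrate the shapes as you go: the (3, 5) weight matrix is exactly "how much each query attends to each key," which is the core idea the question is probing.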

