Transformers: Attention Is All You Need (The Birth of Transformers)

About this content

In this first episode of AI Papers Explained, we explore one of the most influential research papers in the history of deep learning: Attention Is All You Need (Vaswani et al., 2017).

You’ll learn why the Transformer architecture replaced RNNs and LSTMs, how self-attention works, and how this paper paved the way for models like BERT, GPT, and T5.

🎙️ Hosted by Anass El Basraoui, a data scientist and AI researcher.

Topics covered:

  • Scaled dot-product attention (see the sketch after this list)
  • Multi-head attention
  • Encoder–decoder structure
  • Positional encoding
  • The legacy of the Transformer
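
For listeners who want a preview of the first topic, here is a minimal NumPy sketch of the scaled dot-product attention formula from the paper, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. The toy shapes and random inputs are illustrative assumptions, not taken from the episode.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in Vaswani et al. (2017).
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities, scaled
        scores -= scores.max(axis=-1, keepdims=True)    # numerical stability for softmax
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax over keys
        return weights @ V                              # attention-weighted sum of values

    # Toy example: a sequence of 3 tokens with d_k = 4 (hypothetical sizes).
    rng = np.random.default_rng(0)
    Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
    out = scaled_dot_product_attention(Q, K, V)
    print(out.shape)  # (3, 4): one attended vector per token

Multi-head attention, also covered in the episode, runs several such attention functions in parallel over learned linear projections of Q, K, and V and concatenates the results.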