GPT-3 : Language Models are Few-Shot Learners
About this content

In this third episode of AI Papers Explained, we explore GPT-3: Language Models are Few-Shot Learners, the landmark 2020 paper from OpenAI.
Discover how scaling up model size and training data produced new emergent capabilities and marked the beginning of the large language model era.
We connect this milestone to the foundations laid by Attention Is All You Need and BERT, showing how GPT-3 ushered research into the age of general-purpose AI.
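The paper's central idea is in-context learning: instead of fine-tuning, the model is shown a handful of input/output demonstrations in its prompt and completes the pattern with no gradient updates. A minimal sketch of how such a few-shot prompt is assembled (the `build_few_shot_prompt` helper and the English-to-French pairs are illustrative, not code from the paper):

```python
def build_few_shot_prompt(examples, query):
    """Format (input, output) demonstration pairs plus a new query
    as a single prompt string for a language model to complete."""
    blocks = [f"Input: {x}\nOutput: {y}" for x, y in examples]
    # The final block leaves the output blank for the model to fill in.
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

# Hypothetical demonstrations for a translation task, in the spirit
# of the paper's few-shot translation evaluations.
examples = [
    ("cheese", "fromage"),
    ("house", "maison"),
]
prompt = build_few_shot_prompt(examples, "cat")
print(prompt)
```

No model weights change here; the "learning" happens entirely in the forward pass, conditioned on the demonstrations in the context window.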

🎙️ Source: Brown et al., OpenAI, 2020 — Language Models are Few-Shot Learners (arXiv:2005.14165)