Episodes

  • Practice Exam Review
    2025/04/26

    1 hr
  • Final Exam Review
    2025/04/17
    1 hr 2 min
  • Logical Representations of Sentence Meaning, Semantic Role Labeling & Information Extraction
    2025/04/03

    In this module, we'll continue our exploration of the linguistic analysis of sentences, turning from their structure, which we covered in the parsing module, to their meaning. To do so, we'll cover logical representations of sentence meaning, semantic role labeling, and information extraction.

    23 min
  • Parsing and Dependency Parsing
    2025/03/31

    In this module, we'll delve deeper into the linguistics side of natural language processing. We'll look at different approaches to parsing and learn about the structure of sentences from a linguistics perspective. (A minimal dependency-parsing sketch appears after the episode list.)

    30 min
  • Dialogue Systems, Chatbots & Question Answering
    2025/03/31

    In this module, we will delve into two related NLP topics: dialogue systems and question answering.

    28 min
  • Machine Translation
    2025/03/31

    In this module, we will go over machine translation, one of the most important NLP applications, the challenges it involves, and how to evaluate machine translation models. (A short BLEU evaluation sketch appears after the episode list.)

    29 min
  • Prompt Engineering, Instruction Following, and Using GPT
    2025/03/30

    In this module, we will delve into some of the capabilities of cutting-edge pre-trained language models. We will explore the vital concepts of prompt engineering and instruction following. We'll first discuss the pre-train, prompt, and predict paradigm, an approach that questions whether fine-tuning is even necessary. Afterwards, we'll cover several advanced prompting techniques, such as few-shot learning, instruction following, and chain-of-thought prompting. These advanced techniques will give us more efficient and more creative ways to harness the full potential of pre-trained language models. (A prompt-construction sketch appears after the episode list.)

    17 min
  • Encoder-Decoders, BERT and Fine-tuning
    2025/03/17

    In this module, we will cover encoder-decoder models, BERT, fine-tuning, and masked language models. Understanding these will give you a solid grasp of state-of-the-art NLP models and of why pre-trained large language models have become so important. (A minimal masked-language-model sketch appears after the episode list.)

    22 min
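
As a companion to the parsing episode, here is a minimal dependency-parsing sketch in Python using spaCy. It assumes the en_core_web_sm model is installed, and the example sentence is our own:

    import spacy

    # Load a small English pipeline (install it first with:
    #   python -m spacy download en_core_web_sm).
    nlp = spacy.load("en_core_web_sm")

    doc = nlp("The professor explained the parse tree clearly.")

    # Each token points to its syntactic head via a labeled dependency arc.
    for token in doc:
        print(f"{token.text:<10} --{token.dep_}--> {token.head.text}")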
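
For the machine translation episode, BLEU is one common automatic evaluation metric. A minimal sketch using the sacrebleu library, with invented example sentences:

    import sacrebleu

    # One system output and one aligned list of reference translations.
    hypotheses = ["the cat sat on the mat"]
    references = [["the cat is sitting on the mat"]]

    # corpus_bleu scores n-gram overlap between hypotheses and references.
    bleu = sacrebleu.corpus_bleu(hypotheses, references)
    print(f"BLEU = {bleu.score:.1f}")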
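
To make the prompting episode concrete, here is a sketch of how a few-shot, chain-of-thought prompt might be assembled. The worked example and question are invented, and the call to an actual language model API is omitted:

    # One worked example whose answer spells out its reasoning steps
    # (the chain of thought), followed by the new question.
    examples = [
        ("Roger has 5 tennis balls. He buys 2 cans of 3 balls each. "
         "How many balls does he have now?",
         "Roger starts with 5 balls. 2 cans of 3 balls is 6 balls. "
         "5 + 6 = 11. The answer is 11."),
    ]
    question = "A library has 23 books and buys 14 more. How many does it have?"

    prompt = ""
    for q, a in examples:
        prompt += f"Q: {q}\nA: {a}\n\n"
    prompt += f"Q: {question}\nA: Let's think step by step."

    print(prompt)  # this string would be sent to the model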
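
Finally, for the BERT episode, a minimal masked-language-model sketch using the Hugging Face transformers fill-mask pipeline (model weights download on first run; the sentence is invented):

    from transformers import pipeline

    # BERT was pre-trained to predict tokens hidden behind [MASK].
    unmasker = pipeline("fill-mask", model="bert-base-uncased")

    # Print the top predictions for the masked position.
    for pred in unmasker("Natural language [MASK] is a fascinating field."):
        print(f"{pred['token_str']:>12}  score={pred['score']:.3f}")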