
Introduction to Transformer Models for NLP [Released: 10/22/2024]

Posted By: IrGens
.MP4, AVC, 1280x720, 30 fps | English, AAC, 2 Ch | 11h 18m | 1.8 GB
Instructor: Sinan Ozdemir

This course provides a comprehensive overview of large language models (LLMs), transformers, and the mechanisms—attention, embedding, and tokenization—that set the stage for state-of-the-art NLP models like BERT and ChatGPT to flourish.

Instructor Sinan Ozdemir helps you develop a practical, comprehensive, and functional understanding of transformer architectures and how they are used to create modern NLP pipelines. Along the way, Sinan brings theory to life with detailed illustrations, mathematical equations, and concrete examples of Python in Jupyter notebooks.
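As a taste of the attention mechanism the course covers, here is a minimal NumPy sketch of scaled dot-product attention (this example is not taken from the course notebooks; the function name and toy data are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core of a transformer layer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted mix of value vectors

# Three toy 4-dimensional token vectors used as queries, keys, and values
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # one context-aware vector per input token
```

Each output row is a mixture of all the input rows, weighted by how similar that token's query is to every token's key; stacking this operation with learned projections is what lets models like BERT and ChatGPT build contextual representations.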

Learning objectives

  • Recognize which type of transformer-based model is best for a given task.
  • Understand how transformers process text and make predictions.
  • Fine-tune transformer-based models with custom data.
  • Create actionable pipelines using fine-tuned models.
  • Deploy fine-tuned models and use them in production.
  • Leverage techniques in prompt engineering to optimize outputs from GPT-3 and ChatGPT.

