Master the attention revolution. Understand the transformer architecture that powers GPT, BERT, and every modern language model.
Prerequisites:
Basic probability concepts
Python fundamentals; string manipulation
Intermediate: Deep Learning Using Transformers
Intermediate: Advanced Vision Applications with DL & Transformers
Advanced: Attention Mechanisms and Transformer Models Course
Beginner: Sentiment Analysis with Deep Learning using BERT
Intermediate: Transformers for Computer Vision Applications
Intermediate: Transformers and Self-Attention (DL 19)
Intermediate: Vision Transformer Quick Guide - Theory and Code in (almost) 15 min
Advanced: LLMs Mastery: Complete Guide to Transformers & Generative AI
Beginner: Fine Tuning LLM with Hugging Face Transformers for NLP
Beginner: Deep Learning: Natural Language Processing with Transformers
Beginner: Master LLM: Large Language Models with Transformers
Beginner: Data Science: Transformers for Natural Language Processing
Follow these courses in order to complete the learning path. Click on any course to enroll.
This course explores the application of Transformers in video understanding, with a focus on action recognition and instance segmentation, and covers recent developments in large-scale pre-training and multimodal learning.
Build systems and applications using advanced Computer Vision and Deep Learning techniques. The course covers Vision Transformers, object detection with Detection Transformers (RTDETR), and fine-tuning ViT models.
This course provides a comprehensive introduction to attention mechanisms and the transformer models that are foundational to modern GenAI systems. It covers self-attention, multi-head attention, and the overall transformer architecture, with real-world demos.
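The self-attention step this description mentions can be sketched in a few lines of plain Python. This is a minimal illustration only, not the course's code: the query/key/value projections are taken to be the identity and there is no multi-head split, whereas real transformer layers learn separate projection matrices for each.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(X):
    """Scaled dot-product self-attention over the rows of X.

    Simplification for clarity: Q = K = V = X (identity projections).
    """
    d = len(X[0])  # model dimension
    out = []
    for q in X:    # one query row at a time
        # q.k / sqrt(d) for every key row k
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in X]
        weights = softmax(scores)  # attention distribution over positions
        # weighted sum of the value rows
        out.append([sum(w * v[j] for w, v in zip(weights, X))
                    for j in range(d)])
    return out

# Three toy 2-dimensional token embeddings
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
Y = self_attention(X)
```

Because the softmax weights are non-negative and sum to one, each output row is a convex combination of the input rows; multi-head attention simply runs several such maps in parallel on learned projections and concatenates the results.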
A guided project on Coursera that focuses on using the powerful BERT model for sentiment analysis tasks.
A comprehensive course on vision transformers and their use cases in computer vision. You'll explore the rise of transformers and attention mechanisms and gain insights into self-attention, multi-head attention, and the pros and cons of transformers.
Transformers and Self-Attention (DL 19)
Vision Transformer Quick Guide - Theory and Code in (almost) 15 min
Welcome to "LLMs Mastery: Complete Guide to Generative AI & Transformers"! This practical course is designed to equip you with the knowledge and skills to build efficient, production-ready Large Language Models using cutting-edge technologies.

Key topics covered:
• Generative AI: the principles and applications of Generative AI in creating new data instances.
• ChatGPT & GPT-4: the workings of advanced AI models like ChatGPT and GPT-4.
• LLMs: the basics of LLMs, learning how they decode, process inputs and outputs, and how they are taught to communicate effectively.
• Encoder-decoders: the concept of encoder-decoder models in the context of Transformers.
• T5, GPT-2, BERT: hands-on experience with popular Transformer models such as T5, GPT-2, and BERT.
• Machine learning & data: the role of machine learning and data in training robust AI models.
• Advanced techniques: sophisticated training strategies such as PEFT and LoRA, managing data memory, and merging adapters.
• Specialised skills: cutting-edge training techniques, including 8-bit and 4-bit training and FlashAttention.
• Scalable solutions: advanced tools like DeepSpeed and FSDP to efficiently scale model training.

Course benefits:
• Career enhancement: position yourself as a valuable asset in tech teams, capable of tackling significant AI challenges and projects.
Do not take this course if you are an ML beginner. It is designed for those who are interested in pure coding and want to fine-tune LLMs rather than focus on prompt engineering; otherwise, you may find it difficult to follow.

Welcome to "Mastering Transformer Models and LLM Fine Tuning", a comprehensive and practical course on Natural Language Processing (NLP). The course delves deep into the world of Transformer models, fine-tuning techniques, and knowledge distillation, with a special focus on models such as Phi-2, LLaMA, and T5, and on BERT variants like DistilBERT, MobileBERT, and TinyBERT.

Course overview:
Section 1: Introduction. An overview of the course, its learning outcomes, and the resources and code files you will need throughout.
Section 2: Understanding Transformers with Hugging Face. The fundamentals of Hugging Face Transformers: pipelines, checkpoints, models, and datasets, plus Hugging Face Spaces and Auto Classes for seamless model management.
Section 3: Core Concepts of Transformers and LLMs. The architectures and key concepts behind Transformers, their applications in various NLP tasks, and an introduction to transfer learning with Transformers.
Section 4: BERT Architecture Deep Dive. A detailed exploration of BERT's architecture and its importance for context understanding, including Masked Language Modeling (MLM), Next Sentence Prediction (NSP), and BERT fine-tuning and evaluation techniques.
Section 5: Practical Fine-Tuning with BERT.
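The Masked Language Modeling objective mentioned in the BERT section can be illustrated without any library: hide a fraction of the tokens and train the model to recover them. Below is a toy sketch of the masking step only; the 15% default rate follows the BERT setup, but this is simplified (real BERT also replaces some selected tokens with random words or leaves them unchanged, and the predictor network is omitted entirely).

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, rate=0.15, seed=0):
    """Replace roughly `rate` of the tokens with [MASK].

    Returns the corrupted sequence plus the (position, original token)
    pairs a BERT-style model would be trained to predict.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    corrupted, targets = [], []
    for i, tok in enumerate(tokens):
        if rng.random() < rate:
            corrupted.append(MASK)
            targets.append((i, tok))
        else:
            corrupted.append(tok)
    return corrupted, targets

sentence = "the quick brown fox jumps over the lazy dog".split()
corrupted, targets = mask_tokens(sentence, rate=0.3)
```

Restoring the target tokens at their recorded positions reconstructs the original sentence, which is exactly the supervision signal MLM pre-training uses.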
Deep Learning is a hot topic today because of the impact it is having across several industries. One of the fields where deep learning has the most influence is Natural Language Processing. To understand why deep-learning-based NLP is so popular, it suffices to look at the domains where giving a computer the power to understand text, make sense of it, and generate it has changed our lives.

Some applications of Natural Language Processing:
• Helping people around the world learn about any topic (ChatGPT)
• Helping developers code more efficiently (GitHub Copilot)
• Automatic topic recommendation in our Twitter feeds
• Automatic neural machine translation (Google Translate)
• E-commerce search engines like Amazon's
• Grammar correction (Grammarly)

The demand for Natural Language Processing engineers is skyrocketing, and experts in this field are highly paid because of their value. However, getting started in this field isn't easy: there is so much information out there, much of it outdated, and much of it doesn't take beginners into consideration. In this course, we take you on a journey in which you'll master different concepts with a step-by-step, project-based approach. You will use TensorFlow 2 (the world's most popular library for deep learning, built by Google) and Hugging Face Transformers (the most popular NLP-focused library). We start by building very simple models (like a linear regression model for car price prediction and RNN text classifiers for movie reviews).
Unlock the power of modern Natural Language Processing (NLP) and elevate your skills with this comprehensive course on NLP with a focus on Transformers. The course guides you through the essentials of Transformer models, from understanding the attention mechanism to leveraging pre-trained models. It is divided into chapters, each covering a new concept in NLP with Transformers. Topics include:

Starting from an introduction to NLP and setting up your Python environment, you'll gain hands-on experience with text preprocessing methods, including tokenization, stemming, lemmatization, and handling special characters. You will learn how to represent text data effectively through Bag of Words, n-grams, and TF-IDF, and explore the groundbreaking Word2Vec model with practical coding exercises.

Dive deep into the workings of transformers, including self-attention, multi-head attention, and the role of position encoding. Understand the architecture of transformer encoders and decoders, and learn how to train and use these powerful models for real-world applications.

The course features projects using state-of-the-art pre-trained models from Hugging Face, such as BERT for sentiment analysis and T5 for text translation. With guided coding exercises and step-by-step project walkthroughs, you'll solidify your understanding and build your confidence in applying these models to complex NLP tasks. By the end of this course, you'll be equipped with practical skills to tackle NLP challenges and build robust solutions.
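The TF-IDF representation this description lists can be computed from scratch in a few lines. A minimal sketch, assuming raw term counts for tf and the unsmoothed idf formula log(N/df); production libraries such as scikit-learn use smoothed and normalized variants of this, so the exact numbers will differ:

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF weights for a list of tokenized documents.

    tf  = raw count of the term in the document
    idf = log(N / df), where df is the number of documents
          containing the term and N is the corpus size.
    """
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # count each term once per document
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: c * math.log(n / df[t]) for t, c in tf.items()})
    return weights

docs = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "cats and dogs".split(),
]
w = tf_idf(docs)
```

As expected, a corpus-wide word like "the" receives a lower weight than a document-specific word like "cat", which is the whole point of the idf term.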
Ever wondered how AI technologies like OpenAI ChatGPT, GPT-4, Gemini Pro, Llama 3, DALL-E, Midjourney, and Stable Diffusion really work? In this course, you will learn the foundations of these groundbreaking applications.

Hello friends! Welcome to Data Science: Transformers for Natural Language Processing. Ever since Transformers arrived on the scene, deep learning hasn't been the same:
• Machine learning can generate text essentially indistinguishable from that created by humans.
• We've reached new state-of-the-art performance in many NLP tasks, such as machine translation, question-answering, entailment, and named entity recognition.
• We've created multi-modal (text and image) models that can generate amazing art using only a text prompt.
• We've solved a longstanding problem in molecular biology known as "protein structure prediction".

In this course, you will learn very practical skills for applying transformers and, if you want, the detailed theory behind how transformers and attention work. This is different from most other resources, which only cover the former.

The course is split into three major parts: Using Transformers, Fine-Tuning Transformers, and Transformers In-Depth.

Part 1: Using Transformers. In this section, you will learn how to use transformers that were trained for you. Training them costs millions of dollars, so it's not something you want to try by yourself! We'll see how these prebuilt models can already be used for a wide array of tasks, including:
• text classification (e.g. spam detection, sentiment analysis, document categorization)
• named entity recognition
• text summarization
• machine translation
Explore related content to expand your skills beyond this learning path.
Enroll in this path to track your progress and stay motivated.