Curated learning path for Machine Translation. Build practical skills through expert-selected courses.
Basic probability concepts
Python fundamentals; string manipulation
Follow these courses in order to complete the learning path. Click on any course to enroll.
This course focuses on the core concepts behind neural language models and machine translation, covering RNNs, attention, and transformers. Students learn to build, fine-tune, and evaluate neural models for language understanding and multilingual translation.
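As a rough illustration of the attention mechanism referenced above (not course material; the shapes, names, and random data below are purely for demonstration), scaled dot-product attention can be sketched in a few lines of NumPy:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # query/key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over key positions
    return weights @ V, weights

# Toy example: 3 decoder positions attending over 4 encoder positions, d_k = 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
context, attn = scaled_dot_product_attention(Q, K, V)
print(context.shape, attn.shape)   # (3, 8) (3, 4)
```

Each row of attn sums to one, and the corresponding row of context is the attention-weighted mix of the value vectors; this is the building block that both RNN-with-attention and transformer translation models rely on.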
More and more evidence has demonstrated that graph representation learning, especially graph neural networks (GNNs), has tremendously facilitated computational tasks on graphs, including both node-focused and graph-focused tasks. The revolutionary advances brought by GNNs have also immensely contributed to the depth and breadth of the adoption of graph representation learning in real-world applications. For classical application domains of graph representation learning such as recommender systems and social network analysis, GNNs deliver state-of-the-art performance and push these domains into new frontiers. Meanwhile, new application domains of GNNs continue to emerge, such as combinatorial optimization, physics, and healthcare. These wide applications of GNNs enable diverse contributions and perspectives from disparate disciplines and make this research field truly interdisciplinary.

In this course, I will start with basic graph data representation and concepts such as node data, edge types, the adjacency matrix, and the Laplacian matrix. Next, we will cover the broad kinds of graph learning tasks and discuss the two basic operations needed in a GNN: filtering and pooling. We will then discuss different types of graph filtering (i.e., neighborhood aggregation) methods, including graph convolutional networks, graph attention networks, confidence GCNs, syntactic GCNs, and the general message passing neural network framework. Next, we will cover the three main types of graph pooling methods: topology-based pooling, global pooling, and hierarchical pooling, and within each type we will discuss popular methods. For example, in topology-based pooling we will mainly talk about Normalized Cut and Graclus; in global pooling, Set2Set and SortPool; and in hierarchical pooling, DiffPool, gPool, and SAGPool. Next, we will talk about three unsupervised graph neural network architectures: Graph
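To make the neighborhood-aggregation idea concrete, here is a minimal, illustrative NumPy sketch of one graph convolution step in the GCN style (not taken from the course; the toy graph and dimensions are arbitrary):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One neighborhood-aggregation step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy graph: 4 nodes, 3-dimensional input features, 2 output channels.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
rng = np.random.default_rng(1)
H = rng.normal(size=(4, 3))                     # node feature matrix
W = rng.normal(size=(3, 2))                     # weight matrix (learned in practice)
print(gcn_layer(A, H, W).shape)                 # (4, 2): new features per node
```

Stacking such layers lets each node's representation incorporate information from progressively larger neighborhoods; attention-based variants replace the fixed normalization with learned weights over neighbors.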
Learn the theory of Seq2Seq in only 2 hours! A straight-to-the-point course for those of you who don't have a lot of time. Embark on an academic adventure with our specialized online course, meticulously designed to illuminate the theoretical aspects of Seq2Seq (Sequence to Sequence) models within the realms of Deep Learning and Natural Language Processing (NLP).

What This Course Offers:
Exclusive Focus on Seq2Seq Model Theories: Our course curriculum is devoted to exploring the intricacies and theoretical foundations of Seq2Seq models. Delve into the principles and mechanics that make these models a cornerstone in NLP and Deep Learning.
In-Depth Conceptual Insights: We take you through a comprehensive journey, dissecting the core concepts, architectures, and training of Seq2Seq models. Our focus is on fostering a deep understanding of these complex theories.
Theory-Centric Approach: Emphasizing theoretical knowledge, this course intentionally steers away from practical coding exercises. Instead, we concentrate on building a robust conceptual framework around Seq2Seq models.
Ideal for Theoretical Enthusiasts: This course is perfectly suited for students, educators, researchers, and anyone with a keen interest in the theoretical aspects of Deep Learning and NLP, specifically in the context of Seq2Seq models.

Join us to master the theoretical nuances of Seq2Seq models in Deep Learning and NLP. Enroll now for an enlightening journey into the heart of these transformative technologies! And last but not least, you will get a great series of prizes providing extra case studies in Artificial Intelligence made by ChatGPT.

Can't wait to see you inside the class,
Kirill & Hadelin
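Although the course above is deliberately theory-only, the core Seq2Seq idea of modeling P(y | x) as a product of per-token conditionals and decoding one token at a time can be illustrated with a toy, hand-written "decoder" (everything below is a made-up stand-in, not a trained model):

```python
import numpy as np

VOCAB = ["<s>", "</s>", "le", "chat", "noir"]

def toy_decoder_step(encoded_source, prefix):
    # A real decoder conditions on the source encoding and the generated prefix;
    # this toy stand-in simply emits the source tokens in order, then </s>.
    t = len(prefix) - 1                          # tokens produced so far
    probs = np.full(len(VOCAB), 1e-3)
    target = encoded_source[t] if t < len(encoded_source) else "</s>"
    probs[VOCAB.index(target)] = 1.0
    return probs / probs.sum()

def greedy_decode(encoded_source, max_len=10):
    # Seq2Seq decoding factorizes P(y | x) = prod_t P(y_t | y_<t, x);
    # greedy search picks the argmax token at each step until </s> appears.
    prefix = ["<s>"]
    while len(prefix) < max_len:
        probs = toy_decoder_step(encoded_source, prefix)
        next_token = VOCAB[int(np.argmax(probs))]
        prefix.append(next_token)
        if next_token == "</s>":
            break
    return prefix[1:]

print(greedy_decode(["le", "chat", "noir"]))     # ['le', 'chat', 'noir', '</s>']
```

A real system would replace toy_decoder_step with a trained RNN or transformer decoder and typically use beam search rather than pure greedy decoding.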
Sentiment analysis and machine translation models are used by millions of people every single day. These deep learning models (most notably transformers) power different industries today. With the creation of much more efficient deep learning models since the early 2010s, we have seen great improvements in the state of the art in sentiment analysis and machine translation.

In this course, we shall take you on an amazing journey in which you'll master different concepts with a step-by-step approach. We start by understanding how to process text in the context of natural language processing, then dive into building our own models and deploying them to the cloud while observing best practices. We are going to be using TensorFlow 2 (the world's most popular library for deep learning, built by Google) and Hugging Face.

You will learn:
The basics of TensorFlow (tensors, model building, training, and evaluation).
Deep learning algorithms like recurrent neural networks, attention models, transformers, and convolutional neural networks.
Sentiment analysis with RNNs, transformers, and Hugging Face Transformers (DeBERTa).
Transfer learning with Word2Vec and modern transformers (GPT, BERT, ULMFiT, DeBERTa, T5...).
Machine translation with RNNs, attention, transformers, and Hugging Face Transformers (T5).
Model deployment (ONNX format, quantization, FastAPI, Heroku Cloud).

If you are willing to move a step further in your career, this course is designed for you, and we are super excited to help you achieve your goals! This course is offered to you by Neuralearn. And just like every other course by Neuralearn, we lay much em
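As a small taste of the Hugging Face-based translation workflow the course describes, a pretrained T5 checkpoint can be used roughly as follows (a sketch only, assuming the transformers, torch, and sentencepiece packages are installed; the "t5-small" checkpoint and the prompt wording are illustrative choices, not the course's exact setup):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load a small pretrained text-to-text model and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# T5 frames translation as text-to-text generation, so the task is given as a prefix.
text = "translate English to French: The weather is nice today."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Deployment steps such as ONNX export, quantization, and serving behind FastAPI would then wrap a model like this for production use.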
Explore related content to expand your skills beyond this learning path.
Enroll in this path to track your progress and stay motivated.