Unlock the mysteries behind the models powering today’s most advanced AI applications. In this course, instructor Axel Sirota takes you beyond simply using large language models (LLMs) like BERT or GPT and into the mathematical foundations of generative AI. Explore the challenge of sentiment analysis with simple recurrent neural networks (RNNs), then progressively evolve your approach as you gain a deep understanding of attention mechanisms and transformer models. Through intuitive explanations and hands-on coding exercises, Axel shows why attention revolutionized natural language processing and how transformers reshaped the field by eliminating the need for RNNs altogether. Along the way, get tips on fine-tuning pretrained models, applying cutting-edge techniques like low-rank adaptation (LoRA), and leveraging your newly acquired skills to build smarter, more efficient models and innovate in the fast-evolving world of AI.