
Generative AI With NVIDIA

Fine-tune models with LoRA, build RAG pipelines, and implement diffusion models while managing the hardware constraints of distributed training and model sharding.

Updated Mar 16, 2026

About this course

Most people see generative AI as a sudden shift that happened overnight. It is actually a progression that began in 1956 and only recently found the right balance of math and hardware. We start with the history of artificial neurons and the early AI winters, when interest dried up because models could only solve simple patterns. You will learn how the field moved away from hand-written rules and toward statistical probability, eventually breaking language into discrete units called tokens, each mapped to a number. This foundation explains why models sometimes fail at basic logic: they are not reading words, they are predicting the next most likely token in a sequence.

Once you understand how words become vectors, we turn to the Transformer architecture, the engine behind tools like ChatGPT. Unlike older models that had to read one word at a time, the Transformer uses self-attention to weigh every word in a paragraph against every other word at once. We follow this logic into scaling laws and model families, comparing why an encoder model like BERT is better suited to understanding text while a decoder model like GPT is better suited to generating it.

The course then expands beyond text to multimodal systems like CLIP and Whisper, and to the diffusion models used to generate images. You will see how these systems add noise to data and then learn to reverse the process, building structure out of randomness.

Building these systems is as much an engineering challenge as a mathematical one. We cover instruction fine-tuning, which turns a raw autocomplete engine into a helpful assistant, and parameter-efficient techniques like LoRA that update models without a supercomputer. We also cover orchestration: composing complex applications with tools like LangChain, retrieval-augmented generation (RAG), and autonomous agents. By the end, you will be able to design retrieval systems grounded in your own data, fine-tune models for specific tasks, and manage the distributed workloads needed to train large models across multiple GPUs.
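To make the self-attention idea concrete, here is a minimal NumPy sketch (illustrative only, not part of the course materials): scaled dot-product attention over a toy sequence, where every token's vector is updated using a weighted mix of every other token's vector in a single matrix operation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a whole sequence at once."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Every token scores every token: no left-to-right reading order.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                       # 5 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8): one updated vector per token
```

The sizes here (5 tokens, 8 dimensions) are arbitrary; real models use thousands of dimensions and add multiple attention heads, but the core operation is this one matrix product over the full sequence.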
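The LoRA idea can also be sketched in a few lines. This is a simplified illustration under assumed sizes, not the course's implementation: the pretrained weight matrix stays frozen, and only two small low-rank factors are trained, whose product is added as a correction.

```python
import numpy as np

rng = np.random.default_rng(1)
d, r = 512, 8                          # model width and LoRA rank (assumed values)
W = rng.normal(size=(d, d))            # frozen pretrained weight
A = rng.normal(size=(r, d)) * 0.01     # trainable low-rank factor
B = np.zeros((d, r))                   # trainable; starts at zero so training begins at W

alpha = 16                             # LoRA scaling hyperparameter
W_eff = W + (alpha / r) * (B @ A)      # effective weight used during fine-tuning

# Trainable parameters shrink from d*d to 2*d*r:
print(d * d, 2 * d * r)  # 262144 8192 -> about 3% of the original
```

Because only `A` and `B` receive gradients, the optimizer state and checkpoints shrink by the same factor, which is why LoRA fine-tuning fits on a single GPU where full fine-tuning would not.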

Details

9 Units, 101 Lessons
3 Projects

Syllabus

9 Units • 101 Lessons • 3 Projects

Ways To Learn Included

Every lesson enables you to learn in a variety of ways.

Read • Flashcards • Quiz • Jam • Arcade
