Generative AI with Large Language Models
Course Overview
In this intermediate-level course, you will dive deep into Generative AI and the use of Large Language Models (LLMs). You will explore the entire lifecycle of LLM-based AI systems—from data gathering and model selection to training, fine-tuning, performance evaluation, and deployment. This course is designed to help developers gain the practical skills necessary to make informed decisions about Generative AI and quickly build working prototypes.
Throughout the course, you will also delve into the transformer architecture that powers LLMs and the ways in which they are trained and adapted to meet specific business needs. You will learn how to use empirical scaling laws to optimize a model's performance while managing dataset size, compute budgets, and inference requirements. By the end of the course, you’ll be equipped with the tools and techniques to deploy Generative AI systems in real-world applications, with insights into business applications and industry practices.
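To give a flavor of the scaling-law reasoning the course covers, here is a minimal sketch of the widely cited "compute-optimal" rules of thumb: roughly 20 training tokens per model parameter, and roughly 6 FLOPs per parameter per token. The function names and the exact ratios are illustrative assumptions for this sketch, not material taken from the course itself.

```python
# Illustrative sketch of empirical LLM scaling heuristics.
# Assumptions (not from the course): ~20 tokens per parameter for a
# compute-optimal training run, and ~6 FLOPs per parameter per token.

def compute_optimal_tokens(n_params: float, tokens_per_param: float = 20.0) -> float:
    """Approximate number of training tokens for a compute-optimal run."""
    return n_params * tokens_per_param

def training_flops(n_params: float, n_tokens: float) -> float:
    """Common back-of-envelope estimate of total training compute."""
    return 6.0 * n_params * n_tokens

if __name__ == "__main__":
    n = 7e9  # a hypothetical 7B-parameter model
    d = compute_optimal_tokens(n)  # roughly 140B tokens under the 20x heuristic
    print(f"tokens: {d:.2e}, training FLOPs: {training_flops(n, d):.2e}")
```

Estimates like these let you trade off model size against dataset size under a fixed compute budget, which is the kind of decision the course's scaling-law material addresses.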
What You'll Learn
- Foundational Knowledge: Learn how Generative AI works, including the lifecycle of LLMs and the key steps from data gathering to deployment.
- Deep Dive into Transformer Architecture: Understand the core architecture of LLMs and how fine-tuning enhances their adaptability to specific use cases.
- Practical AI Application: Learn how to implement state-of-the-art training, tuning, and deployment methods to optimize performance within project constraints.
- Real-World Business Use Cases: Explore the challenges and opportunities that Generative AI brings to businesses through insights from industry experts.

Price: 49 USD

Platform: Coursera (https://www.coursera.org)