Learn to run and interact with LLMs locally using powerful tools. Explore LLM deployment, RAG, and ethical considerations.
This course provides a comprehensive introduction to running Large Language Models (LLMs) locally. You'll learn to set up a local environment using powerful tools to run different LLMs and interact with them via web interfaces and APIs. The course covers LLMOps, production workflows, performance evaluation, and responsible AI deployment. You'll gain hands-on experience with tools like Mozilla llamafile and Hugging Face Candle, and explore techniques like Retrieval Augmented Generation (RAG). The curriculum also addresses ethical considerations and strategies for responsible generative AI implementation.
Instructors:
English
What you'll learn
Set up and manage local environments for running Large Language Models (LLMs)
Use tools like Mozilla llamafile and Hugging Face Candle for LLM deployment
Implement Retrieval Augmented Generation (RAG) techniques to improve LLM context and performance
Evaluate real-world performance of LLMs using methods like Elo ratings
Explore production LLM workflows using tools such as SkyPilot, LoRAX, and Ludwig
Understand and apply strategies for responsible generative AI deployment
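One of the evaluation methods listed above is Elo ratings, which rank models by pairwise head-to-head comparisons rather than static benchmarks. As a minimal sketch (not the course's exact method), the standard Elo update rule looks like this; the function name and K-factor of 32 are illustrative choices:

```python
def elo_update(rating_a: float, rating_b: float, score_a: float, k: float = 32.0):
    """Update two Elo ratings after one head-to-head comparison.

    score_a is 1.0 if model A's answer won, 0.0 if it lost, 0.5 for a tie.
    """
    # Expected score for A from the current rating gap (logistic curve, base 10, scale 400)
    expected_a = 1.0 / (1.0 + 10.0 ** ((rating_b - rating_a) / 400.0))
    new_a = rating_a + k * (score_a - expected_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - expected_a))
    return new_a, new_b

# Two models start at 1000; model A wins one pairwise comparison
a, b = elo_update(1000.0, 1000.0, 1.0)
print(a, b)  # → 1016.0 984.0
```

Repeated over many human- or LLM-judged matchups, these updates converge toward a leaderboard reflecting real-world preference rather than benchmark scores.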
Skills you'll gain
This course includes:
2.5 hours of pre-recorded video
7 quizzes, 2 assignments
Access on Mobile, Tablet, Desktop
Full-time access
Shareable certificate
Closed captions
Get a Completion Certificate
Share your certificate with prospective employers and your professional network on LinkedIn.
Created by
Provided by
Top companies offer this course to their employees
Top companies provide this course to enhance their employees' skills, helping them handle complex projects and drive organizational success.
There are 3 modules in this course
This course offers a comprehensive exploration of running Large Language Models (LLMs) locally. Students will learn to set up and manage local environments for LLMs using advanced tools and techniques. The curriculum covers three main areas: Local LLMOps, Production Workflows and Performance of LLMs, and Responsible Generative AI. Participants will gain hands-on experience with tools like Mozilla llamafile and Hugging Face Candle, and learn techniques such as Retrieval Augmented Generation (RAG). The course also addresses ethical considerations and strategies for responsible AI deployment, providing a well-rounded understanding of local LLM implementation and management.
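To make the RAG technique mentioned above concrete: the pattern retrieves documents relevant to a question and prepends them to the prompt, so a local LLM can answer from supplied context instead of its training data alone. The toy sketch below (not course material) uses bag-of-words cosine similarity in place of real embeddings; all function names and the sample documents are illustrative:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = Counter(query.lower().split())
    ranked = sorted(docs, key=lambda d: cosine(q, Counter(d.lower().split())), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context to the question before sending it to a local LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Use only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "llamafile bundles model weights and a runtime into a single executable.",
    "Candle is a minimalist machine learning framework for Rust.",
    "Elo ratings rank models from pairwise comparisons.",
]
print(build_prompt("What does llamafile bundle?", docs))
```

In the course's production setting, the retrieval step would query a vector store of embeddings and the final prompt would go to a locally served model; the control flow stays the same.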
Local LLMOps
Module 1 · 9 Hours to complete
Production Workflows and Performance of LLMs
Module 2 · 11 Hours to complete
Responsible Generative AI
Module 3 · 2 Hours to complete
Fee Structure
Payment options
Financial Aid
Instructors
Executive in Residence and Founder of Pragmatic AI Labs at Duke University
Noah Gift is the founder of Pragmatic AI Labs and serves as an Executive in Residence at Duke University, where he lectures in the Master of Interdisciplinary Data Science (MIDS) program. He specializes in designing and teaching graduate-level courses on machine learning, MLOps, artificial intelligence, and data science, while also consulting on machine learning and cloud architecture for students and faculty. A recognized expert in the field, Gift is a Python Software Foundation Fellow and an AWS Machine Learning Hero, holding multiple AWS certifications, including AWS Certified Solutions Architect and AWS Certified Machine Learning Specialist. He has authored several influential books, such as Practical MLOps, Python for DevOps, and Pragmatic AI, and has published over 100 technical articles across various platforms, including Forbes and O'Reilly. His extensive industry experience includes roles as CTO and Chief Data Scientist for notable companies like Disney Feature Animation, Sony Imageworks, and AT&T, contributing to major films like Avatar and Spider-Man 3. Gift's work has generated millions in revenue through product development on a global scale. He actively consults startups on machine learning and cloud architecture while leading initiatives to enhance data science education.
Adjunct Assistant Professor at Duke University
Dr. Alfredo Deza is an Adjunct Assistant Professor in the Pratt School of Engineering at Duke University, where he teaches courses on machine learning, programming, and data engineering. He has been involved in academia for several years, focusing on innovative teaching methods and practical applications of technology. Dr. Deza co-authored the book Practical MLOps and has published several other works related to Python and machine learning. His teaching includes courses such as Python Bootcamp and advanced data engineering topics, and he actively develops online courses available on platforms like Coursera. In addition to his academic role, Dr. Deza works in developer relations at Microsoft, leveraging his extensive experience in software engineering and cloud computing to enhance educational content and support for students and faculty. He collaborates with various universities worldwide, including Georgia Tech and Carnegie Mellon University, to promote knowledge sharing in the field of technology and data science.
Testimonials
Testimonials and success stories are a testament to the quality of this program and its impact on your career and learning journey. Be the first to help others make an informed decision by sharing your review of the course.
Frequently asked questions
Below are some of the most commonly asked questions about this course. We aim to provide clear and concise answers to help you better understand the course content, structure, and any other relevant information. If you have any additional questions or if your question is not listed here, please don't hesitate to reach out to our support team for further assistance.