Generative AI and LLMs on AWS

Master deploying generative AI models like GPT on AWS through hands-on labs. Learn architecture selection, cost optimization, monitoring, CI/CD pipelines, and compliance best practices. Gain skills in operationalizing LLMs using Amazon Bedrock, auto-scaling, spot instances, and differential privacy techniques. Ideal for ML engineers, data scientists, and technical leaders.

Course Highlights:

  • Choose optimal LLM architectures for your applications
  • Optimize cost, performance, and scalability with auto-scaling and orchestration
  • Monitor LLM metrics and continuously improve model quality
  • Build secure CI/CD pipelines to train, deploy, and update LLMs
  • Ensure regulatory compliance via differential privacy and controlled rollouts
  • Gain real-world, hands-on training for production-ready generative AI

Unlock the power of large language models on AWS. Master operationalization using cloud-native services through this comprehensive, practical training program.

