8,173 Reviews

Rapid Application Development Using Large Language Models

$500 USD
Course Code NV-RAD-LLM
Duration 1 day
Available Formats Classroom

Recent advancements in both the techniques and accessibility of large language models (LLMs) have opened up unprecedented opportunities for businesses to streamline their operations, decrease expenses, and increase productivity at scale. Enterprises can also use LLM-powered apps to provide innovative and improved services to clients or strengthen customer relationships. For example, enterprises could provide customer support via AI virtual assistants or use sentiment analysis apps to extract valuable customer insights.

In this course, you’ll gain a strong understanding and practical knowledge of LLM application development by exploring the open-source ecosystem, including pretrained LLMs, so you can quickly get started building LLM-based applications.

Skills Gained

  • Find, pull in, and experiment with pretrained models from the HuggingFace model repository using the associated transformers API
  • Use encoder models for tasks like semantic analysis, embedding, question-answering, and zero-shot classification
  • Use decoder models to generate sequences like code, unbounded answers, and conversations
  • Use state management and composition techniques to guide LLMs for safe, effective, and accurate conversation
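The last skill above, guiding a decoder model through state management and prompt composition, can be illustrated with a short sketch in plain Python. The `ChatState` class and its field names are hypothetical and not part of any course materials or library; it simply shows the idea of keeping a bounded conversation history and composing it, together with a system prompt, into the text a decoder model would receive:

```python
from dataclasses import dataclass, field

@dataclass
class ChatState:
    """Minimal conversation state: a system prompt plus a rolling turn history."""
    system_prompt: str
    max_turns: int = 4  # keep only the most recent turns to bound prompt length
    turns: list = field(default_factory=list)

    def add_turn(self, role: str, text: str) -> None:
        """Record one turn and drop the oldest turns beyond the budget."""
        self.turns.append((role, text))
        self.turns = self.turns[-self.max_turns:]

    def compose_prompt(self, user_message: str) -> str:
        """Compose the full text a decoder model would be asked to continue."""
        lines = [f"System: {self.system_prompt}"]
        for role, text in self.turns:
            lines.append(f"{role}: {text}")
        lines.append(f"User: {user_message}")
        lines.append("Assistant:")
        return "\n".join(lines)

state = ChatState(system_prompt="You are a concise support assistant.")
state.add_turn("User", "My order is late.")
state.add_turn("Assistant", "I can help. What is the order number?")
prompt = state.compose_prompt("It is #1234.")
print(prompt)
```

In a real application, `prompt` would be handed to a generation API (for example, a HuggingFace text-generation pipeline), and the model's reply would be appended back into the state; the system prompt and the truncation budget are where safety and accuracy constraints are enforced.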

Prerequisites

  • Introductory experience with deep learning; comfort with PyTorch and transfer learning is preferred. Content covered by DLI’s Getting Started with Deep Learning or Fundamentals of Deep Learning courses, or similar experience, is sufficient.
  • Intermediate Python experience, including object-oriented programming and libraries. Content covered by Python Tutorial (w3schools.com) or similar experience is sufficient.

Course Details

Course Outline

  • Introduction
  • From Deep Learning to Large Language Models
  • Specialized Encoder Models
  • Encoder-Decoder Models for Seq2Seq
  • Decoder Models for Text Generation
  • Stateful LLMs
  • Assessment and Q&A