From the course: Hands-On AI: Building LLM-Powered Apps
Prompts and prompt templates - Python Tutorial
- [Instructor] Let's continue our discussion about large language models. We now know they have emergent abilities and that we can instruct them to perform many different tasks. The way we instruct them is by using prompts. A prompt is a user-defined input to which the LLM is meant to respond. In the case of OpenAI models, we will be using a specific markup language called Chat Markup Language, or ChatML. In ChatML, we separate the prompt into a list of roles and contents. There are three types of roles: system, assistant, and user. System is the system prompt, where we provide instructions to the whole system. This is usually set at the beginning of a session and not changed. The assistant role is for the language model's responses. And the user role is for user inputs. So in a conversation, we can expect the list of prompts to go from system to user, then the language model responds as assistant, then the user replies back as user, taking turns in the discussion. To…
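The role/content structure described above can be sketched as a plain Python list of message dictionaries, the shape the OpenAI chat API expects. The message texts here are illustrative assumptions, not from the course:

```python
# A minimal sketch of the ChatML-style role/content structure.
# The system message is set once at the start of the session;
# after that, user and assistant messages alternate turns.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize this article in one sentence."},
]

# The model's reply comes back with the "assistant" role; appending it
# (and the next user turn) keeps the conversation alternating.
messages.append({"role": "assistant", "content": "Here is a one-sentence summary."})
messages.append({"role": "user", "content": "Now make it even shorter."})

roles = [m["role"] for m in messages]
print(roles)  # system first, then alternating user/assistant turns
```

In practice this `messages` list is what you would pass to the chat completion endpoint, with each new turn appended to the end.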
Contents
- Language models and tokenization (4m 53s)
- Large language model capabilities (1m 48s)
- Challenge: Introduction to Chainlit (2m 28s)
- Solution: Introduction to Chainlit (1m 18s)
- Prompts and prompt templates (3m)
- Obtaining an OpenAI token (1m 20s)
- Challenge: Adding an LLM to the Chainlit app (1m 31s)
- Solution: Adding an LLM to the Chainlit app (3m 20s)
- Large language model limitations (3m 43s)