From the course: Introduction to Large Language Models (LLMs) and Prompt Engineering by Pearson
Batch prompting + prompt chaining
Let's move on to two more techniques: batch prompting and prompt chaining. Both of these prompting techniques involve either using multiple LLMs or passing multiple data points through a single LLM. Starting with batch prompting, the idea is quite simple. In standard prompting, you have an input and an output, and maybe a few few-shot examples, like on the left, where the screenshot calls it k-shot, or few-shot, learning; k is just the number of examples you have. And you have one sample to perform inference on. This is exactly like our subjective-or-objective example; that would be standard prompting with few-shot. Batch prompting would give those same few-shot examples, but also give multiple samples at once to predict. So instead of giving one question and getting one answer, you would give two questions and get two answers. You would still include those few-shot examples…
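To make this concrete, here is a minimal sketch of how a batch prompt might be assembled for the subjective-or-objective task. The labels, example sentences, and prompt layout are illustrative assumptions, not from the course: the key idea is simply that the k-shot examples appear once, followed by several samples to classify in a single prompt.

```python
# Illustrative few-shot (k-shot) examples for a subjective/objective
# classifier; k = 2 here. These sentences are made up for the sketch.
FEW_SHOT = [
    ("The movie was a masterpiece.", "subjective"),
    ("The meeting starts at 3 p.m.", "objective"),
]

def build_batch_prompt(samples: list[str]) -> str:
    """Combine the k-shot examples with a batch of samples to classify."""
    lines = ["Classify each sentence as subjective or objective.", ""]
    # The few-shot examples are written once...
    for text, label in FEW_SHOT:
        lines.append(f"Sentence: {text}")
        lines.append(f"Label: {label}")
        lines.append("")
    # ...then multiple inference samples are appended to the same prompt.
    for i, text in enumerate(samples, start=1):
        lines.append(f"Sentence {i}: {text}")
    lines.append("Answer with one label per sentence, in order.")
    return "\n".join(lines)

prompt = build_batch_prompt([
    "I think the soup was too salty.",
    "Water boils at 100 degrees Celsius.",
])
print(prompt)
```

Sending this single prompt to an LLM would, ideally, return two labels in one response, which is the point of batching: the few-shot examples are paid for once in the context window but amortized over every sample in the batch.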