From the course: Build with AI: LLM-Powered Applications with Streamlit


Send user prompts to an LLM and display the response

- [Instructor] So far in this chapter, you've taken time to learn about LLMs and RAG, along with how to work with AI and APIs. Now, this is where the magic happens. You'll use what you've learned so far to integrate OpenAI's Chat API into your Streamlit applications. Let's get started. Let's work with the file 02_05b.py. Note that we are now in the Chapter_2 folder. Let's begin by importing your packages. You want to import streamlit as st, and then you'll want to import the OpenAI package. Let's do it as from openai import OpenAI. Note that there are a few ways to import this particular package, but this version should be compatible with this Python workspace in Codespaces. Remember, you'll want to ensure your API key is safely stored outside of your code so it does not get accidentally shared when you collaborate with others. For this course, you'll want to open this openai_key.txt file, and this will be in each of the…
