From the course: Advanced Python Projects: Build AI Applications

Generate chat responses using GPT-3.5 and LangChain

- [Instructor] Here we're defining a Pydantic model named ChatMessageSent using the BaseModel class. Next, we create a function named get_response that takes several parameters: the name of the file to load data from, the session ID for tracking conversation history, and the user query or question to be used in the conversation. Here we're using the model GPT-3.5 Turbo 16K, and then we're setting the temperature of the model to zero. The temperature parameter controls response randomness; the default of zero produces a very stable, less random response. Then we print the file name and update it by extracting the last part after the slash using this line over here. So overall, this function generates a response using a conversational model like GPT-3.5 Turbo. It takes a file name to load data, a session ID to track the conversation, a query or question, and optional parameters for…
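The pieces described above can be sketched roughly as follows. The names ChatMessageSent and get_response come from the video, but the model's field names, the return value, and the omitted LangChain call are assumptions for illustration, not the course's exact code.

```python
from pydantic import BaseModel


class ChatMessageSent(BaseModel):
    """Request schema for a chat message (field names are assumed)."""
    session_id: str
    user_input: str
    file_name: str


def get_response(
    file_name: str,
    session_id: str,
    query: str,
    model: str = "gpt-3.5-turbo-16k",
    temperature: float = 0.0,  # zero keeps responses stable / less random
) -> dict:
    """Sketch of the handler described in the video (LLM call omitted)."""
    print(file_name)
    # Keep only the last path segment, e.g. "uploads/report.pdf" -> "report.pdf"
    file_name = file_name.split("/")[-1]
    # In the real function, a LangChain conversational chain would be built
    # here with the chosen model and temperature, then invoked with `query`
    # while tracking history under `session_id`.
    return {
        "file_name": file_name,
        "session_id": session_id,
        "query": query,
        "model": model,
        "temperature": temperature,
    }
```

For example, `get_response("uploads/report.pdf", "abc123", "Summarize this file")` would strip the path down to `report.pdf` before loading the data.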
