Use the new GPT-4 API to build a ChatGPT-style chatbot over multiple large PDF files.
The tech stack includes LangChain, Chroma, TypeScript, OpenAI, and Next.js. LangChain is a framework that makes it easier to build scalable AI/LLM apps and chatbots. Chroma is a vectorstore for storing embeddings of your PDF text so that similar docs can be retrieved later.
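Under the hood, a vectorstore like Chroma ranks stored documents by the similarity of their embedding vectors to the query's embedding. A minimal TypeScript sketch of that idea (the `cosineSimilarity` and `topK` helpers below are illustrative, not part of this repo or the Chroma API):

```typescript
// Illustrative sketch: how a vectorstore ranks documents by embedding similarity.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Given a query embedding, return the indices of the k most similar docs.
function topK(query: number[], docs: number[][], k: number): number[] {
  return docs
    .map((vec, i) => ({ i, score: cosineSimilarity(query, vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((d) => d.i);
}
```

In practice the embeddings come from OpenAI's embedding API and Chroma does the similarity search for you; this sketch only shows the principle.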
Join the Discord if you have questions.
The visual guide of this repo and tutorial is in the visual guide folder.
If you run into errors, please review the troubleshooting section further down this page.
Prelude: Please make sure you have Node.js installed on your system, version 18 or greater.
- Install Docker Desktop for your platform.

- Run Mistral with Ollama:

ollama run mistral
- Clone the repo or download the ZIP
git clone [github https url]
- Install packages

First run `npm install yarn -g` to install yarn globally (if you haven't already).

Then run:

yarn install

After installation, you should see a node_modules folder.
- Set up your `.env` file

- Copy `.env.example` into `.env`. Your `.env` file should look like this:

COLLECTION_NAME=

- Choose a collection name where you'd like to store your embeddings in Chroma. This collection will later be used for queries and retrieval.
- In a new terminal window, run Chroma in the Docker container:
docker run -p 8000:8000 ghcr.io/chroma-core/chroma:0.3.21
This repo can load multiple PDF files.

- Inside the `backlog` folder, create another folder named after your collection. Add your PDF files, or folders containing PDF files, there.

- Run the script `npm run ingest` to 'ingest' and embed your docs. If you run into errors, see the troubleshooting section below.
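Before embedding, the ingest step splits each PDF's extracted text into overlapping chunks so each embedding covers a manageable span. A rough sketch of that splitting step (the `splitText` helper and its sizes are illustrative, not the repo's actual LangChain splitter):

```typescript
// Illustrative sketch of fixed-size chunking with overlap, similar in spirit
// to LangChain's text splitters (not this repo's exact implementation).
// Assumes overlap < chunkSize.
function splitText(text: string, chunkSize: number, overlap: number): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap; // step forward, keeping `overlap` chars of context
  }
  return chunks;
}
```

The overlap keeps sentences that straddle a chunk boundary retrievable from both neighboring chunks.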
Once you've verified that the embeddings and content have been successfully added to Chroma, you can run the app with `npm run dev` to launch the local dev environment, and then type a question in the chat interface.
General errors
- Make sure you're running the latest Node version. Run `node -v`.

- Try a different PDF or convert your PDF to text first. It's possible your PDF is corrupted, scanned, or requires OCR to convert to text.

- `console.log` the `env` variables and make sure they are exposed.

- Make sure you're using the same versions of LangChain and Chroma as this repo.

- Check that you've created an `.env` file that contains your valid (and working) API keys and collection name.

- If you change `modelName` in `OpenAI`, make sure you have API access to the appropriate model.

- Make sure you have enough OpenAI credits and a valid card on your billing account.

- Check that you don't have multiple OpenAI keys in your global environment. If you do, the local `.env` file from the project will be overridden by the system's `env` variable.

- Try hard-coding your API keys into the `process.env` variables if there are still issues.
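One way to catch missing or shadowed keys early is to validate the required variables at startup. A hypothetical helper (the variable name `OPENAI_API_KEY` and the `requireEnv` function are assumptions, not code from this repo):

```typescript
// Hypothetical startup check: report any required env variables
// that are missing or empty, so failures surface before the first request.
function requireEnv(
  names: string[],
  env: Record<string, string | undefined>
): string[] {
  return names.filter((name) => !env[name]);
}

const missing = requireEnv(["OPENAI_API_KEY", "COLLECTION_NAME"], process.env);
if (missing.length > 0) {
  console.error(`Missing env variables: ${missing.join(", ")}`);
}
```

Running a check like this in the ingest script and the Next.js server startup makes the "overridden by system env" failure mode much easier to spot.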
The frontend of this repo is inspired by langchain-chat-nextjs.