A lightweight, self-hosted EPUB reader with a built-in AI chat window. Read any EPUB one chapter at a time, and ask a local LLM (Ollama) or the OpenAI API questions about the current chapter, with no copy/paste required.
Inspired by Karpathy's idea of "reading books with LLMs" (see the karpathy/reader3 repository), this project is intentionally simple, hackable, and easy to modify with the help of any LLM.
Below is how the reader appears with the integrated chat window.

Response when the question is answered in the chapter:

Response when the question is not covered by the chapter:

- Clean, chapter-based EPUB reader
- Integrated chat window on the right
- Supports local LLMs (Ollama) or OpenAI GPT models
- Automatically sends the current chapter text as context
- LLM is restricted to chapter content only
- `.env` support with `python-dotenv`
- Simple library system (`*_data` folders)
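The chapter-as-context behavior can be sketched roughly as follows. The helper function and the exact system prompt are assumptions for illustration, not the project's actual code; the payload shape matches Ollama's `/api/chat` endpoint, and the OpenAI chat API accepts the same `messages` format.

```python
def build_chat_payload(chapter_text: str, question: str, model: str = "llama3"):
    """Build a chat request that pins the model to the current chapter.

    Hypothetical helper: the real project may structure this differently.
    """
    system_prompt = (
        "Answer ONLY using the chapter text provided below. "
        "If the answer is not in the chapter, say that you don't know.\n\n"
        f"CHAPTER:\n{chapter_text}"
    )
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
        "stream": False,  # return one complete response instead of a token stream
    }

payload = build_chat_payload(
    "Jonathan Harker travels to Transylvania.",
    "Where does Harker go?",
)
```

Because the whole chapter rides along in the system message, the model can be told to refuse questions the chapter does not answer.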
Install the required dependencies:

```
pip install -r requirements.txt
```

Download an EPUB (e.g., Dracula from Project Gutenberg), place it in the project folder, and process it:

```
python reader3.py dracula.epub
```

This creates a folder:

```
dracula_data/
├── book.pkl
└── images/
```

Each `*_data` folder becomes a book in your library.
Start the server:

```
python server.py
```

Then visit http://localhost:8123/. You'll see your local library and can open any book.
To use OpenAI models, create a `.env` file in your project folder with the line below, defining your OpenAI API key:

```
OPENAI_API_KEY=your_key_here
```

If omitted, only local LLM models (via Ollama) are available.
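The key check can be sketched as below. `load_dotenv()` is the real `python-dotenv` entry point; the `available_backends` helper and the backend names are illustrative assumptions:

```python
import os

try:
    from dotenv import load_dotenv
    load_dotenv()  # reads .env from the current directory, if present
except ImportError:
    pass  # python-dotenv not installed: fall back to plain environment variables

def available_backends() -> list[str]:
    """Local Ollama is always offered; OpenAI only when a key is configured."""
    backends = ["ollama"]
    if os.getenv("OPENAI_API_KEY"):
        backends.append("openai")
    return backends
```

This keeps the OpenAI option hidden in the chat UI unless a key is actually present.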