```mermaid
---
config:
layout: dagre
look: neo
---
flowchart LR
subgraph API_Container["API Container"]
ui["Demo UI<br>(heartbot.angelajt.com)<br>client.py"]
api["<b>API and docs</b><br>(heartbot.angelajt.com/docs)<br>api.py"]
heartbot["Heartbot library<br>heartbot.py"]
startup["Startup Script<br>run.sh"]
server["main.py"]
end
subgraph Caddy_Container["Caddy Container"]
acme["ACME Client"]
proxy["HTTPS Reverse Proxy"]
end
subgraph Host_VM["Host VM at GCP"]
API_Container
Caddy_Container
end
browser["Demo App <br>in Browser"] --> proxy
phoneapp["Phone App"] --> proxy
proxy --> api
ui --> api
api --> heartbot
heartbot --> ca["Conversational Agents"]
startup --> server
server --> api
browser@{shape: circle}
style API_Container fill:#00C853
style Caddy_Container fill:#00C853
style ca fill:#BBDEFB
```
This is a demo UI for testing purposes, served at heartbot.angelajt.com. It lets us interact with the Heartbot API from a web browser.
- Communicates with api.py through JSON messages (see the sketch below)
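To make the JSON exchange concrete, here is a minimal sketch of what a client-side call could look like. The `/chat` endpoint name and payload fields are assumptions for illustration, not the actual Heartbot API contract.

```python
# A hedged sketch of how client.py might send JSON to the API.
# The /chat route and payload fields are assumptions, not the real contract.
import requests

API_BASE = "https://heartbot.angelajt.com"  # reached through the Caddy proxy

def send_message(session_id: str, text: str) -> dict:
    """POST one user message as JSON and return the decoded JSON reply."""
    resp = requests.post(
        f"{API_BASE}/chat",  # hypothetical endpoint name
        json={"session_id": session_id, "message": text},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(send_message("demo-session", "Hello, Heartbot!"))
```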
The API layer, built with FastAPI, serves the primary endpoints used by the frontend (the phone app). It also serves interactive documentation (Swagger UI), available at heartbot.angelajt.com/docs, and acts as the bridge between the user interface and the Heartbot core library.
The API layer will be accessed directly by the production phone app.
- Calls heartbot.py's `chat_with_module` function (sketched below)
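As an illustration of that bridging role, here is a hedged sketch of the FastAPI layer. The `/chat` route, the request model, and the `chat_with_module` signature are assumptions; only the library name and the /docs behavior come from the description above.

```python
# api.py -- a hedged sketch of the FastAPI layer. The route, request model,
# and chat_with_module signature are illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

import heartbot  # the core library described below

app = FastAPI(title="Heartbot API")  # FastAPI serves Swagger UI at /docs

class ChatRequest(BaseModel):
    session_id: str
    message: str

@app.post("/chat")
def chat(req: ChatRequest) -> dict:
    # Delegate to the core library; credentials never leave the server.
    reply = heartbot.chat_with_module(req.session_id, req.message)
    return {"reply": reply}
```

FastAPI generates the interactive Swagger UI at /docs automatically, which is how the documentation at heartbot.angelajt.com/docs is served without extra code.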
This is the core of the Heartbot application.
- Handles Conversational Agents credential secrets, for security (we don't want the API key on the client side / participants' phones)
- Accesses the different module agents
- Takes user input and sends it to Conversational Agents
- Receives responses from Conversational Agents
- Handles individual chat sessions
- Parses `[image]` tags (so that the app can show images in messages)
- Parses `[end]` tags (so that we know when the user finishes a module); both are sketched below
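As an illustration of the tag handling, here is a minimal parsing sketch. The exact tag syntax is an assumption: this sketch supposes `[image]<url>[/image]` wraps an image URL and a bare `[end]` marks completion.

```python
# A hedged sketch of the [image]/[end] tag parsing in heartbot.py.
# The tag syntax assumed here ([image]url[/image], bare [end]) is a guess.
import re

IMAGE_TAG = re.compile(r"\[image\](.*?)\[/image\]", re.DOTALL)

def parse_response(raw: str) -> dict:
    """Split a raw agent response into text, image URLs, and an end flag."""
    images = IMAGE_TAG.findall(raw)
    text = IMAGE_TAG.sub("", raw)
    ended = "[end]" in text
    text = text.replace("[end]", "").strip()
    return {"text": text, "images": images, "module_ended": ended}

assert parse_response("Done! [image]https://x.test/a.png[/image] [end]") == {
    "text": "Done!",
    "images": ["https://x.test/a.png"],
    "module_ended": True,
}
```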
The library communicates with Conversational Agents through a secure HTTPS connection.
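Assuming "Conversational Agents" refers to Google's Dialogflow CX-based product and its official Python client, one turn of that HTTPS exchange might look roughly like this; the project, location, and agent IDs are placeholders.

```python
# A hedged sketch of one Conversational Agents round trip, assuming the
# Dialogflow CX Python client (google-cloud-dialogflow-cx). The project,
# location, and agent identifiers below are placeholders, not real values.
from google.cloud import dialogflowcx_v3

PROJECT, LOCATION, AGENT = "my-gcp-project", "global", "my-agent-id"

def ask_agent(session_id: str, text: str) -> str:
    """Send one user turn over HTTPS and return the agent's text reply."""
    client = dialogflowcx_v3.SessionsClient()  # credentials stay server-side
    session = client.session_path(PROJECT, LOCATION, AGENT, session_id)
    query_input = dialogflowcx_v3.QueryInput(
        text=dialogflowcx_v3.TextInput(text=text),
        language_code="en",
    )
    response = client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    parts = []
    for msg in response.query_result.response_messages:
        parts.extend(msg.text.text)  # each message may carry text lines
    return "\n".join(parts)
```

Keeping this call inside heartbot.py is what keeps the credentials off participants' phones: the client only ever sees the JSON replies.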
This is the Docker container startup script. It is responsible for setting up the runtime environment, pulling necessary environment variables, and launching the FastAPI server inside the container.
This script starts up the FastAPI application defined in api.py. api.py in
turn calls heartbot.py, as mentioned above.
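A minimal sketch of what main.py might contain, assuming uvicorn as the ASGI server (the actual server choice is not stated here):

```python
# main.py -- a minimal sketch, assuming uvicorn as the ASGI server.
import uvicorn

if __name__ == "__main__":
    # "api:app" points at the FastAPI instance defined in api.py; the host
    # and port are placeholders for whatever the container exposes to Caddy.
    uvicorn.run("api:app", host="0.0.0.0", port=8000)
```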
Conversational Agents provides only the LLM functionality that heartbot.py
depends on. The Heartbot library itself is in heartbot.py.
This is a Docker container that packages and isolates the application
components (client.py, api.py, heartbot.py, main.py, and run.sh).
Encapsulating everything in a Docker container enables reproducible
deployments.
A separate Docker container running Caddy, a web server that handles HTTPS termination and reverse proxying to the API container. It ensures secure communication between clients and the API. It also fetches and manages TLS certificates from Let's Encrypt automatically.
A virtual machine running on Google Cloud Platform (GCP). It hosts both Docker containers and serves as the infrastructure base for running Heartbot's application stack.