A basic example repository showing how to use ScrapeGraphAI with Ollama via Docker Compose.
Requirements:
- Docker Compose V2
- Task (3.43+) (optional, but recommended)
Start the `ollama` service:

```sh
task up
```
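If you do not have Task installed, the `up` task presumably wraps a plain Docker Compose call (this is an assumption about the repository's Taskfile, not verified against it):

```sh
# Assumed equivalent of `task up`: start only the ollama service in the background
docker compose up -d ollama
```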
> [!NOTE]
> The default model `llama3.2` will be pulled inside the `ollama` service. You can create your own `.env` file and change the `OLLAMA_MODEL` variable to the desired model (see the Model library).
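For example, a minimal `.env` might look like the following (the model name here is only an illustration, not the repository's default):

```env
# Hypothetical override: use a different model from the Ollama library
OLLAMA_MODEL=llama3.1
```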
> [!TIP]
> If you want to use GPU capabilities, follow the instructions in the ollama/ollama Docker repository. Afterward, you can create a `compose.override.yaml` to start the `ollama` service with GPU options.
Example for an NVIDIA GPU:
```yaml
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```
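Docker Compose merges `compose.override.yaml` with the base file automatically, so you can check that the GPU reservation is picked up by printing the resolved configuration:

```sh
# Print the merged configuration (base compose file + compose.override.yaml)
docker compose config
```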
Then run the `scrapegraph` module, passing a prompt (`p`) and a source URL (`s`):

```sh
task run p="Describe the page" s="https://www.youtube.com/watch?v=dQw4w9WgXcQ"
```
Response:

```json
{
  "content": "The webpage appears to be a YouTube video page for the song 'Never Gonna Give You Up' by Rick Astley. The page includes information such as the video's title, channel, and views. There is also a link to a biography and an audiobook introduction. The page has a playlist section where viewers can add videos, but there seems to be a technical issue with sharing functionality."
}
```
> [!IMPORTANT]
> If you are done and no longer need to use `ollama` and its volume, make sure to remove them:

```sh
task down -- -v --remove-orphans
# or
docker compose down -v --remove-orphans
```
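If you want to double-check that the model volume is gone, list the remaining Docker volumes (the exact volume name depends on the compose file):

```sh
# The ollama model volume should no longer appear here after `down -v`
docker volume ls
```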
- Pre-commit
- Docker Compose V2
- Task 3.37+
This repository is licensed under the BSD 3-Clause License.