I want to run Ollama with docker-compose using an NVIDIA GPU. What should I write in the docker-compose.yml file?

I ran Ollama with docker-compose, but the GPU was not being used. This is what I wrote:

  ollama:
    container_name: ollama
    image: ollama/ollama:rocm
    ports:
      - 11434:11434
    volumes:
      - ollama:/root/.ollama
    networks:
      - fastgpt
    restart: always

I need a docker-compose.yaml file example.

1 Answer

I'm assuming that you already have the GPU driver configured on the host and that you can successfully execute nvidia-smi there.
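Before touching Compose, it is also worth confirming that Docker itself can reach the GPU through the NVIDIA Container Toolkit. A quick sanity check (the CUDA image tag here is just an example; any recent base tag works):

```shell
# Run nvidia-smi inside a throwaway container. If the NVIDIA Container
# Toolkit is set up correctly, this prints the same GPU table as running
# nvidia-smi directly on the host.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```

If this fails, fix the Container Toolkit installation first; no Compose configuration will help until plain `docker run --gpus all` works.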

If so, then you can adapt your docker-compose.yml as follows:

version: "3.9"

services:
  ollama:
    container_name: ollama
    image: ollama/ollama   # plain tag; the :rocm tag is for AMD GPUs, not NVIDIA
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              capabilities: ["gpu"]
              count: all
    ports:
      - 11434:11434
    volumes:
      - ollama:/root/.ollama
    restart: always

volumes:
  ollama:

When you bring it up, you should see "Nvidia GPU detected via cudart" in the logs.
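To check, bring the stack up and inspect the logs (container name `ollama` as set in the Compose file above):

```shell
# Start the service in the background and look for the GPU detection line.
docker compose up -d
docker logs ollama 2>&1 | grep -i gpu

# You can also confirm the GPU is visible from inside the container:
docker exec -it ollama nvidia-smi
```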

1 Comment

I didn't need the :rocm tag of ollama; the deploy part was sufficient.
