New Container: OllamaContainer #617

@bricefotzo

Description

Add support for the OllamaContainer to simplify running and testing LLMs through Ollama.

What is the new container you'd like to have?

I would like to request support for a new container: OllamaContainer.

Why not just use a generic container for this?

The generic DockerContainer("ollama/ollama:latest") approach is not sufficient for several reasons:

  1. Complicated setup/configuration: Ollama can run with GPU acceleration inside Docker containers on NVIDIA GPUs. The container should be able to detect whether GPUs are available and request them when possible.

  2. Model management: There is also a need to pull a model and then commit the container changes into an image, so that the image containing the model can be reused later (see the sketch after this list).
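For context, here is a rough sketch of what this wiring currently looks like with the generic DockerContainer. It is only illustrative: the tinyllama model, the committed image tag, the readiness log pattern, and the NVIDIA runtime check are assumptions, and extra keyword arguments are assumed to be forwarded to docker run.

```python
import docker
from testcontainers.core.container import DockerContainer
from testcontainers.core.waiting_utils import wait_for_logs

# Request all GPUs if the NVIDIA runtime is installed; otherwise fall back to CPU.
run_kwargs = {}
client = docker.from_env()
if "nvidia" in (client.info().get("Runtimes") or {}):
    run_kwargs["device_requests"] = [
        docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])
    ]

with DockerContainer("ollama/ollama:latest", **run_kwargs).with_exposed_ports(11434) as ollama:
    # Wait for the Ollama server to come up (the log pattern is an assumption).
    wait_for_logs(ollama, "Listening on", timeout=120)

    # Pull a model inside the running container; this is slow on the first run.
    ollama.exec("ollama pull tinyllama")

    # Commit the container, model included, so later test runs can start from
    # an image that already has the model baked in.
    ollama.get_wrapped_container().commit(repository="ollama/ollama", tag="tinyllama")

    # Hand the endpoint to an Ollama client inside the test.
    base_url = f"http://{ollama.get_container_host_ip()}:{ollama.get_exposed_port(11434)}"
```

A dedicated OllamaContainer could hide this boilerplate behind a small, purpose-built API.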
