Add support for the OllamaContainer to simplify running and testing LLMs through Ollama.
What is the new container you'd like to have?
I would like to request support for a new container: OllamaContainer.
- Official Docker Hub image: ollama/ollama
- Ollama Blog: Ollama Docker Image
Why not just use a generic container for this?
The generic DockerContainer("ollama/ollama:latest") approach is not sufficient for several reasons:
- Complicated setup/configuration: Ollama can run with GPU acceleration inside Docker containers on Nvidia GPUs. The container should therefore check whether GPUs are available and, when they are, start with GPU support enabled (see the first sketch after this list).
- Model management: There is also a need to pull a model and then commit the container changes into an image, so that the image containing the model can be reused later (see the second sketch after this list).
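A minimal sketch of what the GPU check could look like with testcontainers-python and the docker SDK. The `has_nvidia_gpu` helper and its "nvidia runtime" heuristic are illustrative assumptions, not an existing API:

```python
import docker
from docker.types import DeviceRequest
from testcontainers.core.container import DockerContainer


def has_nvidia_gpu(client: docker.DockerClient) -> bool:
    # Heuristic (assumption): if the Docker daemon lists an "nvidia"
    # runtime, treat GPU acceleration as available.
    return "nvidia" in client.info().get("Runtimes", {})


container = DockerContainer("ollama/ollama:latest").with_exposed_ports(11434)
if has_nvidia_gpu(docker.from_env()):
    # Equivalent of `docker run --gpus=all`: request all available GPUs.
    container = container.with_kwargs(
        device_requests=[DeviceRequest(count=-1, capabilities=[["gpu"]])]
    )
container.start()
```

An OllamaContainer could encapsulate this check so callers get GPU acceleration automatically when the host supports it.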
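And a hedged sketch of the model-management flow, pulling a model through Ollama's REST API and committing the result into a reusable local image. The model name `all-minilm` and the committed image tag are purely illustrative:

```python
import requests
from testcontainers.core.container import DockerContainer
from testcontainers.core.waiting_utils import wait_for_logs

with DockerContainer("ollama/ollama:latest").with_exposed_ports(11434) as ollama:
    # Wait until the Ollama server logs that it is listening.
    wait_for_logs(ollama, "Listening on")
    base_url = (
        f"http://{ollama.get_container_host_ip()}:{ollama.get_exposed_port(11434)}"
    )

    # Pull a model via the Ollama REST API; stream=False blocks until done.
    requests.post(
        f"{base_url}/api/pull",
        json={"name": "all-minilm", "stream": False},
        timeout=600,
    ).raise_for_status()

    # Commit the container state (model included) into a local image for reuse.
    ollama.get_wrapped_container().commit(
        repository="ollama/ollama", tag="all-minilm"
    )
```

On later runs, an OllamaContainer could check for that committed image locally and start from it instead of downloading the model again.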