# nix-llama

An easy-to-deploy Nix flake bundling Ollama and OpenWebUI.


## Requirements

- GPU support (optional, but recommended for faster inference):
  - NVIDIA GPU + CUDA toolkit (covered in the flake)
  - AMD GPU + ROCm (not covered in the flake)
- System resources:
  - At least 8 GB of RAM (more for larger models)
  - SSD storage for models

## Enabling Flakes

If you haven't enabled flakes yet, add the following to your Nix configuration:

```nix
# Add this in your configuration.nix
nix.settings.experimental-features = [ "nix-command" "flakes" ];
```

Then rebuild to apply the change:

```shell
sudo nixos-rebuild switch
```

After this, run the following command whenever you make changes to your flake:

```shell
sudo nixos-rebuild switch --flake .#hostname
```

## Installation of this flake

1. Clone the repository:

   ```shell
   git clone https://github.com/randoneering/nix-llama.git
   cd nix-llama
   ```

2. After adding your configurations, rebuild:

   ```shell
   sudo nixos-rebuild switch --flake .#yourhostname
   ```

3. Log in to OpenWebUI and have fun! Access OpenWebUI at `http://yourserverip:8080`.
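The configuration step above typically amounts to enabling the two services on your host. A minimal sketch, assuming the standard NixOS `services.ollama` and `services.open-webui` module options (check this flake's modules for the exact names it expects):

```nix
{
  # Run Ollama as a system service (assumed standard module options)
  services.ollama = {
    enable = true;
    acceleration = "cuda"; # or "rocm"; omit on CPU-only systems
  };

  # Expose OpenWebUI on port 8080, talking to the local Ollama API
  services.open-webui = {
    enable = true;
    port = 8080;
    host = "0.0.0.0"; # listen on all interfaces so http://yourserverip:8080 is reachable
    environment = {
      OLLAMA_API_BASE_URL = "http://127.0.0.1:11434"; # Ollama's default listen address
    };
  };
}
```

Binding to `0.0.0.0` makes the UI reachable from other machines on your network; keep the default loopback binding if you only access it locally.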


## Notes

- Hardware acceleration: the setup includes CUDA/ROCm support. If you're using a CPU-only setup, disable the GPU-related dependencies in `flake.nix`.
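For a CPU-only machine, the usual approach is to turn off hardware acceleration in the Ollama service. A hedged one-line sketch, assuming the standard NixOS `services.ollama` module (verify against this flake's own options):

```nix
# Assumed standard module option: disable CUDA/ROCm so inference falls back to CPU
services.ollama.acceleration = null;
```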

## Contributing

- Report issues or request features via GitHub Issues.
- Submit pull requests for improvements or bug fixes.

## License

This project is licensed under the GPLv3. See the LICENSE file for details.

