
# headless-ollama

The scripts here help you easily install the Ollama client on any device (macOS/Linux/Windows).

Building desktop apps that use local LLMs is awesome, and Ollama makes it easy by providing JS and Python libraries that call local LLMs in the OpenAI format.

However, the user's system needs Ollama already installed before your desktop app can use the ollama-js or ollama-python libraries. Making users install the Ollama client separately is poor UX. Hence, "headless-ollama".

This repo has pre-run scripts (see the "predev" command in package.json, which runs before `npm run dev`) that use the Node runtime to detect the host OS and install the Ollama client and the models your app needs before the server starts.

This is really helpful when building Electron apps, where you want everything to be local and self-contained within the system.

Run the following commands to get started:

```sh
git clone https://github.com/nischalj10/headless-ollama
npm i
npm run dev
```

This will automatically install the Ollama client and pull the llava:v1.6 model.

## llama.cpp support

It has come to my notice that adding llama.cpp support would be great (in some cases better than Ollama from a packaging point of view). I will pick this up as soon as I get some free time. See the issue.
