A local, offline AI interpreter for chatting with an AI, sharing images with it, and generating images.
- Ollama
- electron-vite
- A computer 💀
- Download Ollama from this link
- Clone the OverAI repo and install the necessary dependencies
git clone https://github.com/Nickeldon/OverAI.git
cd OverAI
npm i
cd ./main
npm i
- Run the app
npm run dev
- When OverAI is loaded, choose and download one of the listed AI models (a sketch of how a model pull works is shown after these steps)
- Enjoy
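
To give a rough idea of what the "choose and download an AI" step does under the hood, here is a minimal TypeScript sketch that checks which models are already installed and pulls a missing one through Ollama's default local API at http://localhost:11434. The `ensureModel` helper and the model name are illustrative, not part of OverAI's code:

```ts
// Minimal sketch: make sure a model is available locally, assuming
// Ollama's default local API at http://localhost:11434.
const OLLAMA = "http://localhost:11434";

async function ensureModel(model: string): Promise<void> {
  // GET /api/tags lists the models already downloaded on this machine.
  const tags = await fetch(`${OLLAMA}/api/tags`).then((r) => r.json());
  const installed = (tags.models ?? []).some(
    (m: { name: string }) => m.name.startsWith(model)
  );
  if (installed) return;

  // POST /api/pull downloads the model (this is the only step that
  // needs internet access).
  await fetch(`${OLLAMA}/api/pull`, {
    method: "POST",
    body: JSON.stringify({ model, stream: false }),
  });
}

ensureModel("llava").then(() => console.log("llava is ready"));
```

Once a model has been pulled, everything after that runs locally.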
Notes:
- As everything runs locally, the AI process puts a heavy load on the GPU while in use
- Only LLaVA can take an image input from the user (see the sketch after these notes)
- LLaMA 2 and LLaMA 3 will both be able to generate images from the user's prompt (coming soon)
- Mistral is mostly used for coding tasks
- The app may still have various bugs as it is not finished yet (it is more than enough for testing, though)
- The user must have Ollama installed on their computer to use OverAI
- As mentioned earlier, no internet connection is needed for the app to operate correctly (downloading models does require an internet connection 💀)
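
As a rough illustration of the LLaVA note above, here is a minimal TypeScript sketch that sends a local image to LLaVA through Ollama's generate endpoint at http://localhost:11434. The `describeImage` helper, the file path, and the prompt are placeholders, not part of OverAI's code:

```ts
// Minimal sketch: ask LLaVA about a local image, assuming Ollama's
// default local API at http://localhost:11434.
import { readFile } from "node:fs/promises";

async function describeImage(path: string): Promise<string> {
  // Multimodal models such as LLaVA take base64-encoded images.
  const image = (await readFile(path)).toString("base64");
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify({
      model: "llava",
      prompt: "Describe this image.",
      images: [image],
      stream: false,
    }),
  });
  const data = await res.json();
  return data.response;
}

describeImage("./photo.png").then(console.log);
```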