Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama models remotely. The goal of Maid is to create a platform for AI that can be used freely on any device.
Builds for Windows, Linux, and Android are available on the Releases page. macOS and iOS releases are not available at this time.
To use this app in local mode, follow these steps:
- Download a GGUF model from your chosen source.
- Launch Maid.
- Navigate to the model settings by opening the sidebar and pressing the model button.
- Click load model and select your model file.
- (Optionally) Set the preprompt and alias in the character settings.
- Navigate back to the Home Page.
- Enter a prompt.
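The local-mode steps above assume the downloaded file is a valid GGUF model. If loading fails, a quick sanity check is to inspect the file's magic bytes: every GGUF file begins with the ASCII characters `GGUF`. A minimal sketch (the file path shown is a placeholder, not a path Maid requires):

```python
def looks_like_gguf(path: str) -> bool:
    """Return True if the file starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

# Example with a placeholder path:
# looks_like_gguf("/sdcard/Download/llama-2-7b-chat.Q4_K_M.gguf")
```

If this returns False, the download is likely truncated or in an older (GGML) format that llama.cpp-based apps may not accept.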
To use this app in remote mode, follow these steps:
- From your computer, pull your chosen model using Ollama.
- Set up Ollama for hosting on your local network as shown here.
- Launch Maid.
- Toggle the remote switch in the navigation drawer.
- Enter the IP address and port of the computer running Ollama.
- In the model settings, set the model name to the name of the model you pulled.
- (Optionally) Set the preprompt and alias in the character settings.
- Navigate back to the Home Page.
- Enter a prompt.
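Behind these steps, remote mode talks to Ollama's HTTP API (default port 11434). As a rough illustration of the request involved, here is a sketch using only the Python standard library; the host address, model name, and prompt are placeholder values:

```python
import json
import urllib.request

def build_generate_request(host: str, port: int, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    url = f"http://{host}:{port}/api/generate"
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(url, data=body, headers={"Content-Type": "application/json"})

# Sending it requires a reachable Ollama server (placeholder address):
# with urllib.request.urlopen(build_generate_request("192.168.1.10", 11434, "llama2", "Hello")) as r:
#     print(json.loads(r.read())["response"])
```

If Maid cannot connect, reproducing a request like this from another machine on the network is a useful way to check that Ollama is actually reachable at that IP and port.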
Help is wanted in the following areas:
- Writing code comments
- Documentation
- Testing and building on macOS and iOS
- Spreading the word
The Android version has been tested on a OnePlus 10 Pro (11GB RAM); desktop builds have been tested on Debian Linux and Windows 11. Models tested include Calypso 3B, OrcaMini 3B, Llama 2 7B-Chat, and Llama 7B.
Please note that the LLaMA models are owned and officially distributed by Meta. This app only serves as an environment for running them and exercising their capabilities. The developers of this app do not provide the LLaMA models and are not responsible for any issues related to their usage.