Modulus-AI is a multi-provider chat platform that offers a premium, customizable user experience. By integrating seamlessly with a variety of AI models, it lets users interact with multiple AI providers through a single interface. Advanced tool use, such as web search and wiki search, is supported, along with customizable RAG pipelines that extend what the AI interactions can do.
- Multi-provider chat interface (Ollama, OpenRouter, Gemini)
- Premium user interface
- Built-in tools (web search, wiki search) and customizable RAG
- Backend: Haskell, Servant, PostgreSQL
- Frontend: React, TypeScript, Vite, TailwindCSS
- CI/CD: GitHub Actions
- Code Quality: Pre-commit hooks, HLint, fourmolu
- Deployment: GCP
- Model Serving: Ollama, OpenRouter, Gemini
- Email Service: Mailgun
- Docker
- Docker Compose
- Stack
- Pre-commit
- Mailgun API
- Ollama (optional, for local model serving)
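Before starting, it can help to sanity-check that the core tooling from the list above is on your `PATH`. This is just a convenience sketch, not part of the project's scripts; extend the list with `pre-commit` or `ollama` as needed:

```shell
# Verify that required CLI tools are installed (prints found/MISSING per tool)
for tool in git docker stack npm; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```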
```shell
git clone git@github.com:tusharad/modulus-ai.git
cd modulus-ai
```
```shell
cd modulus-ai-be
source export-env.sh
```
This will load the necessary environment variables into your shell session from the `.env.local` file. You must create this file based on the `.env.example` file provided in the repository.
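As a rough sketch, a `.env.local` file might look like the following. Every variable name and value here is an assumption for illustration only; the authoritative list of required variables is the `.env.example` file in the repository:

```shell
# Hypothetical .env.local sketch -- variable names are illustrative, not the
# project's actual configuration. Copy .env.example and fill in its real keys.
DATABASE_URL=postgresql://modulus:modulus@localhost:5432/modulus_ai
MAILGUN_API_KEY=replace-with-your-mailgun-key
OPENROUTER_API_KEY=replace-with-your-openrouter-key
GEMINI_API_KEY=replace-with-your-gemini-key
OLLAMA_URL=http://localhost:11434
```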
```shell
make up
```
This command starts the Docker containers defined in the `docker-compose.yml` file, setting up the services the application needs, including the PostgreSQL database.
```shell
stack run
```
This command builds and starts the application using Stack, the Haskell project build tool. It compiles the project and runs the server.
The application should now be running and accessible at http://localhost:8081.
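To confirm the server is up, a quick smoke check from another terminal can help. This is a sketch: it assumes the root path responds over HTTP on port 8081; the repository does not document a dedicated health endpoint here, so adjust the path to a real route if one exists.

```shell
# Smoke-check the backend (sketch). The root path is an assumption -- swap in
# a real endpoint from the API if the root does not respond.
URL="http://localhost:8081"
if curl -fsS --max-time 2 "$URL" >/dev/null 2>&1; then
  echo "backend reachable at $URL"
else
  echo "backend not reachable at $URL (is 'stack run' still running?)"
fi
```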
```shell
./scripts/run-tests.sh
```
```shell
cd modulus-ai-fe
npm install
npm run dev
```
This project is licensed under the MIT License - see the LICENSE file for details.
Contributions are welcome!