Llama Workspace Logo

An extensible ChatGPT Teams/Enterprise alternative

Warning

Both the repo and this README are still a work in progress. Some of the stated benefits are not yet implemented, and the product is not yet ready for production.

What is it

An open source & extensible ChatGPT Teams/Enterprise alternative, designed to let organizations provide generative AI access to their teams while retaining full control of their data.

Main features

  • User management: Manage who has access to the workspace. Decide who has access to what with a granular permissions system.
  • All LLM vendors in one place: Ask questions to GPT-4, Claude, Mistral or any other major LLM from a single place. Ollama and llama.cpp are also supported.
  • Apps / GPTs: Create apps for repeatable use cases and share them across your teams.
  • Talk to your documents and data: Upload your documents and ask questions about them.
  • Extend with other AI tools: Bring your company's entire AI toolkit into a single place. Integrate with Flowise, Gradio, Hugging Face, LangChain and more.
  • Straightforward self-hosting: Get up and running with the fewest possible commands.

Getting started

Vercel

To deploy on Vercel, follow these steps:

  1. Create a project by clicking on Add New... > Project.
  2. Select Import Third-Party Git Repository and enter the URL of this repository.
  3. Insert the environment variables, using .env.example as a reference for which variables to fill in. You'll need to set the DATABASE_URL variable to point to your Postgres database, which you can provision with Vercel (see the example after this list).
  4. Deploy the project.
  5. Set up your domain to point to the Vercel deployment.
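
As a rough sketch, the environment configuration might look like the following, where DATABASE_URL uses the standard Postgres connection string format. The values are placeholders, and the full list of variables lives in .env.example:

    # Illustrative values only; copy the full variable list from .env.example
    DATABASE_URL=postgres://USER:PASSWORD@HOST:5432/DATABASE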

Fully self-hosted

To fully self-host Llama Workspace, follow these steps:

  1. Provision a Postgres database. The details may vary based on your setup.
  2. Clone or copy this repository.
  3. Create an .env file based on the .env.example file. You'll need to set the DATABASE_URL variable to point to your Postgres database.
  4. Install the dependencies by running npm install.
  5. Build the Next.js app by running npm run production:build. This prepares the Next.js app and runs the build.
  6. Run the post-build script with npm run production:postbuild. This script runs the database migrations.
  7. Start the app by running npm run production:start. (The full command sequence is summarized after this list.)
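
Assuming the .env file from step 3 is in place, steps 4 to 7 boil down to the following commands (a sketch of the happy path; for long-running deployments you may want to run the start command under your preferred process manager):

    npm install                    # install dependencies
    npm run production:build       # build the Next.js app
    npm run production:postbuild   # run the database migrations
    npm run production:start       # start the app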

Feedback

We are happy to hear your feedback. For this purpose, we have created a Discord channel where you can share your thoughts and ideas. Join the channel here.

Roadmap

We welcome feedback from our community. To stay up to date with the latest news and product updates, or to reach us, follow us on X (formerly Twitter).

License & Trademarks

Llama Workspace is open source under the GNU Affero General Public License Version 3 (AGPLv3) or any later version.