Use a local LLM proxy to aggregate AI models #93

Open
ga-it opened this issue Apr 21, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

ga-it commented Apr 21, 2024

Describe the feature you'd like to request

Allow an administrator to set up access to multiple LLMs in Nextcloud via a proxy.

Describe the solution you'd like

LiteLLM does an incredible job of this.

Combining it with Nextcloud would not only allow more options than the current OpenAI, LocalAI, and Replicate integrations, but would also provide a single AI abstraction layer and allow integration with Nextcloud user management, etc.

LiteLLM's model aliases provide a useful way to run and call a single proxy with an unlimited number of APIs beneath it.
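As a sketch only, a LiteLLM proxy config along these lines could define such aliases (the model names, backends, and keys here are illustrative assumptions, not part of this request):

```yaml
# Hypothetical LiteLLM proxy config (config.yaml) aggregating several
# backends behind aliases. Names, keys, and endpoints are illustrative.
model_list:
  - model_name: assistant-default        # alias that Nextcloud apps would call
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: assistant-local          # same alias pattern for a local model
    litellm_params:
      model: ollama/llama3
      api_base: http://localhost:11434
```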

The integration could tie into Nextcloud group permissions and tasks, allowing other apps to call the various AI models by alias, with permissions enforced through the abstraction layer.

If the aliases were exposed as bots, they could be called in Talk, etc.

This dovetails with the following Nextcloud Assistant enhancement request - nextcloud/assistant#76

Describe alternatives you've considered

Currently calling LiteLLM via the Nextcloud LocalAI integration, which works because LiteLLM exposes an OpenAI-compatible endpoint.
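Since the LiteLLM proxy speaks the OpenAI-compatible API, any OpenAI client can call an alias directly. A minimal sketch (the base URL, key, and alias below are assumptions, not verified against a live setup):

```python
# Minimal sketch: calling a LiteLLM proxy alias through its
# OpenAI-compatible API. Base URL, key, and alias are illustrative.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000/v1",  # LiteLLM proxy (default port 4000)
    api_key="sk-proxy-key",               # LiteLLM virtual key, if configured
)

response = client.chat.completions.create(
    model="assistant-default",            # the alias defined in config.yaml
    messages=[{"role": "user", "content": "Summarize this file share."}],
)
print(response.choices[0].message.content)
```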
