Oobabooga support for local generations? #2
Any chance you'd consider adding Oobabooga support as a provider?

https://github.com/oobabooga/text-generation-webui

They have an API that is "drop-in compatible" with OpenAI, so maybe it wouldn't be too much work?

Retrochat is a super fun tool. Thanks for putting it out there. I'm using it mostly with Ollama, but there are some models that run better and faster locally with ooba.
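"Drop-in compatible" here means an existing OpenAI client can simply be pointed at the local server. A minimal sketch, assuming the WebUI is running with its OpenAI-compatible API on the default port 5000 and the `openai` Python package is installed; the `model` name and `api_key` value are placeholders (the local server uses whichever model is loaded and does not check the key):

```python
# Minimal sketch: talk to text-generation-webui through its
# OpenAI-compatible endpoint instead of api.openai.com.
# Assumes the server is listening on the default 127.0.0.1:5000.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # local WebUI endpoint
    api_key="not-needed",                 # placeholder; the local server ignores it
)

response = client.chat.completions.create(
    model="local",  # hypothetical name; the WebUI serves whatever model is loaded
    messages=[{"role": "user", "content": "Hello from Retrochat!"}],
)
print(response.choices[0].message.content)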
Can you please test if it works, because I don't have that provider? Make sure you use a few features like /load and @ to test RAG, check that the markdown syntax renders properly, and let me know so I can push it to the main script! It looks at http://127.0.0.1:5000.
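A quick way to confirm that endpoint is reachable before exercising the chat features (a sketch, assuming the WebUI exposes the standard OpenAI-compatible /v1/models route on port 5000):

```python
# Quick connectivity check against the local WebUI API.
# Assumes the OpenAI-compatible extension serves /v1/models on port 5000.
import requests

resp = requests.get("http://127.0.0.1:5000/v1/models", timeout=5)
resp.raise_for_status()
print(resp.json())  # should list the model(s) the WebUI has available
```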
Ooh, this is awesome! Thank you! On a quick test right now, I was able to connect through the API on localhost:5000 just fine, and it looks like output is being generated by ooba, but nothing is being displayed on the Retrochat side. I can dig deeper into the oobaboogaChatSession class tonight and see if I can figure out what's going on with that.
I'm not a programmer, but adding these lines to the ChatSession class got the output to print in Retrochat:
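The actual snippet is not shown in the thread; given the symptoms described below (live streamed text followed by a duplicate at the end), it was plausibly something like this hypothetical sketch. The function name is invented, and the chunk structure assumes OpenAI-style streaming responses:

```python
# Hypothetical reconstruction -- the actual lines from the thread were lost.
# Assumes OpenAI-style streaming chunks (chunk.choices[0].delta.content).
def handle_stream(response):
    full_text = ""
    for chunk in response:
        delta = chunk.choices[0].delta.content or ""
        print(delta, end="", flush=True)  # incremental, "asynchronous-looking" output
        full_text += delta
    print()  # finish the line once the stream ends
    return full_text

# If the surrounding ChatSession code then does something like:
#     print(handle_stream(response))
# the whole reply is printed a second time -- which would explain
# the duplicate output described below.
```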
That's from Claude, haha. Does that make any sense to you? Although, for a reason I don't understand, this prints asynchronously and then prints a duplicate of the text again at the end.
Oh sure, you probably just need to start the program with the --api flag. That will open the default socket on :5000, and that's how I was able to successfully send and receive with Retrochat in my tests. So however you are launching the WebUI, just add --api. Edit to add: make sure you have a model loaded in the WebUI or you'll get some strange errors when you start the chat in Retrochat.