No memory past first conversation for local models? #631
Comments
I think there may be an issue with the local model service you're using. I'm confident that ChatGPTBox passes history messages through the API. Perhaps you can check whether there are any log files.
I have been trying some other local APIs to see if the problem is with a specific backend. With koboldcpp's OpenAI-compatible API, I can see that the previous messages aren't being sent. kobold.cpp
Here's a normal log of what should happen:
Here's my logs with the llama.cpp server binary: llama.cpp server
Are you able to reproduce it with any of the llama.cpp / ollama backends? Am I using the wrong API URL?
My settings seem OK; here is my video footage of the issue. Is it Chrome or some other operating-system issue? I was actually able to use the extension on an Android phone with the Firefox browser, and the automatic queries it makes with searches work great.
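One way to rule out the client is to build the request by hand and compare it against what the backend's log shows arriving. The sketch below constructs a standard OpenAI-compatible `/v1/chat/completions` payload; if history is being passed correctly, every earlier turn should appear in `messages` ahead of the new user input. (This is a minimal illustration assuming the standard chat-completions schema; `build_chat_request` and the model name are made up for the example, not part of ChatGPTBox.)

```python
import json

def build_chat_request(history, user_input, model="local-model"):
    """Build an OpenAI-compatible /v1/chat/completions payload.

    `history` is a list of (role, content) pairs from earlier turns.
    A correctly behaving client sends all of them before the new input;
    a broken one sends only the final user message.
    """
    messages = [{"role": role, "content": content} for role, content in history]
    messages.append({"role": "user", "content": user_input})
    return {"model": model, "messages": messages}

payload = build_chat_request(
    [("user", "Hi"), ("assistant", "Hello! How can I help?")],
    "What did I just say?",
)
print(json.dumps(payload, indent=2))
```

POSTing this payload to the backend's `/v1/chat/completions` endpoint and then checking its log makes it easy to see whether the missing history comes from the client or the server.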
Refresh the conversation page: do the history messages still exist?
This is not normal. If an answer completes normally, the conversation page should save it correctly, and when you continue the conversation it will be sent as a history message. If it disappears after refreshing, that means the answer was never considered complete. ChatGPTBox does not store or send failed or interrupted answers as history messages, which matches the situation you encountered. For me, using ollama, answers complete and are stored correctly.
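The behavior described above can be sketched as a small state machine: a streaming answer is held in a pending slot and only committed to history when it finishes normally; an interrupted or failed answer is discarded, so refreshing the page (or the next request) never sees it. This is a toy model with hypothetical names, not ChatGPTBox's actual implementation.

```python
class ConversationStore:
    """Toy model of the described behavior: only answers that complete
    normally are persisted and later re-sent as history messages."""

    def __init__(self):
        self.history = []     # committed (question, answer) turns
        self._pending = None  # answer still streaming in

    def start_answer(self, question):
        self._pending = (question, "")

    def append_token(self, token):
        question, partial = self._pending
        self._pending = (question, partial + token)

    def finish_answer(self):
        # A normally completed answer is committed to history.
        self.history.append(self._pending)
        self._pending = None

    def abort_answer(self):
        # Interrupted or failed answers are dropped entirely,
        # so they never appear as history in later requests.
        self._pending = None

store = ConversationStore()
store.start_answer("Hi")
for token in ["Hel", "lo!"]:
    store.append_token(token)
store.finish_answer()

store.start_answer("Second question")
store.append_token("partial answer")
store.abort_answer()  # e.g. the stream ended abnormally
print(store.history)  # only the completed turn survives
```

Under this model, an answer that "disappears after refreshing" is exactly one that went through `abort_answer` instead of `finish_answer`.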
Some notable things with tabbyAPI (and other backends):
chatgptbox_tabbyAPI.mp4
The "</" token is actually "</>", and it's rendered as an HTML element, so it isn't displayed.
Thank you, the conversation is now stored properly and works with the local APIs mentioned, such as tabbyAPI!
Describe the bug
A clear and concise description of what the bug is.
None of the chat windows support continued conversations with local models. I'm not sure whether this is a bug or simply not implemented. Example:
When using a local model API like https://github.com/theroyallab/tabbyAPI, I was unable to continue a conversation; the model only ever receives my input as its first message.
To Reproduce
Steps to reproduce the behavior:
Using Firefox, enter the local URL:
Please complete the following information:
Last Updated: February 6, 2024
Additional context
Add any other context about the problem here.