Ollama API: fails to insert prompt. #143
The i18nStrings variable is used to pass the i18n error strings to the worker; they are always set this way at these lines: ThunderAI/api_webchat/controller.js, lines 85 to 87 in 9366fc5.
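As a sketch of the pattern being described — the field and function names here are assumptions for illustration, not ThunderAI's actual API — the controller might bundle the localized error strings into the init message it hands to the worker:

```javascript
// Hypothetical sketch: bundle i18n error strings into the init message
// sent to the chat worker. Key names are made up for illustration.
function buildInitMessage(i18n) {
  return {
    command: "init",
    i18nStrings: {
      error_connection_interrupted:
        i18n.error_connection_interrupted ?? "Connection interrupted",
      error_connection_failed:
        i18n.error_connection_failed ?? "Connection failed",
    },
  };
}

// In the controller this object would then be posted to the worker,
// e.g. worker.postMessage(buildInitMessage(i18n)).
```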
When the prompt is not inserted, is there an error in the log?
No error in the log. Only unrelated warnings.
If I post a version here with more logging, can you try it?
Yeah, sure, I'll just have to replace the source link. I'm using your stuff for free, it's the least I could do.
You can find version 2.2.0_i143_v1 here. Please follow these steps:
I've added some console.log statements to check what's happening. If you need the source code, you can find it in the branch: https://github.com/micz/ThunderAI/tree/issue_143 (full diff) Thank you.
I'm sorry, I can't even test it, because on that version I seem to have issue #137. There seems to be an exception at:
So before that is solved, I can't test for the actual error of this issue. EDIT: for some reason the window is responsive and just fails to render, so I was able to reproduce the error of this issue, and the log is:
I haven't made any changes to this version regarding the dynamic menu. Even with a blank menu, you can trigger a prompt using CTRL+ALT+A to open the menu. Then, press a number and hit Enter.
Do you see this message in the chat: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"...?
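For context, the connection attempt mentioned above boils down to an HTTP POST against the Ollama server's REST API. The endpoint and payload shape below follow the public Ollama API (`/api/generate`); the helper function itself is just an illustration, not ThunderAI's actual code:

```javascript
// Build a non-streaming request for the Ollama /api/generate endpoint.
// Endpoint and body fields follow the public Ollama REST API; the
// helper name is illustrative only.
function buildGenerateRequest(host, model, prompt) {
  return {
    url: `${host.replace(/\/+$/, "")}/api/generate`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

// Usage:
//   const { url, options } = buildGenerateRequest(
//     "http://127.0.0.1:11434", "llama3.2:latest", "Hello");
//   const res = await fetch(url, options);
```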
Yesss.
Also, regarding #137: it happens about 25% of the time.
Please keep the debug option active. Do you see this message "[Ollama API] Connection succeded!" in the log?
I think I can't really do anything about it. It seems to be something between Hyprland and Thunderbird.
I have all options active. That particular message did not happen even once. Not even on seemingly successful runs.
In fact, I do not even see any messages from Ollama API.
I get that message every time. Please try this one, I added more logs:
Unfortunately, exactly the same. Because of the hash, I can verify I am indeed using a different version, but it's the exact same behavior.
You don't see even a message starting with ">>>>>> [ThunderAI] Ollama API about to send message to createdTab3.id"?
This is the exact log with all options enabled that I receive after choosing a prompt; nothing else. I can then close the window and no new logs get added.
Would it help to give you my Thunderbird settings?
In the last version I fixed the line ">>>>>> [ThunderAI] Ollama init done. controller.js:96:17", removing the first parenthesis. In the Thunderbird add-on manager, in the ThunderAI details, could you check the version number you're using?
The parenthesis got removed; I just quoted my previous answer, sorry for that. I am really using the new version. On Nix, I have to specify a hash for each installation, so I know that this one is a different version. Obviously, I can't verify if it's the one you intended to send me.
This is the actual, freshly copied log:
Are those the only lines in the log for ThunderAI? There is nothing before "Ollama init done"?
At least during the bug; there are other unrelated lines:
This is the log I get from loading the add-on through to the Ollama response, with debug enabled.
I tried setting a breakpoint at one of the log points and can confirm that it's not being hit.
I'm sorry, I don't know how this can happen.
Leave it open, I'll solve this eventually.
So basically this line sometimes fails. After following it further, the stack trace is:
This happened because contentWindow is null. Can you teach me how to build the release? I've never made an add-on before. I want to fork this code to actually fix this.
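A null contentWindow usually means the frame is being accessed before it has finished loading. As a hedged sketch of how such a race could be worked around — this is not the fix ThunderAI actually shipped, and every name here is illustrative — a generic poller can defer the access until the value exists:

```javascript
// Poll until getValue() returns something non-null, or give up after
// `timeout` ms. Illustrates deferring access to a not-yet-ready value
// such as an iframe's contentWindow.
function waitFor(getValue, { interval = 50, timeout = 2000 } = {}) {
  return new Promise((resolve, reject) => {
    const start = Date.now();
    const timer = setInterval(() => {
      const value = getValue();
      if (value != null) {
        clearInterval(timer);
        resolve(value);
      } else if (Date.now() - start > timeout) {
        clearInterval(timer);
        reject(new Error("Timed out waiting for value"));
      }
    }, interval);
  });
}

// In a page script one might then write:
//   const win = await waitFor(() => iframe.contentWindow);
//   win.postMessage(prompt, "*");
```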
@mattcaron Does it happen all the time?
Yes, it is 100% repeatable. Thanks.
I installed Xubuntu in a virtual machine, but I wasn't able to reproduce the error.
@mattcaron do you have other add-ons installed?
@micz I do. https://github.com/jobisoft/DAV-4-TbSync/ Just to avoid any rabbit trails - you say you weren't able to reproduce this on a Xubuntu VM, but I'd like to note some data points:
Mozilla PPA instructions:
I have loaded your new version and the behavior is the same. I have included the log below.
Just for giggles, I talked to it a bit and included those logs. console-export-2024-10-2_12-57-22.txt And here is the screenshot:
I've installed Thunderbird the way you did and everything works.
After the
Here is where the add-on is sending the prompt.
Well, that is indeed interesting. Here's a test on my end doing the opposite thing - I've disabled TbSync and the CalDAV/CardDAV provider. console-export-2024-10-2_16-20-19.txt No change. Further, in researching this, it turns out that those plugins are no longer necessary, since TB now supports CalDAV and CardDAV natively - so I've removed them. Still doesn't work. But, I have had another idea. This is a really, really old profile. Like, "for as long as TB has existed" levels of old. So I started TB with a new profile. I fired up meld and there are a lot of differences. This makes me wonder if it's some old, stale config parameter. I'm going to take the old profile and the new profile, apply settings from old->new, and see if it breaks.
Ok, thank you for the feedback.
@Mikilio, could it be the same for you as well?
Even better! I had 2 copies of TB running - one with each profile - so that I could manually copy over all the settings. Old profile - doesn't work, 100% reproducible. Now, this is not a complaint - it is merely data. Since I plan to have the two windows side by side and copy things over setting by setting, and will then delete the old profile, this will likely all work out just fine. Or, if I break something, I'll have a clean profile from which we can try to determine which setting causes the breakage. I'll let you know once this task is complete, but this will likely be tomorrow.
Thank you for your help, Matt.
I'm not sure if it is related, though I suspect it is, but notifications on new messages haven't worked for me for about a decade.
My optimism may have been.. premature.
At this point, I have no idea how I managed to make it work for the 5 minutes that it did...
I would scrap the idea that it is a deterministic error, because under the right conditions I can make it happen completely randomly as well. Maybe some weird async stuff is happening in the background. I will continue testing @micz's updated code on the weekend.
I think you are correct. When I have some time, I'll pull this repo and start poking. But I have no idea when that will be. If there is something specific people want me to test, I am happy to make time - I just don't have a lot of spare time to go spelunking right now.
My problem is that I'm not able to replicate the bug, but I agree with you that there is some async issue causing it. I'll post a version later with that part rewritten for you to test.
This version has the internal messaging rewritten; @jobisoft helped me on this one. Full diff: 16a034b...2db4277 Please let me know if this one works.
That does seem to work, pretty consistently even. I closed and reopened TB 5 times and it worked all 5 times. And I think @Mikilio's idea that it's a race condition is likely correct. I would speculate that: And I know for a fact that I've been messing with this on my laptop, which is pretty weak sauce these days:
Sure, it's got 8 cores, but it's from 2017 and is a mobile CPU with a low clock rate, so the resulting high latency means it almost always failed under the old code. Tomorrow, I'll try installing ThunderAI directly from Addons on my desktop and see what happens. That's a Ryzen 3700X, so not ultra modern, but more so than this.
Thank you for the feedback, I'm happy it's finally fixed. My PC is really old, like 2011 old. Over the years I've upgraded some parts, but the mainboard and CPU are from that year.
I've released version 2.2.0pre5, could you test it?
Awesome work! I can't reproduce the error of this issue anymore. I do, however, still have issue #137 about 25% of the time. This happens on every version after pre4. But as things have been going, it may be working on pre4 for the wrong reasons. @mattcaron Unfortunately, I have to disappoint you regarding your theory, as my laptop looks like this:
I am in favor of closing this issue, and I'll do some new testing on this version to add my findings to #137 instead.
I second closing this. I have tested the Ollama API with different models on both my laptop and desktop and it works fine. @Mikilio Indeed, my hypothesis was completely wrong. Nice laptop. ;-) I had considered upgrading mine, except laptops with 3 drive bays are difficult to come by. This is a Lenovo P51, which has 2 NVMe and 1 SATA. I keep hoping Framework will ship their NVMe expansion module for their 16" machine, which was supposed to have space for 2 2280s, plus another 2280 and a 2240 in the main chassis. But it's been over a year and no news. Anyway, @micz, thanks for all the help and the quick response!
Fixed in version 2.2.0. |
The problem
Sometimes, without warning, the chat window fails to insert the prompt. The chat window is functional, however. The error is not easily reproducible, as it seems to happen randomly.
Which Operating System are you using?
NixOS
Which version of Thunderbird are you using?
Thunderbird 128.1.1esr
Which version of ThunderAI has the issue?
v2.2.0pre4
Which integration are you using?
Ollama API
Anything in the Thunderbird console logs that might be useful?
Additional information
I have managed to catch the error in debug mode by setting a breakpoint at this line; the variable i18nStrings was set to: