
Ollama API: fails to insert prompt. #143

Closed
Mikilio opened this issue Sep 28, 2024 · 58 comments
Labels: bug (Something isn't working)

@Mikilio

Mikilio commented Sep 28, 2024

The problem

Sometimes, without warning, the chat window fails to insert the prompt. The chat window is otherwise functional. The error is not easily reproducible, as it seems to happen randomly.

Which Operating System are you using?

NixOS

Which version of Thunderbird are you using?

Thunderbird 128.1.1esr

Which version of ThunderAI has the issue?

v2.2.0pre4

Which integration are you using?

Ollama API

Anything in the Thunderbird console logs that might be useful?

Nothing of note.

Additional information

I have managed to catch the error in debug by setting a breakpoint at this line

the variable i18nStrings was set to:

{
  error_connection_interrupted: "The connection to the server was unexpectedly interrupted",
  ollama_api_request_failed: "Ollama API request failed",
}
@micz
Owner

micz commented Sep 28, 2024

The i18nStrings variable is used to pass the i18n strings for the error messages to the worker; they are always set this way at these lines:

let i18nStrings = {};
i18nStrings["ollama_api_request_failed"] = browser.i18n.getMessage('ollama_api_request_failed');
i18nStrings["error_connection_interrupted"] = browser.i18n.getMessage('error_connection_interrupted');

When the prompt is not inserted, is there an error in the log?

@Mikilio
Author

Mikilio commented Sep 28, 2024

No error in the log. Only unrelated warnings.

@micz
Copy link
Owner

micz commented Sep 28, 2024

If I post a version here with more logging, can you try it?

@Mikilio
Author

Mikilio commented Sep 28, 2024

Yeah, sure, I'll just have to replace the source link. I'm using your stuff for free, it's the least I could do.

@micz
Owner

micz commented Sep 29, 2024

You can find here version 2.2.0_i143_v1.

Please follow these steps:

  1. Download the file thunderai-v2.2.0_i143_v1.zip
  2. Rename it from *.zip to *.xpi
  3. Backup your custom prompts if you have any
  4. Install this version

I've added some console.log statements to check what's happening.
Since the chat is an HTML page, and Hyprland had problems with the new HTML menu, could you also try to replicate the bug in another window manager, like KDE?

If you need the source code, you can find it in the branch: https://github.com/micz/ThunderAI/tree/issue_143 (full diff)

Thank you.

@Mikilio
Author

Mikilio commented Sep 29, 2024

I'm sorry, I can't even test it, because on that version I seem to hit issue #137.

There seems to be an exception at:

close (resource://gre/modules/ConduitsChild.sys.mjs#143)
unload (resource://gre/modules/ExtensionCommon.sys.mjs#1019)
unload (resource://gre/modules/ExtensionPageChild.sys.mjs#281)
unload (resource://gre/modules/ExtensionPageChild.sys.mjs#324)
destroyExtensionContext (resource://gre/modules/ExtensionPageChild.sys.mjs#496)
observe (resource://gre/modules/ExtensionPageChild.sys.mjs#397)

So before that is solved, I can't test for the actual error of this issue.
I'd like to test for KDE, but that is quite a lot more effort for me currently, so it will have to wait.

EDIT: for some reason the window is responsive and just fails to render. I was able to reproduce the error of this issue, and the log is:

(>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17

@micz
Owner

micz commented Sep 29, 2024

I haven't made any changes to this version regarding the dynamic menu.

Even with a blank menu, you can trigger a prompt using CTRL+ALT+A to open the menu. Then, press a number and hit Enter.
Are you able to launch a prompt this way and replicate the error?

@micz
Owner

micz commented Sep 29, 2024

Edit for some reason the window is responsive and just fails to render, I was able to reproduce the error of this issue and the log is:

Do you see this message:

Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"...

on the chat?
But the prompt is not added afterwards and no error is displayed?

@Mikilio
Author

Mikilio commented Sep 29, 2024

yesss
That is what is happening. I see this message posted by role: "Information"

@Mikilio
Author

Mikilio commented Sep 29, 2024

Also regarding #137 it happens about 25% of the time.

@micz
Owner

micz commented Sep 29, 2024

yesss That is what is happening. I see this message posted by role: "Information"

Please keep the debug option active.

Do you see this message "[Ollama API] Connection succeded!" in the log?

@micz
Owner

micz commented Sep 29, 2024

Also regarding #137 it happens about 25% of the time.

I think I can't really do anything about it. It seems to be something between Hyprland and Thunderbird.

@Mikilio
Author

Mikilio commented Sep 29, 2024

I have all options active. That particular message did not happen even once. Not even on seemingly successful runs.

Do you see this message "[Ollama API] Connection succeded!" in the log?

In fact, I do not even see any messages from the Ollama API.

micz added a commit that referenced this issue Sep 29, 2024
@micz
Owner

micz commented Sep 29, 2024

I get that message every time.

Please try this one, I added more logs:
thunderai-v2.2.0_i143_v2.zip

@Mikilio
Author

Mikilio commented Sep 29, 2024

Unfortunately, exactly the same. Because of the hash, I can verify I am indeed using a different version, but it's the exact same behavior.

@micz
Owner

micz commented Sep 29, 2024

You don't see even a message starting with ">>>>>> [ThunderAI] Ollama API about to send message to createdTab3.id"?
Or "Ollama API window opening..."?

@Mikilio
Author

Mikilio commented Sep 29, 2024

(>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17

This is the exact log, with all options enabled, that I receive after choosing a prompt; nothing else. I can then close the window and no new logs get added.

@Mikilio
Author

Mikilio commented Sep 29, 2024

Would it help to give you my thunderbird settings?

@micz
Owner

micz commented Sep 29, 2024

In the last version I fixed this line:

(>>>>>> [ThunderAI] Ollama init done. controller.js:96:17

by removing the first parenthesis.
It seems that you're not using the new version.

In the Thunderbird addon manager, in the ThunderAI details, may you check the version number you're using?

@Mikilio
Author

Mikilio commented Sep 29, 2024

The parenthesis got removed; I just quoted my previous answer. Sorry for that. I am really using the new version. On Nix, I have to specify a hash for each installation, so I know that this one is a different version. Obviously, I can't verify if it's the one you intended to send me.

@Mikilio
Author

Mikilio commented Sep 29, 2024

>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17

This is the actual, freshly copied log.

@micz
Owner

micz commented Sep 29, 2024

Are those the only lines in the log for ThunderAI?
With the debug option enabled, there should be many more lines.
The new lines I added are direct console.log statements, so I don't understand how you're seeing those lines you posted, but not the others.

There is nothing before "Ollama init done"?

@Mikilio
Author

Mikilio commented Sep 29, 2024

At least during the bug. There are other, unrelated lines:

1727646358636	addons.xpi	WARN	Checking /nix/store/m5bcq19qmci0py4fzm3j1az7y5s6bax8-thunderbird-128.1.1esr/lib/thunderbird/distribution/extensions for addons
1727646359732	addons.xpi	WARN	Addon with ID [email protected] already installed, older version will be disabled
1727646359733	addons.xpi	WARN	Addon with ID [email protected] already installed, older version will be disabled
(intermediate value).getAttribute is not a function ExtensionParent.sys.mjs:331:38
1727646360189	[email protected]	WARN	Loading extension '[email protected]': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
1727646360204	[email protected]	WARN	Loading extension '[email protected]': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
sendRemoveListener on closed conduit [email protected] 3 ConduitsChild.sys.mjs:122:13
Layout was forced before the page was fully loaded. If stylesheets are not yet loaded this may cause a flash of unstyled content. msgHdrView.js:4214:7
Key event not available on some keyboard layouts: key=“a” modifiers=“accel,alt” id=“” messenger.xhtml
>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17

@micz
Owner

micz commented Sep 29, 2024

This is the log I get from loading the addon to the Ollama response, with debug enabled.

1727646722672	[email protected]	WARN	Loading extension '[email protected]': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
1727646722688	addons.xpi	WARN	Addon with ID [email protected] already installed, older version will be disabled
sendRemoveListener on closed conduit [email protected] [ConduitsChild.sys.mjs:122:13](resource://gre/modules/ConduitsChild.sys.mjs)
1727646722725	[email protected]	WARN	Loading extension '[email protected]': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
1727646722747	[email protected]	WARN	Loading extension '[email protected]': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
1727646723026	[email protected]	WARN	Loading extension '[email protected]': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
[ThunderAI Logger | mzta-options] Options restoring chatgpt_win_height = 800 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring chatgpt_win_width = 700 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring default_sign_name = undefined [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring default_chatgpt_lang = [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring reply_type = undefined [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring dynamic_menu_order_alphabet = true [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring dynamic_menu_force_enter = false [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring connection_type = ollama_api [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring ollama_host = http://127.0.0.1:11434/ [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring ollama_model = tinyllama:latest [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring openai_comp_host = [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring openai_comp_model = [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring openai_comp_chat_name = OpenAI Comp [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring do_debug = true [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] Shortcut [Ctrl+Alt+A] registered successfully! [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_classify [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_reply [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_rewrite_formal [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_rewrite_polite [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_summarize_this [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_this [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_translate_this [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] Preparing data to load the popup menu: true [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] _prompts_data: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_rewrite_formal","label":"Rewrite formal","type":"2"},{"id":"prompt_rewrite_polite","label":"Rewrite polite","type":"2"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_this","label":"Prompt this","type":"2"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] active_prompts: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] tabType: mail [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] filteredData: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] Executing shortcut, promptId: prompt_classify [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] [ThunderAI] Prompt length: 300 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] Ollama API window opening... [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] [ollama_api] prefs.chatgpt_win_width: 700, prefs.chatgpt_win_height: 800 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] Ollama API window ready. [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] message.window_id: 60 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] >>>>>> createdTab3.id: 4 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
>>>>>> [ThunderAI] Ollama init done. [controller.js:96:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/controller.js)
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434/" and model "tinyllama:latest"... [messagesArea.js:146:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] appendUserMessage done. [messagesArea.js:164:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. [controller.js:102:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/controller.js)
[ThunderAI Logger | mzta-background] >>>>>> mailMessageId3: 1 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
>>>>>> [ThunderAI] Ollama API about to send message to createdTab3.id: 4 [mzta-background.js:389:29](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/mzta-background.js)
>>>>>>>>>>>>> controller.js onMessage: {"command":"api_send","prompt":"Classify the following text in terms of Politeness, Warmth, Formality, Assertiveness, Offensiveness giving a percentage for each category. Reply with only the category and score with no extra comments or other text. Reply in the same language. \"This is the body of test email number 4. It was sent.\" ","action":"0","tabId":1,"mailMessageId":1,"do_custom_text":"0"} [controller.js:145:13](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/controller.js)
>>>>>> [ThunderAI] appendUserMessage: Classify the following text in terms of Politeness, Warmth, Formality, Assertiveness, Offensiveness giving a percentage for each category. Reply with only the category and score with no extra comments or other text. Reply in the same language. "This is the body of test email number 4. It was sent." [messagesArea.js:146:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] appendUserMessage done. [messagesArea.js:164:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/messagesArea.js)
[ThunderAI Logger | mzta-background] [Ollama API] Connection succeded!

@Mikilio
Author

Mikilio commented Sep 29, 2024

I tried setting a breakpoint at one of the log points and can confirm that it's not being touched

@micz
Owner

micz commented Sep 29, 2024

I'm sorry I don't know how it can happen.

@Mikilio
Author

Mikilio commented Sep 29, 2024

Leave it open, I'll solve this eventually.

@Mikilio
Author

Mikilio commented Sep 30, 2024

So basically this line sometimes fails with: TypeError: can't access dead object

After following it further, the stack trace is:

get principal (resource://gre/modules/ExtensionPageChild.sys.mjs#254)
jsonStringify (resource://gre/modules/ExtensionCommon.sys.mjs#786)
sanitize (resource://gre/modules/ExtensionStorage.sys.mjs#167)
sanitize (chrome://extensions/content/child/ext-storage.js#186)
get (chrome://extensions/content/child/ext-storage.js#319)
callAsyncFunction (resource://gre/modules/ExtensionCommon.sys.mjs#1196)
callAsyncFunction (resource://gre/modules/ExtensionChild.sys.mjs#726)
callAndLog (resource://gre/modules/ExtensionChild.sys.mjs#706)
callAsyncFunction (resource://gre/modules/ExtensionChild.sys.mjs#725)
stub (resource://gre/modules/Schemas.sys.mjs#2954)
<anonymous> (moz-extension://2e1bd1c1-6a49-4880-aca9-98165ed7c968/popup/mzta-popup.js#26)
<anonymous> (moz-extension://2e1bd1c1-6a49-4880-aca9-98165ed7c968/popup/mzta-popup.js#25)

This happened because contentWindow is null.

Can you teach me how to build the release? I've never made an add-on before. I want to fork this code to actually fix this.
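[Editorial note: for illustration only, not from the ThunderAI source. A defensive pattern for the null/dead contentWindow failure described above might look like the sketch below: re-fetch the window reference on every use and treat a null or dead reference as a reportable condition instead of crashing. The names safeCall and getWindow are invented.]

```javascript
// Sketch: guard against a window reference that is null or has become a
// "dead object" (any property access throws). Instead of caching the
// reference, re-fetch it each time via `getWindow` and report failures.
function safeCall(getWindow, fn) {
  const win = getWindow();
  if (!win) {
    console.warn("[ThunderAI debug] window is gone, skipping call");
    return null;
  }
  try {
    return fn(win);
  } catch (err) {
    // A dead wrapper throws on property access; report instead of crashing.
    console.warn("[ThunderAI debug] window became unusable:", err.message);
    return null;
  }
}

// Usage: a null window is reported, not dereferenced.
const result = safeCall(() => null, (w) => w.title);
console.log(result); // null
```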

@micz
Owner

micz commented Oct 1, 2024

@mattcaron Does it happen all the time?
Do you experience this issue with ThunderAI 2.1.5 as well?
Which window manager are you using in Ubuntu?
Thank you.

@mattcaron

mattcaron commented Oct 1, 2024

Yes, it is 100% repeatable.
Yes, it happens with 2.1.5. I came here to file a bug, found this, and installed the one from this thread.
I'm using XFwm. It's standard Xubuntu.

Thanks.

@micz
Owner

micz commented Oct 1, 2024

I installed Xubuntu on a virtual machine, but I wasn't able to reproduce the error.
I tried a new approach, could you try this version?
thunderai-v2.2.0_i143_v4.zip

@micz
Owner

micz commented Oct 2, 2024

@mattcaron do you have other add-ons installed?

@mattcaron

@micz I do.

https://github.com/jobisoft/DAV-4-TbSync/
https://github.com/jobisoft/TbSync

Just to avoid any rabbit trails - you say you weren't able to reproduce this on a Xubuntu VM, but I'd like to note some data points:

  1. I am not using the default 24.04 TB install, because it is a snap and I dislike them. I am using the TB from the Mozilla PPA. Instructions follow.
  2. I have compositing and all that other GPU draining stuff turned off (GPUs are for AI, not bling..)

Mozilla PPA instructions:

sudo add-apt-repository ppa:mozillateam/ppa
echo '
Package: thunderbird*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 1000
' | sudo tee /etc/apt/preferences.d/thunderbird
 
echo '
Package: firefox*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 1000
' | sudo tee /etc/apt/preferences.d/firefox

sudo apt install firefox thunderbird

I have loaded your new version and the behavior is the same. I have included the log below.

console-export-2024-10-2_12-53-6.txt

@mattcaron

Just for giggles, I talked to it a bit and included those logs.

console-export-2024-10-2_12-57-22.txt

And here is the screenshot:

(screenshot: Mozilla Thunderbird_001)

@micz
Owner

micz commented Oct 2, 2024

I've installed Thunderbird the way you did and everything works.
This is the "correct" log:

>>>>>>>> [ThunderAI] ollama_api sending I'm ready message... [controller.js:50:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
>>>>>> [ThunderAI] ollama_api I'm ready message sent. [controller.js:56:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
>>>>>> [ThunderAI] ollama_api worker initialized. [controller.js:58:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
[ThunderAI Logger | mzta-background] Ollama API window ready. [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] message.window_id: 177 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] >>>>>> createdTab3.id: 14 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
>>>>>> [ThunderAI] Ollama init done. [controller.js:116:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://localhost:11434/" and model "tinyllama:latest"... [messagesArea.js:146:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] appendUserMessage done. [messagesArea.js:164:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. [controller.js:122:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
[ThunderAI Logger | mzta-background] >>>>>> mailMessageId3: 2 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
>>>>>> [ThunderAI] Ollama API about to send message to createdTab3.id: 14 [mzta-background.js:389:29](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/mzta-background.js)
>>>>>>>>>>>>> controller.js onMessage: {"command":"api_send","prompt":"Classify the following text in terms of Politeness, Warmth, Formality, Assertiveness, Offensiveness giving a percentage for each category. Reply with only the category and score with no extra comments or other text. Reply in the same language. \"To spice up your inbox with colors and themes, check out the Themes tab under Settings. Customize Gmail » Enjoy! - The Gmail Team Please note that Themes are not available if you're using Internet Explorer 6.0. To take advantage of the latest Gmail features, please upgrade to a fully supported browser.\" ","action":"0","tabId":1,"mailMessageId":2,"do_custom_text":"0"} [controller.js:167:13](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
>>>>>> [ThunderAI] appendUserMessage: Classify the following text in terms of Politeness, Warmth, Formality, Assertiveness, Offensiveness giving a percentage for each category. Reply with only the category and score with no extra comments or other text. Reply in the same language. "To spice up your inbox with colors and themes, check out the Themes tab under Settings. Customize Gmail » Enjoy! - The Gmail Team Please note that Themes are not available if you're using Internet Explorer 6.0. To take advantage of the latest Gmail features, please upgrade to a fully supported browser." [messagesArea.js:146:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] appendUserMessage done. [messagesArea.js:164:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/messagesArea.js)
[ThunderAI Logger | mzta-background] [Ollama API] Connection succeded! [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)

After the [ThunderAI] ollama_api I'm ready message sent. message, the background script is responding:

[ThunderAI Logger | mzta-background] Ollama API window ready. [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] message.window_id: 177 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] >>>>>> createdTab3.id: 14 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)

Here is where the addon is sending the prompt.
Now I'm going to install the add-ons you pointed out and try again.
It seems the Thunderbird internal messaging system is failing without error.
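[Editorial note: one way to surface that kind of silent failure is to race the send against a timeout, so a dropped message produces an explicit error instead of a missing log line. A minimal plain-JavaScript sketch follows; sendFn is an assumed stand-in for whatever messaging call the add-on uses (e.g. browser.tabs.sendMessage), not ThunderAI's actual code.]

```javascript
// Sketch: wrap a fire-and-forget message send in a timeout so a message
// that vanishes inside the messaging system is reported explicitly.
function sendWithTimeout(sendFn, message, ms = 2000) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`no reply within ${ms} ms`)), ms);
  });
  // Race the real send against the timeout; clear the timer either way
  // so a successful send does not leave a pending rejection behind.
  return Promise.race([Promise.resolve(sendFn(message)), timeout])
    .finally(() => clearTimeout(timer));
}

// Usage: a send that never resolves is reported instead of vanishing.
sendWithTimeout(() => new Promise(() => {}), { command: "api_send" }, 50)
  .catch((err) => console.error("[ThunderAI debug]", err.message));
```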

@mattcaron

Well, that is indeed interesting.

Here's a test on my end doing the opposite thing - I've disabled TbSync and the CalDAV/CardDAV provider.

console-export-2024-10-2_16-20-19.txt

No change.

Further, in researching this, it turns out that those plugins are no longer necessary, since TB now supports CalDAV and CardDAV natively - so I've removed them.

Still doesn't work.

But, I have had another idea. This is a really, really old profile. Like, "for as long as TB has existed" levels of old. So I started TB with -P, created a new profile, and installed just ThunderAI from file, and it works.

I fired up meld and there are a lot of differences.

This makes me wonder if it's some old, stale config parameter.

I'm going to take the old profile and the new profile and apply settings from old->new and see if it breaks.
Hmmm...

@micz
Owner

micz commented Oct 2, 2024

Ok, thank you for the feedback.
In my setup it worked even with those add-ons. So it's consistent.

@micz
Owner

micz commented Oct 2, 2024

@Mikilio, could it be the same for you as well?

@mattcaron

Even better! I had 2 copies of TB running - one with each profile - so that I could manually copy over all the settings.

Old profile - doesn't work
New profile - works
BOTH profiles running (2 windows) - doesn't work in EITHER profile.

100% reproducible.

Now, this is not a complaint - it is merely data. Since I plan to have the two windows side by side and copy things over setting by setting and will then delete the old profile, this will likely all work out just fine. Or, if I break something, I'll have a clean profile from which we can try and determine which setting causes the breakage.

I'll let you know once this task is complete, but this will likely be tomorrow.

@micz
Owner

micz commented Oct 2, 2024

Thank you for your help Matt.
I'll report your findings to the Thunderbird Team.

@mattcaron

mattcaron commented Oct 2, 2024 via email

@mattcaron

My optimism may have been.. premature.

  1. I started moving the settings, and the prompt sending broke again.
  2. I undid all the config changes. Still broken.
  3. I deleted all my remote calendars and address books. Still broken.
  4. I deleted the profile and created a new one, only setting up one email address. Still broken.
  5. I moved the whole ~/.thunderbird directory and let it create a new one. Still broken.

At this point, I have no idea how I managed to make it work for the 5 minutes that it did...

@Mikilio
Author

Mikilio commented Oct 3, 2024

I would scrap the idea that it is a deterministic error, because under the right conditions I can make it happen completely randomly as well. Maybe some weird async stuff is happening in the background. I will continue testing @micz's updated code on the weekend.

@mattcaron

I think you are correct.

When I have some time, I'll pull this repo and start poking. But I have no idea when that will be.

If there is something specific people want me to test, I am happy to make time - I just don't have a lot of spare time to go spelunking right now.

@micz
Owner

micz commented Oct 3, 2024

My problem is that I'm not able to replicate the bug, but I agree with you that there is some async issue causing it.
The missing log lines are probably due to messages being sent from the chat popup before the listener is set up in the background script.

I'll post a version later with that part rewritten for you to test.
Thank you.

@micz
Owner

micz commented Oct 3, 2024

This version has the internal messaging rewritten, @jobisoft helped me on this one.

thunderai-v2.2.0_i143_v5.zip

Full diff: 16a034b...2db4277

Please let me know if this one works.
Thanks.

@mattcaron

That does seem to work, pretty consistently even. I closed and reopened TB 5 times and it worked all 5 times.

And, I think @Mikilio's idea that it's a race condition is likely correct.

I would speculate that:

  • @micz is on a machine under 2 years old.
  • @Mikilio is on a machine under 5 years old.

And I know for a fact that I've been messing with this on my laptop, which is pretty weak sauce these days:

Intel(R) Core(TM) i7-7820HQ CPU @ 2.90GHz

Sure, it's got 8 cores, but.. it's from 2017 and is a mobile CPU with a low clock rate, so the resulting high latency means it almost always failed under the old code.

Tomorrow, I'll try installing ThunderAI directly from Addons on my desktop and see what happens. That's a Ryzen 3700X, so not ultra modern, but more so than this.

@micz
Owner

micz commented Oct 4, 2024

Thank you for the feedback, I'm happy it's finally fixed.
I'll port this code to branch 2.2.0 and make a, hopefully, final pre-release.
I'll update you here.

My pc is really old, like 2011 old. In the years I've upgraded some parts, but the mainboard and CPU are from that year.
I tested it also on a laptop from 2020 and it always worked.

@micz micz added the bug Something isn't working label Oct 4, 2024
@micz micz self-assigned this Oct 4, 2024
@micz micz added this to the 2.2.0 milestone Oct 4, 2024
@micz
Owner

micz commented Oct 4, 2024

I've released version 2.2.0pre5, may you test it?
See the changelog: https://github.com/micz/ThunderAI/releases/tag/v2.2.0pre5
Thank you.

@Mikilio
Author

Mikilio commented Oct 4, 2024

Awesome work! I can't reproduce the error of this issue anymore. I do however still have issue #137 about 25% of the time. This happens on every version after pre4. But as things have been going, it may have been working on pre4 for the wrong reasons.

@mattcaron Unfortunately, I have to disappoint you regarding your theory as my laptop looks like this:

CPU: AMD Ryzen 7 7840U w/ Radeon 780M Graphics (8) @ 5.289GHz
Memory: 31374MiB

I am in favor of closing this issue, and I'll do some new testing on this version to add my findings to #137 instead.

@mattcaron

I second closing this. I have tested the Ollama API with different models on both my laptop and desktop and it works fine.

@Mikilio Indeed, my hypothesis was completely wrong. Nice laptop. ;-)

I had considered upgrading mine, except laptops with 3 drive bays are difficult to come by. This is a Lenovo P51, which has 2 NVMe and 1 SATA. I keep hoping Framework will ship their NVMe expansion module for their 16" machine, which was supposed to have space for two 2280s, plus another 2280 and a 2240 in the main chassis. But it's been over a year and no news.

Anyway, @micz thanks for all the help and the quick response!

@micz
Owner

micz commented Oct 7, 2024

Fixed in version 2.2.0.
Thank you all!

@micz micz closed this as completed Oct 7, 2024