
Manually downloading an LLM #527

Open
1 task
ekuznetsov139 opened this issue Dec 15, 2024 · 5 comments
Assignees
Labels
app:desktop application · app:pieces os · bug (Something isn't working) · os:linux · status:triaged (The ticket has been reviewed, prioritized, categorized, and assigned to the appropriate team member.) · status:waiting_on_user_responses (Additional information is required from the user to proceed.)

Comments

@ekuznetsov139

Software

Desktop Application

Operating System / Platform

Linux

Your Pieces OS Version

Latest

Early Access Program

  • Yes, this is related to an Early Access Program feature.

Kindly describe the bug and include as much detail as possible on what you were doing so we can reproduce the bug.

I am attempting to install Pieces with a local LLM. Unfortunately, I have an unstable network connection (which is why I want a local model in the first place). I am trying to get Mistral-7B.

I found that Pieces wants to download the whole model (4 GB) all at once. If the download fails for any reason, it restarts from the beginning. This is annoying and makes downloading virtually impossible, because my network connection does not stay up long enough to pull 4 GB in one go.

I have figured out where the file is being downloaded to and from, and managed to download it manually with wget. Now the problem is that Pieces refuses to recognize it. Even if I put it in the right place, the Pieces app still shows the Download button in the UI, and clicking Download truncates the file to zero and restarts the download from the beginning.
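For anyone hitting the same wall: the resume behavior that `wget -c` relies on boils down to an HTTP Range request that asks the server for only the bytes not yet on disk. A minimal Python sketch of that logic (the URL and destination path below are placeholders, not the real Pieces endpoints):

```python
import os
import urllib.request


def make_resume_request(url: str, dest_path: str) -> urllib.request.Request:
    """Build a request that resumes from the bytes already on disk --
    essentially what `wget -c` does under the hood."""
    offset = os.path.getsize(dest_path) if os.path.exists(dest_path) else 0
    req = urllib.request.Request(url)
    if offset:
        # Ask the server for the remainder only (requires Range support).
        req.add_header("Range", f"bytes={offset}-")
    return req


def resume_download(url: str, dest_path: str, chunk_size: int = 1 << 20) -> None:
    """Append the remaining bytes to the partial file in 1 MiB chunks."""
    req = make_resume_request(url, dest_path)
    with urllib.request.urlopen(req) as resp, open(dest_path, "ab") as out:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
```

This only works when the server honors Range requests; if it replies 200 instead of 206, the file would be double-appended, so a production version should check the response status first.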

I tried to edit models.db and change "downloaded" to true in the relevant line, but it has no effect.
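For reference, the kind of edit described above looks roughly like this in Python. The `models` table and `downloaded` column are guesses for illustration, not the real Pieces schema (and, as the thread shows, flipping this flag alone is not enough):

```python
import sqlite3


def mark_model_downloaded(db_path: str, model_name: str) -> int:
    """Flip an (assumed) `downloaded` flag for one model row in a
    SQLite database. Returns the number of rows updated."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(
            "UPDATE models SET downloaded = 1 WHERE name = ?", (model_name,)
        )
        conn.commit()
        return cur.rowcount
    finally:
        conn.close()
```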

Is there anything else I could do?

@hra42

hra42 commented Dec 16, 2024

I don't think you can import a model you downloaded manually. The problem is that Pieces doesn't just download the raw model; it's about how the integration works.

@mark-at-pieces might be able to talk about this further.

@pieces-support-bot

Hi @ekuznetsov139 - Thank you so much for creating this issue. Your issue has been automatically triaged and routed to the proper Pieces team member. Look for a follow-up within the next 24 hours.

We appreciate your patience and contribution to making Pieces better!

@pieces-support-bot added the bug, os:linux, app:pieces os, and status:triaged labels Dec 16, 2024
@brian-at-pieces

Hey @ekuznetsov139, sorry you're having this issue. The reason we delete the model on a failed download is because we're unable to determine whether the partial download was corrupted, so we have to delete it entirely to guarantee a clean retry.

Nice try on updating the db property, but that won't be enough. Try this:

  1. Shut down Pieces.
  2. Add an empty "download_success.txt" to the mistral directory in com.pieces.os/llm_chat_models. Also check that your directory structure matches the attached screenshot.
  3. Restart Pieces.
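The steps above can be sketched in Python. The directory layout is taken from the comment and screenshot, so treat the exact paths and the `mistral` directory name as assumptions:

```python
from pathlib import Path


def mark_download_complete(pieces_root: str, model_dir: str = "mistral") -> Path:
    """Create the empty marker file Pieces checks for after a successful
    model download (per the comment above), creating the assumed
    llm_chat_models/<model> layout if it doesn't exist yet."""
    marker = Path(pieces_root) / "llm_chat_models" / model_dir / "download_success.txt"
    marker.parent.mkdir(parents=True, exist_ok=True)
    marker.touch()
    return marker
```

Run it against your `com.pieces.os` directory while Pieces is shut down, then restart the app.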

We actually have a new release coming in the next day or two that completely overhauls our on-device LLM infrastructure, including the ability to resume a failed download from where it left off. If you can hang on until then, your problem will be resolved.

[Screenshot: expected directory structure]

@mark-at-pieces added the status:waiting_on_user_responses label Dec 16, 2024
@ekuznetsov139
Author

ekuznetsov139 commented Dec 18, 2024

Thanks. I was able to install Mistral-7B per these instructions. Unfortunately, it did not work: the model showed as "active", but if I made any query, the app sat thinking for a minute or two and then Pieces OS crashed. While it was thinking, I observed zero GPU compute and GPU memory usage (I have a mobile 4070 with 8 GB of memory). I am on Ubuntu 24 with all the latest updates, Vulkan installed, etc. I looked through the logs but did not see anything relevant. I was able to get good responses from cloud LLMs; it was only the local LLM that was malfunctioning.

I tried to download a different model to see if the failure is specific to Mistral (I saw an open ticket, issue #112, saying that specific model had problems), but before I could test it, Pieces got updated to the newest version (I guess that is the one you were referring to?). Now it can't use local models at all: if I go to the UI page, it asks me to install Ollama, and when I do, it immediately says "Ollama failed to install".

I tried uninstalling and reinstalling pieces-os, but it did not help.

In the logs, I see the following:
[Screenshot: log output]

I can confirm that there is no ollama subfolder in /home/eugene/Documents/com.pieces.os/production.

Do you want a separate ticket for this issue?

@brian-at-pieces

@ekuznetsov139, really sorry for all the trouble here. I've opened a new issue for the install failure and I'm currently investigating. In the meantime, if you install Ollama yourself, Pieces should be able to use it: https://ollama.com/download/linux
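After a manual install, a quick way to confirm the binary is actually visible to other apps is to check that it's on PATH and responds. A small Python sketch, assuming `ollama --version` prints the installed version (which the standard Ollama CLI supports):

```python
import shutil
import subprocess


def tool_status(name: str = "ollama") -> str:
    """Report whether a binary is on PATH and, if so, what its
    --version output says -- useful before restarting Pieces."""
    path = shutil.which(name)
    if path is None:
        return f"{name} not found on PATH"
    out = subprocess.run(
        [path, "--version"], capture_output=True, text=True, check=False
    )
    return f"{path}: {out.stdout.strip() or out.stderr.strip()}"
```

If this reports "not found on PATH", Pieces (or any other app) won't be able to discover the install either.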

I'll get back to you ASAP. Thanks for your patience.


4 participants