dalai llama 7B crashed on first request #432

Open
Aleksei-Badyaev opened this issue May 4, 2023 · 10 comments
@Aleksei-Badyaev

$ npx dalai serve
mkdir /home/user/dalai
Created custom directory: /home/user/dalai/config/prompts
Server running on http://localhost:3000/
> query: { method: 'installed', models: [] }
modelsPath /home/user/dalai/alpaca/models
{ modelFolders: [] }
modelsPath /home/user/dalai/llama/models
{ modelFolders: [ '7B' ] }
exists 7B
> query: {
  seed: -1,
  threads: 4,
  n_predict: 200,
  top_k: 40,
  top_p: 0.9,
  temp: 0.8,
  repeat_last_n: 64,
  repeat_penalty: 1.3,
  debug: false,
  models: [ 'llama.7B' ],
  prompt: 'Who are you?',
  id: 'TS-1683201361270-27393'
}
/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219
    let [Core, Model] = req.model.split(".")
                                  ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219:35)
    at Socket.<anonymous> (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:534:20)
    at Socket.emit (node:events:511:28)
    at Socket.emitUntyped (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/typed-events.js:69:22)
    at /home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

Node.js v20.0.0
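For context on the crash above: the incoming query carries `models` (plural, an array of strings like `'llama.7B'`), while the handler at index.js:219 reads `req.model` (singular), which is undefined. A minimal defensive sketch of that parsing step (a hypothetical helper, not the actual dalai code; the function name and error message are assumptions):

```javascript
// Hypothetical guard around the line that crashes in dalai's index.js:219.
// The logged request has `models: [ 'llama.7B' ]`, but the handler reads
// `req.model`, which is undefined, so `.split(".")` throws a TypeError.
function parseModel(req) {
  // Fall back to the first entry of `models` when `model` is absent.
  const model = req.model ?? (Array.isArray(req.models) ? req.models[0] : undefined);
  if (typeof model !== "string" || !model.includes(".")) {
    throw new Error(`invalid or missing model in request: ${JSON.stringify(model)}`);
  }
  const [core, name] = model.split("."); // e.g. "llama.7B" -> ["llama", "7B"]
  return { core, name };
}
```

With a guard like this the server would reject the malformed request instead of crashing the whole process.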
@joelduerksen

More than a few people are seeing this. I tried both the Docker and npm versions; no joy. FYI, there are multiple tickets open for this issue; also see
#334
It looks like a fix was identified, but not merged yet:
#348

@kartik-madhak

Is it related to this error somehow?
/root/dalai/alpaca/main --seed -1 --threads 4 --n_predict 200 --model models/7B/ggml-model-q4_0.bin --top_k 40 --top_p 0.9 --temp 0.8 --repeat_last_n 64 --repeat_penalty 1.3 -p "The expected response for a highly intelligent chatbot to ">PROMPT" is
""
exit
root@22d62ba0ce90:~/dalai/alpaca# /root/dalai/alpaca/main --seed -1 --threads 4 --n_predict 200 --model models/7B/ggml-model-q4_0.bin --top_k 40 --top_p 0.9 --temp 0.8 --repeat_last_n 64 --repeat_penalty 1.3 -p "The expected response for a highly intelligent chatbot to ">PROMPT" is

""
main: seed = 1683224155
llama_model_load: loading model from 'models/7B/ggml-model-q4_0.bin' - please wait ...
llama_model_load: invalid model file 'models/7B/ggml-model-q4_0.bin' (bad magic)
main: failed to load model from 'models/7B/ggml-model-q4_0.bin'
root@22d62ba0ce90:~/dalai/alpaca# exit
exit

[screenshot]

@SpaceTimeEvent

> Is it related to this error somehow? … llama_model_load: invalid model file 'models/7B/ggml-model-q4_0.bin' (bad magic) …

The model file is corrupted or incompatible. Download a fresh copy from here:
https://huggingface.co/Pi3141/alpaca-native-7B-ggml/blob/397e872bf4c83f4c642317a5bf65ce84a105786e/ggml-model-q4_0.bin

Then replace the file in this path:
alpaca/models/7B/ggml-model-q4_0.bin

Then try launching the server again:
npx dalai serve

If that doesn't work, do a clean reinstall:
npx clear-npx-cache
npm cache clean --force
sudo apt-get update -y
sudo apt-get upgrade -y
node -v # should be 18.16.0
source ~/.bashrc
npx dalai alpaca install 7B
(then replace the file again)

@Aleksei-Badyaev
Author

What about the other models? Are they all "damaged" as well? Where can I download working LLaMA models?

@SpaceTimeEvent

> What about the other models? Are they all "damaged" as well? Where can I download working LLaMA models?

I honestly don't know; I only have a 10-year-old laptop with 6 GB of RAM, and I managed to get only the 7B model up and running with the file from the Hugging Face website. You can try other model files from there: https://huggingface.co/Pi3141

@ujj

ujj commented May 8, 2023

Can confirm: the latest Hugging Face alpaca 7B model linked in @SpaceTimeEvent's comment above works perfectly fine 👍🏾

@zuluana

zuluana commented May 9, 2023

I'm having this issue too. Why are the models included with this package corrupt?

@Grislard

Grislard commented May 9, 2023

How do I delete my local install entirely? I want to start over from scratch with a clean install.

@koosty

koosty commented May 14, 2023

Clearing my browser's local storage worked for me.

@xero-q

xero-q commented Jul 7, 2023

> More than a few people are seeing this. … It looks like a fix was identified, but not merged yet: #348

I tried the solution in that PR, but it still doesn't work for me.
