[BUG] chatglm-6b model inference service fails with an error #121
Labels: bug (Something isn't working)
Comments:
@yeliang2258 chatglm-6b probably isn't supported yet; only chatglm2-6b is supported.
Got it. May I ask whether there is a plan to support the model I linked?
@yeliang2258 Does the old chatglm-6b really work that well? Why not use the newer chatglm2-6b instead?
Hi, when loading the chatglm2-6b model I get this error: AttributeError: 'ChatGLM2PreAndPostLayerWeight' object has no attribute 'wte_weight_'
@oreo-lp We couldn't reproduce this; my guess is that your weights were not fully downloaded.
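For anyone hitting the same AttributeError, here is a minimal sketch to confirm the download is complete, assuming the checkpoint uses the usual Hugging Face sharded PyTorch layout with a pytorch_model.bin.index.json (as THUDM/chatglm2-6b is distributed); the script and paths are illustrative, not part of lightllm:

```python
import json
import os
import sys

def check_shards(model_dir: str) -> bool:
    """Check that every weight shard referenced by the HF index file exists locally."""
    index_path = os.path.join(model_dir, "pytorch_model.bin.index.json")
    if not os.path.isfile(index_path):
        print("no pytorch_model.bin.index.json found; checkpoint may not be sharded")
        return False
    with open(index_path) as f:
        weight_map = json.load(f)["weight_map"]
    shards = sorted(set(weight_map.values()))
    missing = [s for s in shards if not os.path.isfile(os.path.join(model_dir, s))]
    for s in missing:
        print(f"missing shard: {s}")
    return not missing

if __name__ == "__main__":
    model_dir = sys.argv[1] if len(sys.argv) > 1 else "./chatglm2-6b"
    print("all shards present" if check_shards(model_dir) else "download looks incomplete")
```

If any shard is reported missing, re-pulling the repository (for example with git lfs pull inside the model directory) should resolve the AttributeError.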
lightllm commit id: 718e6d6dfffc75e7bbfd7ea80ba4afb77aa27726
The chatglm-6b model was downloaded from Hugging Face, and the server fails with an error at startup.
Model download link: https://huggingface.co/THUDM/chatglm-6b
Server launch command: python -m lightllm.server.api_server --model_dir THUDM/chatglm-6b --host 0.0.0.0 --port 8100 --tp 1 --max_total_token_num 120000 --tokenizer_mode auto --trust_remote_code
Error message:
################
load model error: 'ffn_hidden_size' 'ffn_hidden_size' <class 'KeyError'>
File "/lightllm/lightllm/models/chatglm2/layer_weights/transformer_layer_weight.py", line 70, in load_ffn_weights
intermediate_size =self.network_config['ffn_hidden_size'] * 2
KeyError: 'ffn_hidden_size'
Could you please take a look at what might be causing this? Thanks!
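The KeyError is consistent with the maintainer's reply above: lightllm's chatglm2 weight loader reads ffn_hidden_size from config.json, which the chatglm2-6b config provides but the older chatglm-6b config does not. As a quick pre-flight check, here is a minimal sketch for classifying a local config before launching the server; the inner_hidden_size key used to recognize the v1 config is an assumption based on the published THUDM/chatglm-6b config, not something lightllm itself checks:

```python
import json
import os
import sys

def classify_config(model_dir: str) -> str:
    """Guess whether a local config.json matches what lightllm's chatglm2 loader expects."""
    with open(os.path.join(model_dir, "config.json")) as f:
        cfg = json.load(f)
    if "ffn_hidden_size" in cfg:
        return "chatglm2-style config: has ffn_hidden_size, should work with the chatglm2 loader"
    if "inner_hidden_size" in cfg:  # assumed marker of the older chatglm-6b config
        return "chatglm-6b-style config: no ffn_hidden_size, expect KeyError in load_ffn_weights"
    return "unrecognized config layout"

if __name__ == "__main__":
    print(classify_config(sys.argv[1] if len(sys.argv) > 1 else "./chatglm-6b"))
```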