
How can I specify the model's data type as f16? #1846

Closed
Yang-bug-star opened this issue Jun 25, 2024 · 9 comments · Fixed by #2473
@Yang-bug-star

No description provided.

@lvhan028 lvhan028 self-assigned this Jun 25, 2024
@lvhan028
Collaborator

For now, you need to modify `torch_dtype` in the model's config.json.
We plan to add support for specifying the data type via the API or the CLI tools in July.
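Until that option lands, the config.json edit can be scripted. A minimal sketch using only the standard library (the helper name `set_torch_dtype` is my own, not part of lmdeploy):

```python
import json
from pathlib import Path

def set_torch_dtype(model_dir: str, dtype: str = "float16") -> None:
    """Rewrite the torch_dtype field in <model_dir>/config.json in place."""
    config_path = Path(model_dir) / "config.json"
    config = json.loads(config_path.read_text())
    config["torch_dtype"] = dtype
    config_path.write_text(json.dumps(config, indent=2))
```

Run it against the local model directory before launching the engine, so the weights are loaded in the requested dtype.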

@lerogo

lerogo commented Jul 17, 2024

> For now, you need to modify `torch_dtype` in the model's config.json. We plan to add support for specifying the data type via the API or the CLI tools in July.

Hi, could you allow specifying the path of the config.json used to load the model? That would make the model configurable; the current approach doesn't seem to let us configure most of the model's parameters.

Add something like `--model_config_path xxxxx`.

@lvhan028
Collaborator

We don't plan to do that. If you want to modify config.json, you can simply edit the config.json under model_path directly.

@Jintao-Huang

Hi, are there any plans to support specifying `torch_dtype`?

@lvhan028
Collaborator

lvhan028 commented Aug 12, 2024

We dropped the ball on this feature.
It probably won't make the end-of-August release; other important features are still in development and review.
We'll schedule it for the September release.

@demoninpiano

Thanks a lot. I really need this feature at the moment.

@XYZliang

> We dropped the ball on this feature. It probably won't make the end-of-August release; other important features are still in development and review. We'll schedule it for the September release.

I hope this can be supported soon. Quite a few Ascend NPUs, such as the 910 series, do not support BF16.

```
/home/ma-user/anaconda3/envs/lmdeploy/lib/python3.10/site-packages/torch_npu/contrib/transfer_to_npu.py:211: ImportWarning: 
    *************************************************************************************************************
    The torch.Tensor.cuda and torch.nn.Module.cuda are replaced with torch.Tensor.npu and torch.nn.Module.npu now..
    The torch.cuda.DoubleTensor is replaced with torch.npu.FloatTensor cause the double type is not supported now..
    The backend in torch.distributed.init_process_group set to hccl now..
    The torch.cuda.* and torch.cuda.amp.* are replaced with torch.npu.* and torch.npu.amp.* now..
    The device parameters have been replaced with npu in the function below:
    torch.logspace, torch.randint, torch.hann_window, torch.rand, torch.full_like, torch.ones_like, torch.rand_like, torch.randperm, torch.arange, torch.frombuffer, torch.normal, torch._empty_per_channel_affine_quantized, torch.empty_strided, torch.empty_like, torch.scalar_tensor, torch.tril_indices, torch.bartlett_window, torch.ones, torch.sparse_coo_tensor, torch.randn, torch.kaiser_window, torch.tensor, torch.triu_indices, torch.as_tensor, torch.zeros, torch.randint_like, torch.full, torch.eye, torch._sparse_csr_tensor_unsafe, torch.empty, torch._sparse_coo_tensor_unsafe, torch.blackman_window, torch.zeros_like, torch.range, torch.sparse_csr_tensor, torch.randn_like, torch.from_file, torch._cudnn_init_dropout_state, torch._empty_affine_quantized, torch.linspace, torch.hamming_window, torch.empty_quantized, torch._pin_memory, torch.autocast, torch.load, torch.Generator, torch.Tensor.new_empty, torch.Tensor.new_empty_strided, torch.Tensor.new_full, torch.Tensor.new_ones, torch.Tensor.new_tensor, torch.Tensor.new_zeros, torch.Tensor.to, torch.nn.Module.to, torch.nn.Module.to_empty
    *************************************************************************************************************
    
  warnings.warn(msg, ImportWarning)
<frozen importlib._bootstrap>:914: ImportWarning: TEMetaPathFinder.find_spec() not found; falling back to find_module()
2024-09-14 14:51:31,443 - lmdeploy - ERROR - AssertionError: bf16 is not supported on your device
2024-09-14 14:51:31,444 - lmdeploy - ERROR - <Model> test failed!
Your device does not support `torch.bfloat16`. Try edit `torch_dtype` in `config.json`.
Note that this might have negative effect!
/home/ma-user/anaconda3/envs/lmdeploy/lib/python3.10/tempfile.py:860: ResourceWarning: Implicitly cleaning up <TemporaryDirectory '/tmp/tmpa7ngwwq6'>
  _warnings.warn(warn_message, ResourceWarning)
```

Having to edit config.json every time is not very elegant.

@lvhan028
Collaborator

OK. We will add it in the 0.6.1 release.

@lvhan028
Collaborator

#2473 is addressing this feature.
You may give it a try.
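With a change along the lines of #2473, the dtype should be selectable without touching config.json. A hedged sketch of what the usage could look like (the exact flag name and accepted values depend on the final PR, so treat this as an assumption):

```shell
# Hypothetical: pass the data type when launching the server,
# instead of editing torch_dtype in config.json.
lmdeploy serve api_server /path/to/model --dtype float16
```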
