Describe the bug
Genny, using the onnxruntimegenai.cuda package, fails to load https://huggingface.co/microsoft/Phi-3-mini-128k-instruct-onnx.

To Reproduce
Steps to reproduce the behavior:
1. Build and run Genny with the Debug_Cuda configuration selected.
2. Point it at C:\Downloads\models\Phi-3-mini-128k-instruct-onnx\cuda\cuda-int4-rtn-block-32 or C:\Downloads\models\Phi-3-mini-128k-instruct-onnx\cuda\cuda-fp16.
3. The model fails to load.

Expected behavior
The model loads successfully. Is this model supported?
I had the same issue; I think it's a problem with the fp16 build. I can run Phi-3.5-mini-instruct-cuda-fp32-onnx without errors.
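To help isolate whether this is a Genny problem or a library-level load failure, here is a minimal sketch (my own construction, not Genny's actual code) that tries to load a model folder directly with the onnxruntime-genai Python bindings. It assumes the `onnxruntime-genai-cuda` pip package is installed; `og.Model(path)` is that library's model-loading entry point, and the path is the one from the report.

```python
def try_load(model_dir: str):
    """Attempt to load an ONNX GenAI model folder; return (ok, message)."""
    try:
        import onnxruntime_genai as og
        og.Model(model_dir)  # the failure in the report happens at this load step
        return True, "model loaded"
    except Exception as exc:  # ImportError if the package is absent, or the load error
        return False, f"load failed: {exc}"

if __name__ == "__main__":
    # Path from the report; swap in the cuda-int4-rtn-block-32 folder
    # to compare the two variants.
    ok, msg = try_load(r"C:\Downloads\models\Phi-3-mini-128k-instruct-onnx\cuda\cuda-fp16")
    print(msg)
```

If this fails with the same error outside of Genny, the issue is in the runtime or the model files rather than the app; the exception text would be useful to paste into the thread.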