Does Microsoft.ML.OnnxRuntimeGenAI.Cuda (version 0.4.0) support Phi-3.5 Vision Onnx format? #943
Comments
The Phi-3 vision and Phi-3.5 vision models are split into three separate ONNX models: a vision component, an embedding component, and a text component. According to your error, the vision component cannot be found. Can you check your model directory for it? Please note that re-designed ONNX models for Phi-3 vision and Phi-3.5 vision will be published to enable multi-image support.
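The three-component layout described above can be sanity-checked with a short script. This is a minimal sketch; the component keywords below (`vision`, `embedding`, `text`) are assumptions about substrings appearing in the exported file names, not the exact names the converter emits:

```python
import os

# Component keywords expected in the multi-part Phi-3 vision ONNX export.
# These substrings are illustrative assumptions; check your actual
# model directory for the real file names.
EXPECTED_COMPONENTS = ["vision", "embedding", "text"]

def missing_components(model_dir):
    """Return the component keywords with no matching .onnx file in model_dir."""
    files = [f.lower() for f in os.listdir(model_dir)]
    missing = []
    for component in EXPECTED_COMPONENTS:
        if not any(component in f and f.endswith(".onnx") for f in files):
            missing.append(component)
    return missing
```

Running this against the model folder before loading it would show immediately whether the vision component referenced in the error is actually present on disk.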
Hi @MaxAkbar, did you check your model directory for the files that Kunal described above?
I just noticed that the file sizes are way too small, so something failed during the conversion :(. Has anyone been able to convert the vision model into ONNX format? I did look at the output, but nothing jumped out at me as an error. I had a thread here about how to convert to ONNX: microsoft/Phi-3CookBook#187
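A quick way to spot the "file sizes are way too small" symptom is to list the size of every `.onnx` file in the output directory; this is a minimal sketch using only the standard library:

```python
import os

def onnx_file_sizes(model_dir):
    """Map each .onnx file in model_dir to its size in bytes.

    A model expected to be gigabytes whose .onnx files are only a few
    kilobytes usually indicates the conversion step failed silently.
    """
    return {
        name: os.path.getsize(os.path.join(model_dir, name))
        for name in os.listdir(model_dir)
        if name.endswith(".onnx")
    }
```

Comparing these sizes against the original safetensors/PyTorch checkpoint sizes makes a silently failed export easy to catch before attempting to load the model.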
Thank you @MaxAkbar, would you be able to attach the output from the conversion?
The new Phi-3 vision and Phi-3.5 vision ONNX models have now been released. The new models support no-image, single-image, and multi-image scenarios.
Describe the bug
After converting Phi-3.5-vision-instruct to ONNX format, I am not able to use the NuGet package Microsoft.ML.OnnxRuntimeGenAI.Cuda version 0.4.0 to load the model. When referencing the folder where the ONNX model is located, I get a file-not-found error.
To Reproduce
Steps to reproduce the behavior:
Loading the model throws Microsoft.ML.OnnxRuntimeGenAI.OnnxRuntimeGenAIException
Expected behavior
The expected behavior is to have the model loaded and be able to run inference.
Desktop (please complete the following information):
Additional context
I have converted Phi-3.5-mini-instruct to Phi-3.5-mini-instruct-cuda-fp32-onnx and am able to run it without any issues.