Could I convert lightning module to onnx? Thanks! #2271
Comments
Hi! Thanks for your contribution, great first issue! |
Good point. @PyTorchLightning/core-contributors, thoughts? |
From the docs https://pytorch.org/docs/stable/onnx.html it seems easy enough. We would require users to define the example input. Do we already have a method for saving the model that can be extended? |
A LightningModule is just an nn.Module. Wouldn't this just work?

```python
# Input to the model
x = torch.randn(batch_size, 1, 224, 224, requires_grad=True)
torch_out = torch_model(x)

# Export the model
torch.onnx.export(torch_model,              # model being run
                  x,                        # model input (or a tuple for multiple inputs)
                  "super_resolution.onnx")  # where to save the model (can be a file or file-like object)
```

But we should just make this automatic...

```python
model = LitModel(...)
model.to_onnx(x)
```
|
@williamFalcon I could take up this issue.

```python
batch = next(iter(model.train_dataloader()))
input_data = batch[0]
torch.onnx.export(model, input_data, file_path)
```
|
@lezwon since which PyTorch version is this option available? |
@Borda Going by the docs, I think ONNX export is supported since 0.3.0 in PyTorch. |
Could one also use a LightningModule's example_input_array for this? |
@awaelchli Didn't know about example_input_array. |
Yes, so far the example_input_array is used for the model summary, printing input and output shapes. It is optional, but if the user wants the full summary they need to define it. The reason the user needs to define it manually is that the input shape cannot be inferred from the model alone.
To me it looks like exporting to onnx is very similar to the model summary in this regard. |
It does seem very similar. So the way I see it, the export could default to example_input_array when no input sample is passed in. |
I like it! 👍 |
Cool. I'll get working on it 😊 |
@lezwon Can you please make sure to allow additional kwargs that are passed to the onnx-export? These would come handy for customisation :) |
Will do :) |
🚀 Feature
PyTorch Lightning works very well, but I cannot find any comments or examples to guide converting a pretrained Lightning model to ONNX. Is the LightningModule only meant for research purposes, without cross-platform ONNX support?