
[Relax] Batch norm correctness on eval mode #17752

Draft · wants to merge 33 commits into main
Conversation

@hugolatendresse (Contributor) commented Mar 16, 2025

batch_norm is a different operator in training and eval mode. The previous interface defaulted to training mode and required modifying the ingested PyTorch program itself to use eval mode. This is suboptimal, especially since torch.export explicitly communicates whether batch_norm should run in training or eval mode for a given torch program.

This PR automates the selection of training/eval mode in the exported program translator and achieves correctness for eval mode.
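For context (not part of this PR's diff), here is a minimal sketch of how torch.export exposes the train/eval distinction that the translator can read, instead of requiring the user to edit the ingested program. The exact aten op name emitted for batch norm varies by PyTorch version, so the snippet just prints whatever batch-norm node the exported graph contains:

```python
import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.bn = nn.BatchNorm2d(4)

    def forward(self, x):
        return self.bn(x)

model = Model().eval()  # eval mode: running statistics are used, not updated
exported = torch.export.export(model, (torch.randn(1, 4, 8, 8),))

# The batch-norm node in the exported graph encodes eval vs. training
# (the exact op name/arguments depend on the PyTorch version), so a
# translator can dispatch on it rather than defaulting to training mode.
for node in exported.graph.nodes:
    if node.op == "call_function" and "batch_norm" in str(node.target):
        print(node.target, node.args)
```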

Future TODO: there is something wrong with batch_norm in training mode. It does not pass a correctness test even when taken straight from the main branch (there is an issue with tensor dimensions). I added a note to address this later, since training mode is probably not a high priority.

@hugolatendresse hugolatendresse changed the title [Relax] Fix batch norm ingestion [Relax] Batch norm correctness on eval mode Mar 16, 2025