[ROCm] Fix AttributeError for torch.compiler.skip_all_guards_unsafe on older PyTorch#37219
Conversation
…n older PyTorch Signed-off-by: Andreas Karatzas <akaratza@amd.com>
Code Review
This pull request correctly addresses a compatibility issue with older PyTorch versions by adding a fallback for torch.compiler.skip_all_guards_unsafe. I've suggested a small improvement to make the implementation more concise and Pythonic by using getattr, which enhances code maintainability.
if hasattr(torch.compiler, "skip_all_guards_unsafe"):
    options["guard_filter_fn"] = torch.compiler.skip_all_guards_unsafe
else:
    options["guard_filter_fn"] = lambda x: [False for _ in x]
This if/else block can be simplified by using getattr with a default value. This approach is more concise and Pythonic for checking for an attribute and providing a fallback, which improves code readability and maintainability.
Suggested change:
-if hasattr(torch.compiler, "skip_all_guards_unsafe"):
-    options["guard_filter_fn"] = torch.compiler.skip_all_guards_unsafe
-else:
-    options["guard_filter_fn"] = lambda x: [False for _ in x]
+options["guard_filter_fn"] = getattr(
+    torch.compiler,
+    "skip_all_guards_unsafe",
+    lambda x: [False for _ in x])
I considered getattr but went with hasattr instead. It makes the version-dependent branching more explicit and easier to spot. Added comments to clarify the intent behind each path.
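To illustrate the hasattr-based fallback discussed above, here is a minimal, hedged sketch. The helper name `resolve_guard_filter` and the stand-in namespaces are hypothetical (not part of vLLM or PyTorch); the function accepts any module-like object so the snippet runs regardless of which PyTorch version is installed.

```python
# Hypothetical helper sketching the version-dependent branching from this PR.
# It takes any namespace standing in for torch.compiler, so the example is
# runnable without a specific PyTorch build.
from types import SimpleNamespace


def resolve_guard_filter(compiler_module):
    """Return a guard filter function for torch.compile options.

    On newer PyTorch (>= 2.10), use the real skip_all_guards_unsafe helper;
    on older versions, fall back to an equivalent lambda that rejects every
    guard entry.
    """
    if hasattr(compiler_module, "skip_all_guards_unsafe"):
        # Newer PyTorch: the attribute exists, use it directly.
        return compiler_module.skip_all_guards_unsafe
    # Older PyTorch: emulate the helper by returning False for each guard.
    return lambda x: [False for _ in x]


# Simulated "old" (no attribute) and "new" (attribute present) namespaces.
old_compiler = SimpleNamespace()
new_compiler = SimpleNamespace(skip_all_guards_unsafe=lambda x: [False for _ in x])

fallback = resolve_guard_filter(old_compiler)
print(fallback(["guard_a", "guard_b"]))  # → [False, False]
```

The explicit if/else makes it easy to grep for the version branch and to attach a comment to each path, which is the trade-off the author cites against the terser getattr form.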
gpolovets1 left a comment
Hi @AndreasKaratzas thanks for this fix! Would you be able to expedite merging, as it is breaking some downstream tests for TPU as well.
Thanks!
…n older PyTorch (vllm-project#37219) Signed-off-by: Andreas Karatzas <akaratza@amd.com>
- torch.compiler.skip_all_guards_unsafe doesn't exist in PyTorch < 2.10
- Added a hasattr check to fall back to the equivalent lambda x: [False for _ in x] on older versions
- Fixes AttributeError: module 'torch.compiler' has no attribute 'skip_all_guards_unsafe' when running any model inference

Test plan
pytest -s -v tests/test_regression.py::test_max_tokens_none
pytest -s -v tests/v1/entrypoints/llm/test_struct_output_generate.py

cc @kenroche
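As a sanity check on the review discussion, the snippet below verifies that the merged hasattr version and the getattr variant suggested in review select the same function in both the new- and old-PyTorch cases. The stand-in namespaces are hypothetical; only the selection logic mirrors the two code shapes from this PR.

```python
# Compare the two equivalent selection strategies from this PR's review
# thread, using SimpleNamespace stand-ins instead of a real torch.compiler.
from types import SimpleNamespace

FALLBACK = lambda x: [False for _ in x]


def via_hasattr(mod):
    # Shape of the merged code: explicit version-dependent branch.
    if hasattr(mod, "skip_all_guards_unsafe"):
        return mod.skip_all_guards_unsafe
    return FALLBACK


def via_getattr(mod):
    # Shape of the reviewer's suggestion: attribute lookup with a default.
    return getattr(mod, "skip_all_guards_unsafe", FALLBACK)


new = SimpleNamespace(skip_all_guards_unsafe=lambda x: [False for _ in x])
old = SimpleNamespace()

# Both strategies resolve to the same callable in either case.
assert via_hasattr(new) is via_getattr(new)
assert via_hasattr(old) is via_getattr(old) is FALLBACK
print("hasattr and getattr variants agree")
```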