
use skip_all_guards_unsafe to drop global_state and torch_function_mode_stack guards instead of previous hacks #36204

Merged
zou3519 merged 3 commits into vllm-project:main from laithsakka:drop_guards
Mar 16, 2026

Conversation

@laithsakka
Contributor

@laithsakka laithsakka commented Mar 6, 2026

Purpose

As the title says: replace the previous guard-disabling hacks with torch.compiler.skip_all_guards_unsafe.

Test Plan

tlparse with VLLM_USE_BYTECODE_HOOK=0
[Screenshot: tlparse output, 2026-03-05]

Signed-off-by: Laith Sakka <lsakka@meta.com>
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request refactors the guard handling mechanism in torch.compile wrappers. It replaces a manual, hacky approach of monkey-patching torch._C._dynamo.guards.GuardManager to disable GLOBAL_STATE and TORCH_FUNCTION_MODE_STACK guards. The new implementation leverages the public torch.compiler.skip_all_guards_unsafe API, which is a cleaner and more robust way to achieve the same goal of dropping all guards during compilation. This change simplifies the _compilation_context and makes the code more maintainable and less dependent on internal PyTorch APIs. The changes look correct and are a good improvement.

Contributor

@zhxchen17 zhxchen17 left a comment


@zou3519 zou3519 added the ready ONLY add when PR is ready to merge/full CI is needed label Mar 14, 2026
@zou3519 zou3519 enabled auto-merge (squash) March 14, 2026 02:21
@zou3519 zou3519 merged commit 52131f8 into vllm-project:main Mar 16, 2026
55 checks passed
@AndreasKaratzas
Collaborator

This PR is not ROCm compatible, as the ROCm build runs on torch 2.9. The regression is addressed in #37219.
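A minimal capability check for this situation might look like the following. The helper name is hypothetical (not from the PR or the fix); it simply probes whether the installed torch exposes the public API, which is the distinction the ROCm comment above is about.

```python
def torch_supports_skip_all_guards() -> bool:
    """Return True if the installed torch exposes the public
    torch.compiler.skip_all_guards_unsafe API; older releases
    (e.g. the torch 2.9 used by the ROCm build) do not."""
    try:
        import torch.compiler
    except ImportError:
        # No torch installed at all.
        return False
    return hasattr(torch.compiler, "skip_all_guards_unsafe")
```

A caller could branch on this check to keep a fallback path alive for older torch versions rather than failing at import time.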

wendyliu235 pushed a commit to wendyliu235/vllm-public that referenced this pull request Mar 18, 2026
…de_stack guards instead of previous hacks (vllm-project#36204)

Signed-off-by: Laith Sakka <lsakka@meta.com>
fxdawnn pushed a commit to fxdawnn/vllm that referenced this pull request Mar 19, 2026
…de_stack guards instead of previous hacks (vllm-project#36204)

Signed-off-by: Laith Sakka <lsakka@meta.com>

Labels

ready ONLY add when PR is ready to merge/full CI is needed


5 participants