
[torch.compile] caching of config fields should be opt-out by default#26468

Merged
vllm-bot merged 42 commits into vllm-project:main from vnadathur:envhashing
Nov 19, 2025

Conversation

@vnadathur
Contributor

@vnadathur vnadathur commented Oct 9, 2025

Referring: #23107

Implements opt-out system of hashing, suggested by #16501

def compute_hash(self):
   # Opt-out: hash all fields by default, then drop ignored field names (keys).
   factors = dict(self.__dict__)
   factors.pop("enforce_eager", None)
   factors.pop("tokenizer_config", None)
   ...
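The opt-out pattern above can be expanded into a small runnable sketch. This is a hypothetical illustration only: `ExampleConfig`, its fields, and `_ignored_factors` are placeholder names, not vLLM's actual config classes. The point is that every field contributes to the cache key by default, so newly added fields are hashed unless explicitly opted out.

```python
import hashlib
from dataclasses import dataclass


@dataclass
class ExampleConfig:
    # Placeholder fields standing in for real config options.
    model: str = "some-model"
    max_len: int = 2048
    enforce_eager: bool = False  # opted out: must not affect the cache key

    # Field names whose values should NOT influence the compilation cache key.
    _ignored_factors = {"enforce_eager"}

    def compute_hash(self) -> str:
        # Opt-out: start from ALL instance fields, then drop the ignored
        # ones, so any field added later is hashed by default.
        factors = {
            k: v for k, v in self.__dict__.items()
            if k not in self._ignored_factors
        }
        # Sort items so the hash is independent of dict insertion order.
        return hashlib.sha256(str(sorted(factors.items())).encode()).hexdigest()
```

With this sketch, flipping `enforce_eager` leaves the hash unchanged, while changing a hashed field such as `max_len` produces a different cache key.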

@github-actions

github-actions bot commented Oct 9, 2025

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only fastcheck CI runs, a small and essential subset of CI tests that quickly catches errors.

You can ask your reviewers to trigger select CI tests on top of fastcheck CI.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either: Add ready label to the PR or enable auto-merge.

If you have any questions, please reach out to us on Slack at https://slack.vllm.ai.

🚀

@ProExpertProg ProExpertProg linked an issue Oct 9, 2025 that may be closed by this pull request
Collaborator

@ProExpertProg ProExpertProg left a comment


This is much cleaner than I last remember! Nice work


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

vnadathur and others added 5 commits October 10, 2025 13:04
reposting this from old pr. implements an opt-out system of hashing for env vars.

Signed-off-by: vnadathur <glvikramn@gmail.com>
Signed-off-by: WorldExplored <srreyansh.sethi@gmail.com>

Signed-off-by: vnadathur <glvikramn@gmail.com>
Co-Authored-By: Srreyansh Sethi <107075589+worldexplored@users.noreply.github.com>
Co-Authored-By: vnadathur <236933696+vnadathur@users.noreply.github.com>
- Concatenated lazily.
- Refactored comments.
- Updated ignored factors.
- Logged errors.
- Persists if files don't already exist.

Srreyansh Sethi <srreyansh.sethi@gmail.com>

Co-Authored-By: vnadathur <236933696+vnadathur@users.noreply.github.com>
Signed-off-by: WorldExplored <srreyansh.sethi@gmail.com>
Added lazy logging util, called in backends.py

Signed-off-by: vnadathur <glvikramn@gmail.com>
Fixed config hashing and dataclass hashing

Signed-off-by: Srreyansh Sethi <srreyansh.sethi@gmail.com>
Co-Authored-By: vnadathur <236933696+vnadathur@users.noreply.github.com>
Signed-off-by: WorldExplored <srreyansh.sethi@gmail.com>
WorldExplored and others added 5 commits October 10, 2025 16:06
Signed-off-by: Srreyansh Sethi <107075589+WorldExplored@users.noreply.github.com>
Signed-off-by: vnadathur <glvikramn@gmail.com>
Signed-off-by: WorldExplored <srreyansh.sethi@gmail.com>
Co-Authored-By: vnadathur <236933696+vnadathur@users.noreply.github.com>
Signed-off-by: Srreyansh Sethi <107075589+WorldExplored@users.noreply.github.com>
Signed-off-by: Srreyansh Sethi <107075589+WorldExplored@users.noreply.github.com>
Collaborator

@zou3519 zou3519 left a comment


LGTM, just some minor nits

vnadathur and others added 2 commits November 18, 2025 14:22
Signed-off-by: vnadathur <glvikramn@gmail.com>
Co-Authored-By: Srreyansh Sethi <107075589+WorldExplored@users.noreply.github.com>
Signed-off-by: WorldExplored <srreyansh.sethi@gmail.com>
Co-Authored-By: vnadathur <236933696+vnadathur@users.noreply.github.com>
@ProExpertProg ProExpertProg enabled auto-merge (squash) November 18, 2025 22:41
Signed-off-by: vnadathur <glvikramn@gmail.com>
auto-merge was automatically disabled November 19, 2025 00:24

Head branch was pushed to by a user without write access

@ProExpertProg ProExpertProg enabled auto-merge (squash) November 19, 2025 01:19
@vllm-bot vllm-bot merged commit 1ffe934 into vllm-project:main Nov 19, 2025
126 of 131 checks passed
@github-project-automation github-project-automation bot moved this from In review to Done in torch.compile integration Nov 19, 2025
@vnadathur vnadathur deleted the envhashing branch November 19, 2025 21:45
factors = [env_hash, config_hash, code_hash, compiler_hash]
# Use SHA-256 for cache key hashing to be consistent across
# compute_hash functions. Truncate for a short cache dir name.
hash_key = hashlib.sha256(str(factors).encode()).hexdigest()[:10]
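A `hash_factors`-style helper, as the reviewer suggests, could centralize this logic so every call site hashes cache keys the same way. A hypothetical sketch (the name and signature are assumptions for illustration; vLLM's actual helper may differ):

```python
import hashlib


def hash_factors(factors: list, digest_len: int = 10) -> str:
    # SHA-256 over the stringified factors, consistent with the
    # compute_hash functions; truncated for a short cache dir name.
    return hashlib.sha256(str(factors).encode()).hexdigest()[:digest_len]


# The quoted snippet would then reduce to:
# hash_key = hash_factors([env_hash, config_hash, code_hash, compiler_hash])
```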
Member


Should we have used the hash_factors method here?

Contributor Author


Will add this in a follow-up PR: #29117

devpatelio pushed a commit to SumanthRH/vllm that referenced this pull request Nov 29, 2025
…vllm-project#26468)

Signed-off-by: vnadathur <glvikramn@gmail.com>
Signed-off-by: WorldExplored <srreyansh.sethi@gmail.com>
Signed-off-by: Srreyansh Sethi <srreyansh.sethi@gmail.com>
Signed-off-by: Srreyansh Sethi <107075589+WorldExplored@users.noreply.github.com>
Co-authored-by: WorldExplored <srreyansh.sethi@gmail.com>
Co-authored-by: Srreyansh Sethi <107075589+worldexplored@users.noreply.github.com>
Co-authored-by: vnadathur <236933696+vnadathur@users.noreply.github.com>
Co-authored-by: Luka Govedič <ProExpertProg@users.noreply.github.com>
kitaekatt pushed a commit to kitaekatt/vllm that referenced this pull request Dec 1, 2025
…vllm-project#26468)

Signed-off-by: vnadathur <glvikramn@gmail.com>
Signed-off-by: WorldExplored <srreyansh.sethi@gmail.com>
Signed-off-by: Srreyansh Sethi <srreyansh.sethi@gmail.com>
Signed-off-by: Srreyansh Sethi <107075589+WorldExplored@users.noreply.github.com>
Co-authored-by: WorldExplored <srreyansh.sethi@gmail.com>
Co-authored-by: Srreyansh Sethi <107075589+worldexplored@users.noreply.github.com>
Co-authored-by: vnadathur <236933696+vnadathur@users.noreply.github.com>
Co-authored-by: Luka Govedič <ProExpertProg@users.noreply.github.com>

Labels

- ready — ONLY add when PR is ready to merge/full CI is needed
- ready-run-all-tests — Trigger CI with all tests for wide-ranging PRs
- torch.compile

Projects

Status: Done

Development

Successfully merging this pull request may close these issues.

[RFC]: vLLM x torch.compile caching should be opt-out by default

8 participants