
Conversation

@DarkLight1337 (Member) commented Aug 22, 2025

This reverts commit 58eee5f.

Purpose

See #20000 (comment); this optimization is no longer necessary.

Test Plan

Test Result

(Optional) Documentation Update


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing test commands.
  • The test results, such as pasting a before/after comparison or e2e results.
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.

…ist-to-list conversion (vllm-project#20000)"

This reverts commit 58eee5f.

Signed-off-by: DarkLight1337 <[email protected]>
@DarkLight1337 DarkLight1337 changed the title Revert "[PERF] Use faster way of decode in tokenizer: avoid useless l… Revert "[PERF] Use faster way of decode in tokenizer: avoid useless list-to-list conversion (#20000)" Aug 22, 2025
@DarkLight1337 DarkLight1337 requested review from 22quinn and houseroad and removed request for houseroad August 22, 2025 03:00
@gemini-code-assist bot (Contributor) left a comment


Code Review

This pull request reverts a previous performance optimization in the decode_tokens function, restoring the code to its earlier state. I have one suggestion: refactor the updated function for consistency with other utility functions in the same file, which will aid future maintainability.

Comment on lines 52 to +56

```diff
         if skip_special_tokens is not None:
-            return decode_method(token_ids,
-                                 skip_special_tokens=skip_special_tokens)
+            return tokenizer.decode(token_ids,
+                                    skip_special_tokens=skip_special_tokens)

-        return decode_method(token_ids)
+        return tokenizer.decode(token_ids)
```

Severity: high

For better consistency with the encode_tokens function that follows, consider refactoring this to use a kwargs dictionary. This approach makes the function's structure similar to its counterpart, improving the overall maintainability and readability of the file.

Suggested change

```diff
-        if skip_special_tokens is not None:
-            return tokenizer.decode(token_ids,
-                                    skip_special_tokens=skip_special_tokens)
-        return tokenizer.decode(token_ids)
+        kwargs = {}
+        if skip_special_tokens is not None:
+            kwargs["skip_special_tokens"] = skip_special_tokens
+        return tokenizer.decode(token_ids, **kwargs)
```
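The kwargs pattern suggested above can be sketched in isolation. Note that `decode_tokens` here is written as a free-standing function and `FakeTokenizer` is an illustrative stand-in, not vLLM's actual tokenizer type; the point is only that an unset `skip_special_tokens` falls through to the tokenizer's own default rather than being forwarded as `None`.

```python
def decode_tokens(tokenizer, token_ids, skip_special_tokens=None):
    # Only forward skip_special_tokens when the caller set it explicitly,
    # so the tokenizer's own default applies otherwise.
    kwargs = {}
    if skip_special_tokens is not None:
        kwargs["skip_special_tokens"] = skip_special_tokens
    return tokenizer.decode(token_ids, **kwargs)


class FakeTokenizer:
    # Stand-in for demonstration: negative ids act as "special" tokens.
    def decode(self, token_ids, skip_special_tokens=True):
        toks = [f"<{i}>" if i < 0 else str(i) for i in token_ids]
        if skip_special_tokens:
            toks = [t for t in toks if not t.startswith("<")]
        return " ".join(toks)


print(decode_tokens(FakeTokenizer(), [1, -1, 2]))         # "1 2" (tokenizer default)
print(decode_tokens(FakeTokenizer(), [1, -1, 2], False))  # "1 <-1> 2"
```

Passing `skip_special_tokens=None` straight through would override the tokenizer's default with `None`, which is why the conditional (in either form) is needed.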

@Isotr0py Isotr0py enabled auto-merge (squash) August 23, 2025 02:28
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Aug 23, 2025
@22quinn (Collaborator) left a comment


LGTM

@Isotr0py Isotr0py merged commit b4e9fd8 into vllm-project:main Aug 23, 2025
49 checks passed
@DarkLight1337 DarkLight1337 deleted the revert-20000 branch August 23, 2025 04:18
epwalsh pushed a commit to epwalsh/vllm that referenced this pull request Aug 28, 2025
xiao-llm pushed a commit to xiao-llm/vllm that referenced this pull request Aug 28, 2025
zhewenl pushed a commit to zhewenl/vllm that referenced this pull request Aug 28, 2025
mengxingkongzhouhan pushed a commit to mengxingkongzhouhan/vllm that referenced this pull request Aug 30, 2025
zhewenl pushed a commit to zhewenl/vllm that referenced this pull request Sep 3, 2025
ekagra-ranjan pushed a commit to ekagra-ranjan/vllm that referenced this pull request Sep 4, 2025
…ist-to-list conversion (vllm-project#20000)" (vllm-project#23396)

Signed-off-by: DarkLight1337 <[email protected]>
Signed-off-by: Ekagra Ranjan <[email protected]>
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025

Labels

ready ONLY add when PR is ready to merge/full CI is needed
