
[Bug] Fix the OOM condition for CPU cache#260

Merged
zhuohan123 merged 1 commit into main from fix-oom-condition on Jun 26, 2023

Conversation

@zhuohan123 (Member) commented Jun 26, 2023

CPU cache size can be 0 since recomputation does not need to swap the memory to the CPU.
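A hedged sketch of the kind of validation this fix loosens (the function and message text below are illustrative, not the actual vLLM code): before the change, a check requiring the CPU cache size to be strictly positive would raise an OOM error even though a CPU cache of 0 blocks is legitimate when preempted sequences are recomputed instead of being swapped out to CPU memory.

```python
def check_cache_sizes(num_gpu_blocks: int, num_cpu_blocks: int) -> None:
    """Validate KV-cache sizes produced by memory profiling.

    Illustrative sketch: the GPU cache must hold at least one block,
    but the CPU swap cache may be empty, since recomputation-based
    preemption never swaps KV blocks to the CPU.
    """
    if num_gpu_blocks <= 0:
        # Not enough free GPU memory for even a single KV-cache block.
        raise ValueError(
            "No available memory for the GPU KV cache. Try increasing "
            "GPU memory utilization or decreasing the max model length.")
    if num_cpu_blocks < 0:
        # 0 is allowed: recomputation does not need CPU swap space.
        raise ValueError("CPU swap space must be non-negative.")
```

With this condition, `check_cache_sizes(128, 0)` passes, whereas a pre-fix style check of `num_cpu_blocks <= 0` would have raised spuriously.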

@zhuohan123 zhuohan123 requested a review from WoosukKwon June 26, 2023 14:42
@WoosukKwon (Collaborator) left a comment


LGTM!

@zhuohan123 zhuohan123 merged commit 0b7db41 into main Jun 26, 2023
@zhuohan123 zhuohan123 deleted the fix-oom-condition branch June 29, 2023 17:25
michaelfeil pushed a commit to michaelfeil/vllm that referenced this pull request Jul 1, 2023
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
mht-sharma pushed a commit to mht-sharma/vllm that referenced this pull request Dec 9, 2024 ("…aotriton so that the base image won't have it and we can rebuild torch (vllm-project#260)")
cursor Bot pushed a commit to Shirley125/vllm_epd that referenced this pull request Jan 22, 2026

2 participants