[Frontend] Customizable RoPE theta#5197

Merged
simon-mo merged 4 commits into vllm-project:main from sasha0552:dynamic-theta
Jun 11, 2024
Conversation

@sasha0552
Contributor

Changing the RoPE theta can, in some cases, improve the performance of a scaled model. Since we already have an argument for RoPE scaling, it makes sense to add an argument for RoPE theta as well.

[image] (Source)
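As a sketch of why this matters: RoPE encodes positions by rotating pairs of dimensions at frequencies derived from a base value ("theta"), and raising theta lowers those frequencies, which interacts with context-length scaling. The snippet below is a hypothetical illustration (not vLLM's actual implementation; the function name `rope_inv_freq` is mine) of how theta changes the rotary frequencies, following the standard RoPE formulation theta ** (-2i / dim).

```python
# Hypothetical sketch, not vLLM code: how the RoPE base ("theta")
# affects rotary embedding frequencies.

def rope_inv_freq(dim: int, theta: float = 10000.0) -> list[float]:
    """Inverse frequency for each dimension pair, per the standard
    RoPE formulation: theta ** (-2i / dim)."""
    return [theta ** (-2 * i / dim) for i in range(dim // 2)]

base = rope_inv_freq(8, theta=10_000.0)
large = rope_inv_freq(8, theta=1_000_000.0)

# The first pair always rotates at frequency 1.0; with a larger theta,
# every other pair rotates more slowly, stretching the position signal
# over a longer effective context.
assert base[0] == large[0] == 1.0
assert all(l < b for b, l in zip(base[1:], large[1:]))
```

In vLLM, this value would presumably be set from the command line alongside the existing RoPE-scaling argument, overriding the theta baked into the model config.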

@sasha0552 sasha0552 mentioned this pull request Jun 9, 2024
@simon-mo (Collaborator) left a comment


Overall this looks good. Please fix the readability issue.

Review thread on vllm/transformers_utils/config.py (outdated)
@simon-mo simon-mo merged commit dcbf428 into vllm-project:main Jun 11, 2024
@sasha0552 sasha0552 deleted the dynamic-theta branch June 11, 2024 18:19
robertgshaw2-redhat pushed a commit to neuralmagic/nm-vllm that referenced this pull request Jun 12, 2024
joerunde pushed a commit to joerunde/vllm that referenced this pull request Jun 17, 2024
xjpang pushed a commit to xjpang/vllm that referenced this pull request Jun 27, 2024
xjpang pushed a commit to xjpang/vllm that referenced this pull request Jul 8, 2024
xjpang pushed a commit to xjpang/vllm that referenced this pull request Jul 24, 2024