Your current environment
How would you like to use vllm
I need to extend the context length of the gemma2-9b model, and also of other models like llama3.1-8b.
Can we do this with RoPE scaling? If so, how do I use the --rope-scaling and --rope-theta arguments? Also, do these models' configs have different things to consider for RoPE scaling? I need to extend the context up to 128k tokens.
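For reference, here is a minimal sketch of how these two flags are typically passed on the vLLM command line. The exact JSON keys accepted by --rope-scaling ("rope_type" vs. "type", "factor", "original_max_position_embeddings") depend on the vLLM/transformers version, so treat the values below as illustrative assumptions rather than a verified recipe:

```bash
# Sketch only: serve Llama-3-8B with YaRN-style RoPE scaling to reach
# ~128k tokens (8192 * 16). Depending on your vLLM release, the key may be
# "rope_type" or "type"; the factor and theta here are illustrative.
vllm serve NousResearch/Meta-Llama-3-8B-Instruct \
  --max-model-len 131072 \
  --rope-scaling '{"rope_type": "yarn", "factor": 16.0, "original_max_position_embeddings": 8192}' \
  --rope-theta 500000
```

Note that llama3.1-8b (as opposed to llama3-8b) already ships with a 128k context window and its own built-in "llama3" rope_scaling entry in config.json, so it may not need any override at all.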
$ cat models--google--gemma-2-9b-it/config.json
$ cat NousResearch--Meta-Llama-3-8B-Instruct/config.json
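For context, a rough sketch of the RoPE-relevant fields from those two configs (values from memory and only illustrative; verify against the actual config.json output of the commands above):

```json
{
  "gemma-2-9b-it":            { "max_position_embeddings": 8192, "rope_theta": 10000.0, "sliding_window": 4096 },
  "Meta-Llama-3-8B-Instruct": { "max_position_embeddings": 8192, "rope_theta": 500000.0, "rope_scaling": null }
}
```

If those defaults hold, both models natively cover 8192 positions, so reaching 128k implies a scaling factor around 16; Gemma-2 additionally interleaves sliding-window attention, which is a separate consideration from RoPE scaling.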
Before submitting a new issue...
Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.