Conversation
please fix the formatting
WoosukKwon left a comment
This PR breaks the support for Phi-2, which uses layer_norm_epsilon instead of layer_norm_eps.
does using
Hi @Aakash-Kaushik, can we do something like:

    eps = getattr(config, "layer_norm_eps", None)
    if eps is None:
        eps = getattr(config, "layer_norm_epsilon", 1e-6)
hey, i have done something like this, but instead of the last assumption I left it to fail, since assuming an epsilon when it is missing is not a great idea. My code looks like:

    layer_norm_eps = getattr(PretrainedConfig, "layer_norm_eps", None)
    if layer_norm_eps is None:
        layer_norm_eps = PretrainedConfig.layer_norm_epsilon

I am struggling with the format.sh script; if someone can format the code, I can push it.
I don't think this works?
ok, I have a fix, but now it is running into:

This is #2422 and I will push to this PR to fix it. Looks like Phi changed things...
@simon-mo The Phi 2 and Phi-1.5 models were recently (after this PR) updated to be compatible with HF transformers. Now we need to update the model code. I can do this. |
I need to fix this to get CI working. Got it working now, running some tests and will ask for review. |
I fixed the variable names, but I am currently facing a weight-naming mismatch (Phi renamed some weights). I will skip this test in CI first and come back to it.
Superseded by #2428 |
Was this resolved? I am also getting the same error when running this: |
#2428 was merged to the main branch; there has been no release after that. You can use that branch with the fix.
Closes #2422
Facing the error in:
vllm-openai docker image 0.2.7
It is in fact layer_norm_eps, even in the Phi-1.5 config on Hugging Face.