[FIX_FOR_VLLM_LATEST] Fix the MultiModalKwargsItem to align with vllm changes #903
pawel-olejniczak wants to merge 27 commits into vllm-project:main
Conversation
🚧 CI Blocked: The main CI workflow was not started for the following reason:
✅ CI Passed: All checks passed successfully against the following vllm commit:
Force-pushed: b04a971 to a206336
Force-pushed: 8662fc1 to 82e3aa8
Force-pushed: 3c4cdc1 to 756c765
Force-pushed: 9ef6e39 to 5261273
Signed-off-by: Paweł Olejniczak <polejniczakx@habana.ai>
Force-pushed: fb778f3 to 44697ca
… fixes – batch no. 1 (vllm-project#988)

This PR contains part of fixes from vllm-project#903

---------
Signed-off-by: Paweł Olejniczak <polejniczakx@habana.ai>
Signed-off-by: Rohit kumar Singh <rksingh@habana.ai>
… fixes – batch no. 2 (#998)

This PR contains part of fixes from #903

Fixed issues:
- AttributeError: '_OpNamespace' '_moe_C' object has no attribute 'topk_softmax'
- AttributeError: 'HPUVocabParallelEmbeddingWithLoRA' object has no attribute 'quant_method'

---------
Signed-off-by: Paweł Olejniczak <polejniczakx@habana.ai>
Co-authored-by: Iryna Boiko <iryna.boiko@intel.com>
… fixes – batch no. 3 (#1053)

This PR contains part of fixes from #903

Fixed issues:
- AttributeError: 'FusedMoE' object has no attribute 'forward_impl'
- AttributeError: 'PatchedMixtralMoE' object has no attribute 'is_internal_router'
- RuntimeError: Overloaded torch operator invoked from Python failed to match any schema
- TypeError: HpuPlatform.get_attn_backend_cls() got an unexpected keyword argument 'num_heads'
- TypeError: Request.__init__() got an unexpected keyword argument 'eos_token_id'
- KeyError: 'model_type'
- AttributeError: 'FusedMoE' object has no attribute 'dp_size'. Did you mean: 'ep_size'?
- AttributeError: 'SharedFusedMoE' object has no attribute 'use_dp_chunking'
- AttributeError: 'SharedFusedMoE' object has no attribute 'use_pplx_kernels'
- AttributeError: 'SharedFusedMoE' object has no attribute 'dp_size'. Did you mean: 'ep_size'?
- TypeError: HpuDeepseekOCRDummyInputsBuilder.get_dummy_mm_data() got an unexpected keyword argument 'mm_processor_kwargs'

---------
Signed-off-by: Paweł Olejniczak <polejniczakx@habana.ai>
Co-authored-by: Iryna Boiko <iryna.boiko@intel.com>
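Most of the AttributeError and TypeError items above are symptoms of API drift: upstream vllm classes gained new attributes or keyword arguments that the plugin's subclasses were not written against. A minimal sketch of the general repair pattern is shown below; every class and attribute name here (`UpstreamMoE`, `PluginMoE`, `use_dp_chunking`, `ensure_attr`) is hypothetical and only illustrates the technique, not the actual vllm or plugin code.

```python
class UpstreamMoE:
    """Stand-in for a newer upstream layer that gained a dp_size attribute."""

    def __init__(self, ep_size: int, dp_size: int) -> None:
        self.ep_size = ep_size
        self.dp_size = dp_size


class PluginMoE(UpstreamMoE):
    """Stand-in for a plugin subclass written against the older upstream API."""

    def __init__(self, ep_size: int) -> None:
        # The old plugin code only knew ep_size; forwarding a default for the
        # new dp_size keyword keeps the subclass constructible after the
        # upstream signature change.
        super().__init__(ep_size=ep_size, dp_size=1)


def ensure_attr(obj: object, name: str, default: object) -> None:
    """Backfill an attribute that newer upstream code reads unconditionally."""
    if not hasattr(obj, name):
        setattr(obj, name, default)


layer = PluginMoE(ep_size=8)
# Backfilling avoids e.g. "'...MoE' object has no attribute 'use_dp_chunking'".
ensure_attr(layer, "use_dp_chunking", False)
```

The same two moves (forwarding new constructor keywords with safe defaults, and backfilling attributes the newer upstream code reads) cover the bulk of the error classes listed in batches 1 through 3.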
This PR contains fixes for upstream changes:
vllm-project/vllm#33331
vllm-project/vllm#32954
vllm-project/vllm#33362
vllm-project/vllm#33284
vllm-project/vllm#31541
vllm-project/vllm#33536