[Bug]: index out of bound for logits_processors cause vllm.engine.async_llm_engine.AsyncEngineDeadError #6866

@FeiDeng

Description

Your current environment

Using vLLM with the Phi-3 model "microsoft/Phi-3-mini-128k-instruct".

🐛 Describe the bug

I think the issue is that a user can supply a `logit_bias` token id that exceeds the model's vocabulary size. The resulting `IndexError` then causes `vllm.engine.async_llm_engine.AsyncEngineDeadError`. The sample below does not fully describe the issue.
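The failure mode can be reproduced in isolation. This is a minimal sketch, not vLLM's actual implementation: plain Python lists stand in for torch tensors, but the indexing behavior is the same. The vocab size 32064 and token id 55434 are taken from the traceback below.

```python
# Stand-in for the logit_bias processor from vllm/entrypoints/openai/protocol.py.
# It applies each bias by direct indexing, with no bounds check.
def logit_bias_logits_processor(logit_bias, logits):
    for token_id, bias in logit_bias.items():
        logits[int(token_id)] += bias  # raises IndexError for out-of-range ids
    return logits

VOCAB_SIZE = 32064  # Phi-3-mini vocabulary size, per the traceback
logits = [0.0] * VOCAB_SIZE

try:
    # 55434 >= 32064, so this indexes past the end of the logits row.
    logit_bias_logits_processor({"55434": 100.0}, logits)
except IndexError as exc:
    print("IndexError:", exc)
```

Because the exception escapes the engine's step loop, the whole async engine is torn down rather than just the offending request.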

Please check this call stack:

```
File "/usr/local/lib/python3.10/dist-packages/vllm/model_executor/layers/logits_processor.py", line 59, in forward
    logits = _apply_logits_processors(logits, sampling_metadata)
File "/usr/local/lib/python3.10/dist-packages/vllm/model_executor/layers/logits_processor.py", line 116, in _apply_logits_processors
    logits_row = logits_processor(past_tokens_ids,
File "/usr/local/lib/python3.10/dist-packages/vllm/entrypoints/openai/protocol.py", line 245, in logit_bias_logits_processor
    logits[int(token_id)] += bias
IndexError: index 55434 is out of bounds for dimension 0 with size 32064

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/vllm/engine/async_llm_engine.py", line 54, in _log_task_completion
    raise AsyncEngineDeadError(
vllm.engine.async_llm_engine.AsyncEngineDeadError: Task finished unexpectedly. This should never happen! Please open an issue on Github. See stack trace above for the actual cause.
```

We may need better error handling here. This issue puts some containers into an error state.
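One possible direction, sketched below under my own assumptions rather than as the actual vLLM patch: validate `logit_bias` token ids against the model's vocabulary size at request-parsing time, so an invalid request can be rejected with a normal per-request error (e.g. a 400 response) instead of an unhandled `IndexError` killing the engine. The helper name `validate_logit_bias` is hypothetical.

```python
def validate_logit_bias(logit_bias, vocab_size):
    """Reject out-of-range logit_bias token ids up front (hypothetical helper).

    Raises ValueError, which the API server can map to a client error,
    instead of letting an IndexError surface inside the engine loop.
    """
    clean = {}
    for token_id, bias in logit_bias.items():
        tid = int(token_id)
        if not 0 <= tid < vocab_size:
            raise ValueError(
                f"logit_bias token id {tid} is out of range "
                f"for vocabulary of size {vocab_size}")
        clean[tid] = float(bias)
    return clean

# A valid request passes through normalized...
print(validate_logit_bias({"5": 2.0}, 32064))
# ...while the bad token id from this issue is rejected cleanly.
try:
    validate_logit_bias({"55434": 100.0}, 32064)
except ValueError as exc:
    print("rejected:", exc)
```

Failing fast at the protocol layer keeps the error scoped to the single offending request, which is the usual contract for user-supplied sampling parameters.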
