Throw better error for when running into k8s service discovery issue#18209
vllm-bot merged 1 commit into vllm-project:main from
Conversation
Signed-off-by: Will Eaton <weaton@redhat.com>
…llm-project#18209) Signed-off-by: Will Eaton <weaton@redhat.com> Signed-off-by: Yuqi Zhang <yuqizhang@google.com>
The `urllib.parse.urlparse()` function itself does not raise an exception while parsing a URL. However, certain attributes of the returned `ParseResult` object (e.g., `.port`) may raise a `ValueError` when accessed if they contain invalid values. Issue introduced in vllm-project#18209. FIX: vllm-project#18617. Signed-off-by: rabi <ramishra@redhat.com>
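A small sketch of the behavior described above: parsing succeeds even with a malformed port, and the `ValueError` only surfaces when `.port` is accessed, so any guard must wrap the attribute access, not the `urlparse()` call.

```python
from urllib.parse import urlparse

# Parsing itself succeeds, even though the "port" is not numeric.
result = urlparse("http://localhost:foo")

# The ValueError is raised lazily, when the .port property is read.
try:
    port = result.port
except ValueError:
    port = None  # handle the invalid port here, not around urlparse()
```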
Add a better error message if a user is likely running into the k8s service discovery issue: https://docs.vllm.ai/en/stable/serving/env_vars.html#environment-variables
I lost 20 minutes on this due to the default `int() -> ValueError` message; I think a better one would help.