[Doc][CI/Build] Update docs and tests to use vllm serve #6431

simon-mo merged 12 commits into vllm-project:main from DarkLight1337:vllm-serve-docs

Conversation
👋 Hi! Thank you for contributing to the vLLM project. A full CI run is still required to merge this PR, so once the PR is ready to go, please make sure to run it. If you need all test signals in between PR commits, you can trigger a full CI run as well.
I want to hold this off until the release, so that people visiting the nightly docs can directly use the CLI.
In the meantime, please feel free to start improving it!!! |
mgoin left a comment:
Does `--model` still work, or will it cause issues with `vllm serve`? I'm curious whether it is sufficient to replace `python -m vllm.entrypoints.openai.api_server` with `vllm serve`, or if it specifically replaces `python -m vllm.entrypoints.openai.api_server --model`.
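For readers following the thread, here is a minimal sketch of the difference, assuming (as this PR's docs describe) that `vllm serve` takes the model as a positional argument; the model name and extra flags below are purely illustrative:

```bash
# Old invocation: the model is passed via the --model flag.
python -m vllm.entrypoints.openai.api_server --model facebook/opt-125m --port 8000

# New invocation: vllm serve takes the model as a positional argument;
# the remaining engine/server flags are passed the same way as before.
vllm serve facebook/opt-125m --port 8000
```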
mgoin left a comment:
Thanks for the clarity, this is a nice PR to update usage. LGTM!
@simon-mo since
Follow-up to #5090. As a sanity check, I have also updated the entrypoints tests to use the new CLI (a sketch of the pattern those tests exercise follows below). After this, we can update the Docker images and performance benchmarks to use the new CLI as well.
cc @EthanqX
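For context, the entrypoints tests spawn the server as a subprocess and talk to it through the OpenAI client. Below is a minimal, self-contained sketch of that pattern using the new CLI; the model name, port, and readiness polling are illustrative, not the actual test fixtures, and it assumes the server exposes a `/health` endpoint:

```python
import subprocess
import sys
import time

import openai
import requests

# Illustrative values; the real tests use their own fixtures.
MODEL = "facebook/opt-125m"
PORT = 8000

# Launch the OpenAI-compatible server via the new CLI entrypoint
# instead of `python -m vllm.entrypoints.openai.api_server --model ...`.
server = subprocess.Popen(
    ["vllm", "serve", MODEL, "--port", str(PORT)],
    stdout=sys.stdout,
    stderr=sys.stderr,
)

# Poll the health endpoint until the server is ready (up to ~2 minutes).
for _ in range(120):
    try:
        if requests.get(f"http://localhost:{PORT}/health").status_code == 200:
            break
    except requests.ConnectionError:
        pass
    time.sleep(1)

# Exercise the API exactly as the old entrypoint was exercised.
client = openai.OpenAI(base_url=f"http://localhost:{PORT}/v1", api_key="EMPTY")
completion = client.completions.create(model=MODEL, prompt="Hello,", max_tokens=8)
print(completion.choices[0].text)

server.terminate()
```

The point of the sanity check is that only the launch command changes; everything downstream of the HTTP interface behaves the same.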