Describe the feature
Make the new OpenAI-recommended `/v1/responses` endpoint available in the router to streamline interaction with the latest versions of vLLM. Is someone already working on this? If not, I'm happy to start working on it and submit a PR.
Why do you need this feature?
OpenAI recommends that new projects use the new `/v1/responses` endpoint. It is already available in vLLM, but not yet in the router.
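For context, the request shape differs from the existing `/v1/chat/completions` endpoint: the Responses API takes a top-level `input` instead of a `messages` list, so the router would need to accept and proxy this new shape. A minimal sketch of the two payloads (field names follow OpenAI's public API reference; the model name is a placeholder):

```python
# Existing endpoint the router already proxies: POST /v1/chat/completions
chat_payload = {
    "model": "my-model",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
}

# Requested endpoint: POST /v1/responses
# The Responses API accepts a plain string (or structured items) as "input".
responses_payload = {
    "model": "my-model",  # placeholder model name
    "input": "Hello",
}

print(sorted(responses_payload))
```

This is only an illustration of the payload difference, not a proposed router implementation.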
Additional context
No response