[Frontend][Responses API] Function call streaming #1
madskildegaard wants to merge 4 commits into chaunceyjiang:func_call from
Conversation
Signed-off-by: madskildegaard <mkildegaard99@gmail.com>
👋 Hi! Thank you for contributing to the vLLM project. 💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels. Just a reminder: PRs do not trigger a full CI run by default; only a small subset of tests runs automatically, and you can ask your reviewers to trigger select CI tests on top of that. Once the PR is approved and ready to go, your PR reviewer(s) can run full CI to test the changes comprehensively before merging. If you have any questions, please reach out to us on Slack at https://slack.vllm.ai. 🚀
@chaunceyjiang Hey, I used your branch and found out I needed streaming support for function calls, so I tried implementing it on top of your branch. Feel free to take a look and suggest changes if you want.
Thank you, @madskildegaard, for your excellent work!
This pull request has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this pull request should remain open. Thank you!
This pull request has been automatically closed due to inactivity. Please feel free to reopen if you intend to continue working on it. Thank you!
Adds streaming support for function calling in the Responses API.
See the OpenAI Responses API function calling docs for the expected event shapes.
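To illustrate what a client consuming this feature would do, here is a minimal sketch of accumulating streamed function-call argument deltas into a complete call. It assumes the event types from OpenAI's Responses streaming protocol (`response.output_item.added`, `response.function_call_arguments.delta`, `response.function_call_arguments.done`); the plain dicts stand in for the SDK's typed event objects, and the event payloads shown are illustrative, not taken from this PR.

```python
import json


def accumulate_function_call(events):
    """Fold a stream of Responses API events into (name, parsed arguments).

    Each delta event carries a fragment of the JSON-encoded arguments;
    the `...done` event marks the end of the function call.
    """
    name = None
    fragments = []
    for event in events:
        if event["type"] == "response.output_item.added":
            item = event["item"]
            if item.get("type") == "function_call":
                name = item["name"]
        elif event["type"] == "response.function_call_arguments.delta":
            fragments.append(event["delta"])
        elif event["type"] == "response.function_call_arguments.done":
            break
    return name, json.loads("".join(fragments))


# A hypothetical stream, as a client might receive it chunk by chunk:
stream = [
    {"type": "response.output_item.added",
     "item": {"type": "function_call", "name": "get_weather"}},
    {"type": "response.function_call_arguments.delta", "delta": '{"city": '},
    {"type": "response.function_call_arguments.delta", "delta": '"Copenhagen"}'},
    {"type": "response.function_call_arguments.done"},
]
name, args = accumulate_function_call(stream)
# name == "get_weather", args == {"city": "Copenhagen"}
```

The key property being tested by a streaming implementation like this PR's is that the concatenated deltas always form valid JSON once the `done` event arrives, even though individual chunks are arbitrary fragments.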