feature: introducing batch tool #2983 (Open, +229 −39)
Summary
This PR adds a new tool to opencode: `batch`.
Explainer
Lately I've been trying to make parallel tool calling work reliably across different models using prompt engineering, but that has proven to be the wrong approach. Simply exposing a `batch` tool, combined with some light prompting (see the AGENTS.md in the 2nd commit), gives every model I use (gh/gpt-5, gh/sonnet-4.5, zai/glm-4.6) a dramatic increase in the likelihood of executing parallel tool calls. I still find myself having to say "use batch" from time to time, but overall it's been great.
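To illustrate the idea, here is a minimal sketch of what a batch tool can look like: it takes a list of sub-tool invocations and fans them out concurrently, so the model issues one tool call instead of several sequential ones. The tool names, registry shape, and function signatures below are illustrative assumptions, not opencode's actual API.

```typescript
// Hypothetical sketch of a "batch" tool (not opencode's real implementation).

type ToolFn = (args: Record<string, unknown>) => Promise<string>;

// Toy registry standing in for the real tool set.
const tools: Record<string, ToolFn> = {
  read: async (args) => `contents of ${args.path}`,
  grep: async (args) => `matches for ${args.pattern}`,
};

interface BatchCall {
  tool: string;
  args: Record<string, unknown>;
}

// The batch tool itself: run every sub-call concurrently with Promise.all,
// so the model pays one round trip instead of one per tool call.
async function batch(calls: BatchCall[]): Promise<string[]> {
  return Promise.all(
    calls.map(({ tool, args }) => {
      const fn = tools[tool];
      if (!fn) return Promise.resolve(`unknown tool: ${tool}`);
      return fn(args);
    }),
  );
}
```

With a shape like this, a model that wants to read a file and grep the codebase can emit a single `batch` call carrying both invocations, which is exactly the behavior the prompting alone failed to elicit reliably.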
Notes
There are some edits to be made to the descriptions of some other tools (to align the whole tool set with batch); this will be done in another PR.
Numbers
I haven't run rigorous benchmarks to measure the real efficiency gain from this PR, but after using it for a while I can see a substantial decrease in the average completion time of user requests. This PR is also likely beneficial for provider rate limiting (1 request instead of 3+ adds up over the course of a working day).