Defer and stream offer fantastic primitives for incremental delivery on a single chunked request, and I'm looking forward to using them. I also think the new wire protocol is well-crafted to enable pretty broad use case scenarios, which is what I wanted to bring up here.
I'm interested in GraphQL having better capabilities for common patterns like pagination without needing to rely on bespoke interfaces and per-application implementations at the schema level. I believe the incremental delivery protocol offers an interesting avenue toward them: GraphQL could support generic "resumable" operations.
I'm writing this up to see if there's broader interest in exploring this idea further -- if this seems promising, I can put together a more formal proposal and/or prototype a simple client/server implementation to demonstrate the concept.
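New directive: `@resumable`

Let's imagine a schema like the following. The original schema isn't reproduced here, so this is a hypothetical reconstruction inferred from the response examples below; the `ResumeToken` scalar and the exact `Post` fields are assumptions:

```graphql
# hypothetical schema, inferred from the example responses below
directive @resumable on FIELD_DEFINITION | FIELD

scalar ResumeToken

type Post {
  id: ID!
  text: String!
  categories: [String!]!
}

type Query {
  posts(category: String, resume: ResumeToken): [Post!] @resumable
}
```

With a corresponding query document that looks something like:

```graphql
query Posts($category: String) {
  posts(category: $category) @resumable {
    id
    text
    categories
  }
}
```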
The @resumable directive can be placed on any field definition so long as the field takes a resume: ResumeToken argument. It can also be placed on any query field so long as the corresponding field is marked as resumable.
When both client and server have opted into resumable operations, the server will return additional metadata in the response following a similar pattern to the pending response field on @defer / @stream responses:
```jsonc
// request wire protocol
{
  "variables": { "category": "Food" },
  "query": "..." // see query document above
}

// response wire protocol
{
  "data": {
    "posts": [
      { "id": "1", "text": "...", "categories": ["Food"] }
      // ... 19 more posts
    ]
  },
  "resumable": {
    "pending": [
      {
        "id": "0",
        "path": ["posts"],
        "token": "..." // opaque token returned by the server
      }
    ]
  }
}
```
This response indicates that there are additional results for the posts field that can be fetched by resuming the operation. If the server evaluates the query and does not find additional results, it returns a "complete" entry in resumable instead of pending.
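On the server, deciding between pending and complete can be as simple as checking whether results remain past the current page. A minimal sketch in Python; the resolver shape, page size, and in-memory data are assumptions for illustration, not part of the proposal:

```python
PAGE_SIZE = 20

def resolve_posts(all_posts, category, after_id=None):
    """Resolve up to PAGE_SIZE posts and report whether more remain.

    `after_id` is the decoded resume token: the id of the last post
    already delivered to the client (None on the first request).
    """
    matching = [p for p in all_posts if category in p["categories"]]
    start = 0
    if after_id is not None:
        # resume just past the last post the client has already seen
        ids = [p["id"] for p in matching]
        start = ids.index(after_id) + 1
    page = matching[start:start + PAGE_SIZE]
    if start + PAGE_SIZE < len(matching):
        # more results remain: hand back a token pointing at the last row
        resumable = {"pending": [{"id": "0", "path": ["posts"],
                                  "token": page[-1]["id"]}]}
    else:
        # result set exhausted: report completion instead
        resumable = {"complete": [{"id": "0", "path": ["posts"]}]}
    return page, resumable
```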
The token value is an opaque cursor that the server will use to determine how to resume the operation on a subsequent request. In the example above, the token might be the ID of the last post that was returned.
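However the token is derived, it should stay opaque on the wire so clients never depend on its contents. As an illustration (an assumption, not something the proposal prescribes), a server might base64-encode the field name together with the last-returned ID:

```python
import base64

def encode_token(field: str, last_id: str) -> str:
    # the token's structure is a server-side detail;
    # clients only ever see an opaque string
    return base64.urlsafe_b64encode(f"{field}:{last_id}".encode()).decode()

def decode_token(token: str) -> tuple[str, str]:
    # recover the field name and cursor on a subsequent request
    field, _, last_id = base64.urlsafe_b64decode(token).decode().partition(":")
    return field, last_id
```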
The client can then make a subsequent request to the server to resume the operation:
```jsonc
// request wire protocol
{
  // the client must pass the same variables and query as
  // the original request
  "variables": { "category": "Food" },
  "query": "...", // see query document above
  // pass one or more entries from the response's resumable array
  "resume": [
    {
      "id": "0",
      "path": ["posts"],
      "token": "<cursor>"
    }
  ]
}
```
The server will receive this request and execute resolvers as though it had been sent the equivalent resumed query.
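Concretely, the resumed query might look something like this (a hypothetical rendering, assuming the resume: ResumeToken argument described above):

```graphql
query Posts($category: String) {
  # same selection as the original document, with the cursor from the
  # resume token supplied as the resume argument
  posts(category: $category, resume: "<cursor>") @resumable {
    id
    text
    categories
  }
}
```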
However, the results of the resolver will be delivered in the incremental field of the response, mapped to the same path as the resumable field:
```jsonc
// response wire protocol
{
  // there is no "data" field in the response
  "incremental": [
    // the wire protocol from defer/stream is reused here
    {
      "id": "0",
      "label": "posts",
      "items": [/* up to 20 more posts */]
    }
  ],
  "resumable": {
    // assuming the server has reached the end of the result set,
    // it returns a "complete" entry in resumable instead of pending
    "complete": [
      {
        "id": "0",
        "path": ["posts"]
      }
    ]
  }
}
```
The client can use the result of this followup request to append new items to the end of the previous result set, following the same mechanics as incremental delivery from @defer / @stream.
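On the client, applying such a follow-up response reuses the same bookkeeping @defer / @stream already requires: walk to the recorded path and append the new items. A minimal sketch, assuming the JSON shapes shown above:

```python
def apply_incremental(data, pending, incremental):
    """Append streamed items into `data` at the paths recorded in `pending`."""
    paths = {entry["id"]: entry["path"] for entry in pending}
    for chunk in incremental:
        # walk to the parent object, then extend the list at the leaf key
        target = data
        *parents, leaf = paths[chunk["id"]]
        for key in parents:
            target = target[key]
        target[leaf].extend(chunk["items"])
    return data
```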
Not Just for Pagination
While the example above is probably the most common use case, the specific behavior of how to turn a resume cursor into incremental updates is left to the server implementation.
Let's imagine a very different scenario: a breaking news alerts system.
In this case, "resuming" the query doesn't mean fetching an additional page of existing results, but rather checking to see if there are any new alerts that have been created since the last time the client queried the server.
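A server might implement this by treating the token as a high-water mark: the createdAt of the newest alert the client has already seen. A sketch under that assumption (the helper and data shapes are illustrative, not part of the proposal):

```python
def resume_latest_alerts(all_alerts, last_seen_created_at):
    """Return alerts newer than the client's high-water mark, oldest first.

    The resume token is the createdAt of the newest alert already
    delivered; the next token advances to the newest alert returned.
    ISO-8601 UTC timestamps compare correctly as strings.
    """
    fresh = [a for a in all_alerts if a["createdAt"] > last_seen_created_at]
    fresh.sort(key=lambda a: a["createdAt"])
    next_token = fresh[-1]["createdAt"] if fresh else last_seen_created_at
    # always "pending": new alerts may arrive at any time
    resumable = {"pending": [{"id": "0", "path": ["latestAlerts"],
                              "token": next_token}]}
    return fresh, resumable
```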
For this use case, the resumable operation is never "complete", because there is always a chance that new alerts will be created. A subsequent response might look like:
```jsonc
{
  "incremental": [
    {
      "id": "0",
      "label": "latestAlerts",
      "prependItems": [
        {
          "id": "1",
          "message": "Earthquake in San Francisco",
          "createdAt": "2024-02-14T12:00:00Z"
        },
        {
          "id": "2",
          "message": "Severe weather in New York",
          "createdAt": "2024-02-14T12:01:00Z"
        }
      ]
    }
  ],
  "resumable": {
    "pending": [
      {
        "id": "0",
        "path": ["latestAlerts"],
        "token": "<cursor>"
      }
    ]
  }
}
```
You'll note that the incremental response includes a prependItems field. This is an addition to the @defer / @stream protocol to account for the fact that incremental data may modify existing data in more ways than just appending to the end of the list.
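Client handling of prependItems is a small extension of the append path. A sketch (prependItems is this proposal's addition, not part of today's @defer / @stream protocol):

```python
def apply_chunk(target_list, chunk):
    """Apply one incremental entry to the list it targets.

    `items` appends, as in @stream; `prependItems` inserts at the front,
    preserving the order of items within the chunk.
    """
    if "items" in chunk:
        target_list.extend(chunk["items"])
    if "prependItems" in chunk:
        target_list[:0] = chunk["prependItems"]
    return target_list
```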
Further Exploration Required
There's significantly more thought required to turn this into a full specification; however, I was surprised at how well this concept seemed to fit with the existing @defer / @stream primitives. While there's a fair amount of "client smarts" needed to do this properly, it seems pretty tractable since the actual updating of the data will be compatible with @defer/@stream.
I'd love to hear early reactions from this group to know if this is something worth my time to pursue further.
Thanks all!