Support defer
#288
Comments
@wincent Any info you can provide on how this will work and what it'll entail to set up on an existing GraphQL server? 👍
@devknoll: Internally, we have a "batch" endpoint for the incumbent version of GraphQL that knows how to understand this kind of graph of queries with possible interdependencies. It also understands that one of these dependent queries may need to reference the result of the query that is its dependency, and it knows how to orchestrate all of this and flush it back to the client in chunks.

When we open-sourced GraphQL, we did not include an equivalent to this batch endpoint because we wanted to keep the spec minimal, and we view the batch endpoint as a separate layer above GraphQL rather than part of GraphQL itself. Additionally, we wanted to take the opportunity to reset and revisit the assumptions that we'd accumulated over years of internal GraphQL use. We didn't want to bake too much in, in a way that would limit our options for implementing batch-like semantics, or others like streaming, prioritization, subscriptions and so on. Many of these ideas are being discussed on the GraphQL Slack channel and on GitHub if you want to learn more.

Anyway, in terms of what all this means for open-source Relay: the short-term plan is to enable use of deferred queries with the extra round trips managed on the client. The longer-term plan is to continue to work with GraphQL to flesh out the semantics and directives that would need to be in place for a more integrated approach to deferring data, one which wouldn't depend on client-side management of multiple round trips. We're being deliberately non-committal about this because we want to involve the community and don't want to commit prematurely to a course of action that could limit our options later on.

In the meantime, there is a mid-term possibility, which is that it's all just JavaScript: if you want to take the client-side batching logic and put a server-side layer in front of your GraphQL endpoint that gets rid of the round trips and does all the management server-side, that should be totally possible. But for now, the immediate step is to get the purely client-side prototype out the door, which is what I am working on.
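As a rough illustration of that short-term, client-managed approach (this is not Facebook's batch endpoint and not a Relay API), a minimal sketch could split an operation into one required query and several deferred ones, send the required query first, and flush each deferred result back as its extra round trip completes. The /graphql endpoint, the query strings, and the onChunk callback are assumptions:

```js
// Minimal sketch of client-managed deferral: the required query blocks the
// first render, the deferred queries are fetched in extra round trips and
// flushed back as they arrive. Endpoint and query text are placeholders.
async function fetchGraphQL(query, variables) {
  const response = await fetch('/graphql', {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({query, variables}),
  });
  return response.json();
}

async function fetchWithDefer({required, deferred}, variables, onChunk) {
  // Critical data first: nothing renders until this resolves.
  onChunk(await fetchGraphQL(required, variables));

  // Deferred data: independent queries, fired in parallel, each flushed
  // to the UI as soon as its own round trip completes.
  await Promise.all(
    deferred.map(query => fetchGraphQL(query, variables).then(onChunk))
  );
}
```

Dependent ("ref") queries, where a deferred query needs a value from the required query's payload, need the extra orchestration discussed further down the thread.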
Any reason why […]? That's of course the short-term plan described in the post above, but I'm perfectly happy to accept the overhead of extra queries for the deferred functionality to be available.
@anytimecoder: I'm guessing it only seems to work because the queries you're deferring are independent of one another. In the real batch implementation, we have the notion of dependent queries, such that you might have an initial query and a deferred query that can't even be sent until the first one has resolved, because it references a value from its result.
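To make that dependency concrete, here is a hypothetical pair of queries in the spirit of the description above; the schema and field names are invented, and the point is only that the second query cannot be constructed until the first one's payload supplies the id:

```js
// Hypothetical dependent pair: the second query references a value that only
// exists in the first query's result, so it cannot be sent up front.
const postQuery = `
  query PostByKeyword($keyword: String!) {
    postSearch(keyword: $keyword) { id title body }   # resolves the post id
  }
`;

const commentsQuery = `
  query CommentsForPost($postID: ID!) {               # $postID comes from the
    node(id: $postID) {                               # postQuery payload above
      ... on Post { comments { id text } }
    }
  }
`;
```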
@wincent you're right - I'm deferring independent queries and I don't care about the order they come back in, but I'm happy to comply with these restrictions to have the deferred functionality available.
What is the progress on this issue? Can we expect to see defer in the OS version any time soon? Thanks!
@A3gis Thanks for asking. We aren't actively working on this right now and probably won't be able to focus on it soon (see the Roadmap and Meeting Notes for the core team's current priorities). However, we're happy to support the community in building support for this in an open-source network layer. This could be a good candidate for adding to https://github.com/nodkz/react-relay-network-layer, for example (cc @nodkz).
Just released a new version of react-relay-network-layer.

Right now […], so I'll try to dig deeper into deferred fragments in the near future.
I have started to work on this as a purely client-side solution, as @wincent suggested in one of the posts above. For some use cases, simply activating […] runs into an invariant inside Relay. I managed to hack together a NetworkLayer that gets around this invariant by accessing […]. The other thing I don't entirely understand is the […].

As long as I just have […], what I would have expected is simply three queries: the required query plus one query per deferred fragment. Then, at least conceptually, all queries should work independently. Relay only needs to know that queries 2 and 3 are deferred, so it can render as soon as query 1 has arrived.

How come there is this distinction of ref queries? Is this normally handled by the server? How exactly can one use this batchCall information that is stored on the query? Would my suggested approach make sense for handling everything on the client side? Or, given some other direction, I would be happy to take a stab at this. @nodkz have you had any chance to look into this more?
Great questions. Deferred queries use "ref" queries to handle cases where a fragment is deferred at a point without a known ID. In the example you gave, it's possible for the value of […] not to be known until the query it references has resolved. To handle ref queries, you'd have to delay fetching them until the query they reference is fetched, extract the value from the parent, and use that as the input to the dependent query. Feel free to share a gist with what you have so far, and I'd be happy to comment with suggestions for handling the ref query case.

However, we have some good news on this issue: we've implemented a variant of our batch API in pure JavaScript as a layer over the standard GraphQL resolve function. The actual async function to resolve a query is passed as an argument, so it can be used by Relay on the client (with multiple round trips) or on the server to fetch multiple queries in one HTTP request (if you're using Node and GraphQL-JS). We'll be open-sourcing this soon as part of the new Relay core.
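A sketch of what that delay, extract, and substitute handling could look like on the client, reusing the generic fetchGraphQL helper from the earlier sketch; the result paths and names are invented, and this is not the batch API mentioned above:

```js
// Hypothetical ref-query handling: hold the dependent query back until its
// parent resolves, pull the referenced value out of the parent payload, and
// pass it in as the dependent query's variable.
function getAtPath(payload, path) {
  // e.g. path = ['data', 'post', 'id']
  return path.reduce((value, key) => (value == null ? value : value[key]), payload);
}

async function fetchRefQuery(parentResultPromise, refQuery, refPath, variableName) {
  const parentResult = await parentResultPromise;    // 1. wait for the parent query
  const refValue = getAtPath(parentResult, refPath); // 2. extract the referenced value
  if (refValue == null) {
    // The deferred fragment hangs off data that didn't come back; skip it.
    return null;
  }
  // 3. use the extracted value as the input to the dependent query.
  return fetchGraphQL(refQuery, {[variableName]: refValue});
}
```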
@josephsavona Yes, at the very least for understanding's sake it would be great if you could comment on my little hack: https://gist.github.com/Globegitter/553f7dffd1f7ceaead1aba0dd56c5554 Even though we are not using JS on the backend, this is great news: either we can use it on the client side or just port it ourselves to Python. And would this batch API, as-is, take care of deferred queries?
I'm going through and cleaning out old issues that were filed against Relay 1, so I'll close this one. As @josephsavona mentioned, we have the primitives in place to straightforwardly implement a defer-like feature in the new core.
@wincent @josephsavona @devknoll does Relay 2 officially support the 'short-term' defer approach described above? I have a similar use case to your "post and comments" example. At the moment we are using Relay 1 and I'm handling the two queries manually. Like you said, the comments query depends on the post query because we don't know the post id until the post query is resolved by the server (we only know keywords initially). This is my demo code:

[…]

As you can see, the […].

My questions are: […]

(I know it's awkward to explicitly pass the post id this way.)
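For concreteness, here is a hypothetical version of the kind of manual two-step handling described above, reusing the fetchGraphQL helper from the first sketch in this thread; the schema fields and the render callbacks are invented. The post is resolved from the keywords first, and only then can the comments query go out with the post id passed explicitly:

```js
// Hypothetical manual chaining of the two queries: keywords -> post -> comments.
// renderPost/renderComments stand in for whatever updates the UI.
const renderPost = post => console.log('post ready', post);
const renderComments = comments => console.log('comments ready', comments);

async function loadPostAndComments(keywords) {
  // Step 1: required data; we only know the keywords up front.
  const postResult = await fetchGraphQL(
    `query ($keywords: String!) { postSearch(keywords: $keywords) { id title body } }`,
    {keywords}
  );
  const post = postResult.data.postSearch;
  renderPost(post); // the critical section renders as soon as the post arrives

  // Step 2: the comments query depends on the post id resolved above,
  // hence the awkward explicit pass-through.
  const commentsResult = await fetchGraphQL(
    `query ($postID: ID!) { node(id: $postID) { ... on Post { comments { id text } } } }`,
    {postID: post.id}
  );
  renderComments(commentsResult.data.node.comments);
}
```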
The initial release of the new core and API will not include defer. Having said that, the core primitives are in place to make it possible to add this, so it's something that we look forward to doing after the release. Clearly there is interest in a convenient abstraction over this usage pattern and other similar ones around streaming data, optional data, etc.
Thanks @wincent. Is there a way for the client to know when deferred queries are resolved (like […])?
Given that there is no support, no. When the feature is actually built, we'll have some mechanism in place for this. For now, the way to fake it is to either render a new root container after the initial fetch and render (triggered by calling […]) or […].
In practice, we don't recommend implementing […].
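In the absence of built-in support, one way the "render a new root container after the initial fetch" idea could be wired up in Relay Classic is sketched below. Everything here (schema, routes, components) is invented for illustration, and this simply trades the deferred payload for an explicit second round trip:

```js
const React = require('react');
const Relay = require('react-relay'); // Relay Classic

// Route for the critical post data (hypothetical schema).
class PostRoute extends Relay.Route {
  static routeName = 'PostRoute';
  static paramDefinitions = {postID: {required: true}};
  static queries = {post: () => Relay.QL`query { post(id: $postID) }`};
}

// Separate route for the non-critical comments data.
class CommentsRoute extends Relay.Route {
  static routeName = 'CommentsRoute';
  static paramDefinitions = {postID: {required: true}};
  static queries = {post: () => Relay.QL`query { post(id: $postID) }`};
}

const PostContainer = Relay.createContainer(
  ({post}) => <article><h1>{post.title}</h1><p>{post.body}</p></article>,
  {fragments: {post: () => Relay.QL`fragment on Post { title, body }`}}
);

const CommentsContainer = Relay.createContainer(
  ({post}) => <ul>{post.comments.map(c => <li key={c.id}>{c.text}</li>)}</ul>,
  {fragments: {post: () => Relay.QL`fragment on Post { comments { id, text } }`}}
);

// The outer root container fetches only the critical post data; once that has
// rendered, a second root container is mounted to fetch the comments, which
// stands in for a "deferred" section at the cost of an extra round trip.
function PostWithFakeDefer({postID}) {
  return (
    <Relay.RootContainer
      Component={PostContainer}
      route={new PostRoute({postID})}
      renderFetched={data => (
        <div>
          <PostContainer {...data} />
          <Relay.RootContainer
            Component={CommentsContainer}
            route={new CommentsRoute({postID})}
          />
        </div>
      )}
    />
  );
}
```

Whether the second container is mounted from renderFetched as here, or from some other signal that the first fetch has completed, the effect is the same: two sequential requests instead of one deferred payload.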
Thanks @wincent, that helps me a lot!
In our public talks we've discussed Relay's ability to divide your queries into required and deferred parts, wherein your component won't get rendered until all of the required data that it requested is available, and it will get rendered again subsequently once the deferred data has arrived. There are a bunch of use cases for this, but the typical pattern is dividing your UI into a cheap, critical section that must be displayed fast before you can declare TTI, and a potentially more expensive but less crucial section which you're happy to display later on (example: a post [required] and its attached comments section [deferred]).
So we've talked about it, it works internally at FB, and you can see traces of support for it in the code (such as this), but it's not currently exposed externally. This issue is for tracking progress towards getting this ported and released in some form.
(Extracted per comment here.)
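For context on what those traces of support in the code might look like in use, here is a hypothetical sketch in the Relay Classic style. The defer() call on a composed fragment reference is one of the hooks hinted at in the source but was never officially supported in the open-source release, and the schema and components below are invented, so treat this as illustrative only:

```js
const React = require('react');
const Relay = require('react-relay'); // Relay Classic

// Hypothetical non-critical child container for the comments section.
const CommentsSection = Relay.createContainer(
  ({post}) => <ul>{post.comments.map(c => <li key={c.id}>{c.text}</li>)}</ul>,
  {fragments: {post: () => Relay.QL`fragment on Post { comments { id, text } }`}}
);

// Hypothetical parent container: title and body are the cheap, critical part;
// the comments fragment is marked deferred via the unreleased defer() hook on
// fragment references, so its data could arrive in a later chunk or round trip.
const PostView = Relay.createContainer(
  ({post}) => (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
      <CommentsSection post={post} />
    </article>
  ),
  {
    fragments: {
      post: () => Relay.QL`
        fragment on Post {
          title
          body
          ${CommentsSection.getFragment('post').defer()}
        }
      `,
    },
  }
);
```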