Validation slows on high number of aliases #499
Comments
@s1na is this something that became slow recently? Was it more performant before? Are you repeating the same fragment 2000 times?
@pavelnikolov before
Do you think that we should be reusing validation? I haven't had the time to dig deeper into this issue; however, returning 2000 objects doesn't sound like it would be very performant to begin with. Are you displaying these on some UI, or are you processing them in the backend?
Sorry for getting back to you late on this. The problem isn't just that requesting 2000 objects is slow. The surprising part is that making one query for 2000 objects is much slower than 10 queries for 200 objects. I profiled our code while executing this query, and the profile seems to suggest the problem lies in

Edit: hmm, for some reason the svg file is not loading completely. I'll just upload the profile itself.
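For context on why one big query can be disproportionately slow, the spec's field-merging validation rule is quadratic when implemented naively: every selection in a set is compared against every other selection. The sketch below is illustrative only (the `field` type and `naiveOverlapCheck` are made-up names, not this library's actual code), but the arithmetic matches the reported behavior: one query with 2000 selections triggers ~2,000,000 pairwise comparisons, while ten queries of 200 selections trigger only ~199,000 in total.

```go
package main

import "fmt"

// field is a minimal stand-in for a parsed selection; the names here
// are hypothetical, not types from the library under discussion.
type field struct {
	alias string
	name  string
}

// naiveOverlapCheck mimics the quadratic shape of a naive
// "fields can be merged" validation pass: every selection is compared
// against every other one. It returns the number of comparisons made.
func naiveOverlapCheck(fields []field) int {
	comparisons := 0
	for i := 0; i < len(fields); i++ {
		for j := i + 1; j < len(fields); j++ {
			comparisons++
			// A real validator would check here that fields[i] and
			// fields[j] either have different response names or are
			// mergeable; we only count the comparison.
			_ = fields[i].alias == fields[j].alias
		}
	}
	return comparisons
}

func main() {
	// 2000 selections in one query: 2000*1999/2 comparisons.
	fmt.Println(naiveOverlapCheck(make([]field, 2000))) // → 1999000
	// 200 selections per page: 200*199/2 comparisons, ×10 pages ≈ 199000.
	fmt.Println(naiveOverlapCheck(make([]field, 200))) // → 19900
}
```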
As far as I can tell this is not an issue in the

I found this great explainer article which also shares a faster algorithm. Their algorithm has been merged into a few implementations (reference: sangria-graphql-org/sangria#12). It's not a small change, so I wanted to ask whether you'd consider deviating from the spec to implement this optimization. I don't have enough understanding of it to tell whether it improves the general case or only some cases.
Hey there! I'm using v1.3.0 of this library to query a high number (say 2000) of the same thing:

And noticed this becomes way slower than splitting the query into pages of 100 each. I narrowed the slowdown down to the first loop of `Validate`. I'd appreciate it if you could take a look and see why this is happening.