Speed up Download Spreadsheet from SearchKit results #26214
Conversation
(force-pushed 989466b to d02b388)
Related issue on the potential to speed up SearchKit in general by optimizing how we check access for the menu links (View / Edit / Delete), which is where a lot of the time is going.
I like the concept! Two thoughts about the implementation:
(force-pushed d02b388 to 0f8de2a)
I don't think [...] Yes, that makes more sense, to check for the [...]
OK, great. @larssandergreen, please just add a comment on l57 to explain that non-spreadsheet columns are being unset for performance reasons.
(force-pushed 0f8de2a to 6950579)
@colemanw thanks, comment added.
Thanks @larssandergreen for this PR, I'm sure everyone with big spreadsheets to download will appreciate it!
@larssandergreen I'm curious to know what tool you used to determine which part of the code was taking the most time.
Overview
Downloading a spreadsheet from SearchKit results is often slow with even a thousand rows; with a few thousand, timeouts are common, depending on your environment.
It turns out that almost all of the execution time was spent generating the menu column, which of course isn't included in the spreadsheet download, so it doesn't need to be generated at all. I suspect there is also work to be done to make generating that column faster in general, which would speed up on-screen SearchKit results too (not a big deal with 50 rows, but if you increase the page size it gets noticeably slower).
Related issue.
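The actual change lives in CiviCRM's PHP SearchKit code, which isn't shown in this PR page. As a rough illustration of the idea only, here is a minimal Python sketch (all names hypothetical, not CiviCRM's API): display-only columns such as the action menu are unset before the per-row rendering loop, so their expensive renderers are never invoked during a spreadsheet export.

```python
def expensive_access_check(row):
    # Stand-in for the slow per-row View/Edit/Delete permission checks
    # that make the menu column costly to render.
    return "View | Edit | Delete"

def render_rows(rows, columns):
    """Render each row by calling every column's (possibly expensive) renderer."""
    return [{col["name"]: col["render"](row) for col in columns} for row in rows]

def export_spreadsheet(rows, columns):
    # Keep only columns that actually appear in the spreadsheet; the menu
    # column is display-only, so its costly renderer is skipped entirely.
    spreadsheet_columns = [c for c in columns if c.get("type") != "menu"]
    return render_rows(rows, spreadsheet_columns)

# Hypothetical column definitions for demonstration.
columns = [
    {"name": "name", "type": "field", "render": lambda r: r["name"]},
    {"name": "menu", "type": "menu", "render": expensive_access_check},
]

result = export_spreadsheet([{"name": "Alice"}, {"name": "Bob"}], columns)
```

With thousands of rows, skipping the menu renderer removes one expensive access check per row, which is where this PR found most of the export time was going.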
Before
Slow! Timeouts! Can't download more than a couple thousand rows.
After
At least an order of magnitude faster, for simple rows. Timeouts much less likely.
Can download 10,000+ rows (while testing, I was hitting memory limits rather than timeouts).