Merged
**Why**: The big UNION query takes so long it gets serialization errors in prod, so this reworks it to do more logic in Ruby and stream results
zachmargolis
commented
Nov 18, 2021
```ruby
iaa_end_date: iaa_range.end.to_s,
total_auth_count: auth_count,
unique_users: unique_users.count,
new_unique_users: new_unique_users.count,
```
Contributor, Author
technically we do not currently need new_unique_users for the by-issuer report... however my thinking is maybe it's just easier to leave in here? I checked and our upcoming ticket LG-4975 does not need this either, so it's dead code.
Happy to remove it also and put it back if we need it later
stevegsa
approved these changes
Nov 18, 2021
stevegsa
left a comment
Excellent. Out of the box thinking....
Problem: We're observing queries that take so long in prod that they get Postgres serialization errors (basically, by the time the query finishes running, the results are too out of date relative to other concurrent updates).
The proposed solution has two parts:

1. Move the new-unique calculation into Ruby memory. My thinking is that one of the reasons the queries take so long is that they have to write a lot of data out to temporary tables to calculate new uniques, so doing that work in Ruby should put less strain on the DB.
2. With the heavy calculation gone, the queries themselves should be able to simply stream results instead of waiting for the entire result set to be ready.
I tried this out in a Rails console in prod. Before, about 50% of runs would get the PG serialization error; with this change, my 1/1 attempt completed with no error.
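The "calculate new uniques in Ruby memory" idea can be sketched roughly like this. This is a hypothetical illustration, not the PR's actual code: the `summarize` method name, the row shape, and the `previously_seen_user_ids` set are all assumptions; the real report presumably streams rows from ActiveRecord rather than taking an array.

```ruby
require 'set'

# Hypothetical sketch: given an enumerable of { user_id:, issuer: } rows
# (e.g. streamed from the DB) and a Set of user IDs seen in prior periods,
# compute per-issuer totals and uniques in Ruby memory instead of asking
# Postgres to materialize them via temporary tables.
def summarize(rows, previously_seen_user_ids)
  totals = Hash.new(0)                            # issuer => auth count
  seen = Hash.new { |h, k| h[k] = Set.new }       # issuer => unique user IDs
  new_uniques = Hash.new { |h, k| h[k] = Set.new } # issuer => first-time user IDs

  rows.each do |row|
    issuer = row[:issuer]
    totals[issuer] += 1
    seen[issuer] << row[:user_id]
    unless previously_seen_user_ids.include?(row[:user_id])
      new_uniques[issuer] << row[:user_id]
    end
  end

  seen.keys.map do |issuer|
    {
      issuer: issuer,
      total_auth_count: totals[issuer],
      unique_users: seen[issuer].size,
      new_unique_users: new_uniques[issuer].size,
    }
  end
end
```

Because each row is consumed independently, the DB side can be a plain streaming read; the memory cost shifts to holding the user-ID sets in the Ruby process instead of in Postgres temp tables.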