
Purge the query cache to prevent GC livelocks #7370

Merged
merged 3 commits into from
Sep 9, 2024
Conversation

teh-cmc (Member) commented Sep 7, 2024

When the GC needs to reclaim memory but has already exhausted all the data from the store, then purge the query cache directly.
The complete rationale for this patch is in #7369 (comment).
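The fix described above can be sketched roughly as follows. This is an illustrative, minimal model only (the `Store`, `QueryCache`, and `gc` names are hypothetical, not the actual rerun internals): the GC reclaims memory from the data store first, and once the store is exhausted, instead of spinning forever while the query cache keeps the memory alive, it purges the cache directly.

```rust
// Hypothetical sketch of the patch's behavior; names are illustrative.
struct Store {
    rows: Vec<Vec<u8>>, // oldest data first
}

struct QueryCache {
    entries: Vec<Vec<u8>>,
}

fn bytes(v: &[Vec<u8>]) -> usize {
    v.iter().map(|r| r.len()).sum()
}

/// Reclaim memory until total usage drops below `target_bytes`.
fn gc(store: &mut Store, cache: &mut QueryCache, target_bytes: usize) {
    while bytes(&store.rows) + bytes(&cache.entries) > target_bytes {
        if store.rows.is_empty() {
            // The store is exhausted: without this branch, the loop would
            // make no progress (the livelock), because the remaining memory
            // is held by the query cache. Purge the cache directly instead.
            cache.entries.clear();
            break;
        }
        // Normal path: drop the oldest row from the store.
        store.rows.remove(0);
    }
}
```

The key design point is the `store.rows.is_empty()` guard: before this patch, a GC pass that found nothing left to drop from the store would simply run again, never reducing memory usage below the limit.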

Before:
[screenshot]

After:
[screenshot]

Checklist

  • I have read and agree to the Contributor Guide and the Code of Conduct
  • I've included a screenshot or gif (if applicable)
  • I have tested the web demo (if applicable)
  • The PR title and labels are set such as to maximize their usefulness for the next release's CHANGELOG
  • If applicable, add a new check to the release checklist!
  • I have noted any breaking changes to the log API in CHANGELOG.md and the migration guide

To run all checks from main, comment on the PR with @rerun-bot full-check.

@teh-cmc added the 🪳 bug (Something isn't working), 🔍 re_query (affects re_query itself), 🚀 performance (Optimization, memory use, etc), and include in changelog labels Sep 7, 2024
@jleibs jleibs merged commit a150b3c into main Sep 9, 2024
34 checks passed
@jleibs jleibs deleted the cmc/gc_leak branch September 9, 2024 13:18
@jleibs jleibs added this to the Next patch release milestone Sep 9, 2024
@ricpruss

Sorry, this is not yet fixed in the latest nightly build for us.
[screenshot]
I can send another recording if you need. Possibly something else is breaking this again.

jleibs (Member) commented Sep 16, 2024

It looks like our nightly might have been red for a while in there -- just to double-check, could you confirm the SHA in the about panel of the viewer where you're seeing the issue?

e.g.:
[screenshot of the about panel]

@ricpruss

[screenshot]

Good news! We had rerun going at full power for 30 mins with a 4GB limit enabled. It hasn't really gone over 3.3GB. We're super happy, thanks!

Development

Successfully merging this pull request may close this issue: GC livelocks when temporal data is exhausted
4 participants