Memory leak within rayon_core::registry::ThreadSpawn
#870
Comments
I suspect this is simply due to the fact that we never shut down the global thread pool, by design. See also #688. That "possibly lost" record is just an allocation for thread-local storage, and I suspect the rest will be related to the thread pool's work queues. It seems valgrind is already filtering out the thread stacks, because those would also be 2 MB each by default. All of that should be bounded, though. If you find some memory use that increases over time, that may be evidence of a real leak.
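For anyone who wants those records gone rather than merely bounded, a minimal sketch (assuming rayon 1.x; not taken from this thread) is to run the work on an explicitly built pool. When the `ThreadPool` is dropped, its worker threads are asked to terminate, so their allocations should not survive to process exit the way the global pool's do:

```rust
use rayon::prelude::*;

fn main() {
    // Build a local pool instead of touching the implicit global one.
    let pool = rayon::ThreadPoolBuilder::new()
        .num_threads(4)
        .build()
        .expect("failed to build thread pool");

    // All parallel work inside `install` runs on `pool`'s workers.
    let sum: u64 = pool.install(|| (0..1_000_000u64).into_par_iter().sum());
    println!("{}", sum);

    // `pool` is dropped here; its threads shut down once idle, which should
    // keep their queues and TLS out of a leak report taken at process exit.
}
```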
The leak size will not grow; the lost memory size is always the same.

I guess you are right: if we create some thread in Rust and don't join it, there will also be this kind of memory leak:

```rust
fn run_thread() {
    use std::thread;

    let builder = thread::Builder::new();
    let handler = builder
        .spawn(|| {
            // thread code
            println!("hello");
        })
        .unwrap();

    //handler.join().unwrap();
}

fn main() {
    run_thread();
}
```

The result is:
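Whatever the exact numbers in that output, the interesting line is the commented-out join. A sketch of the same program with the join re-enabled (my variant, not part of the original comment): once the thread is joined, glibc can reclaim its stack and thread-local storage before exit, which is precisely what a never-terminated global pool cannot do:

```rust
use std::thread;

fn run_thread() {
    let handler = thread::Builder::new()
        .spawn(|| {
            println!("hello");
        })
        .unwrap();

    // Waiting for the thread lets its stack and thread-local storage be
    // reclaimed before the process exits, so the corresponding "possibly
    // lost" record should disappear (exact behavior can vary with the
    // glibc and valgrind versions in use).
    handler.join().unwrap();
}

fn main() {
    run_thread();
}
```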
Ok -- you'd have to talk to glibc and/or valgrind folks about recognizing that particular allocation, as it's not under Rust's control, let alone rayon. Either way, I don't think there's anything to be concerned about here, so I'm going to close. Feel free to let us know if you find something else!
Hello,
I'm maintaining the https://github.com/wasmerio/wasmer project. Two of our users have reported memory leaks that seem to come from Rayon. I'm quoting the example from @chenyukang at wasmerio/wasmer#2404 (comment), which doesn't involve Wasmer at all; it's purely Rayon, and it illustrates the memory leak:
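As a self-contained stand-in (a hypothetical minimal example, not the exact snippet from that comment), any Rayon-only program that touches the global pool should reproduce the same kind of report when run under `valgrind --leak-check=full`:

```rust
use rayon::prelude::*;

fn main() {
    // Hypothetical stand-in, not the snippet quoted from wasmerio/wasmer#2404:
    // any use of the global pool spawns worker threads whose queues and TLS
    // live until process exit, which Valgrind reports as "possibly lost".
    let total: u64 = (0..10_000u64).into_par_iter().map(|i| i * 2).sum();
    println!("{}", total);
}
```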
Here is the Valgrind report:
Thoughts?