
Potential recurrence of #13309 #31900

Closed

josalem opened this issue Feb 7, 2020 · 10 comments

@josalem
Contributor

josalem commented Feb 7, 2020

I'm experiencing this issue with .NET Core 3.1.

This causes the Linux container to hit OOM. Running the app under the VS debugger shows the same issue.

Unfortunately, I cannot share my memory dump, but the summary is that my app uses the Azure Service Bus library. The library tries to connect to the topic you specify, and if the topic does not exist it keeps throwing exceptions indefinitely. This causes the AI to keep gobbling memory.

This also causes the container to fail to swap at release time because it runs out of memory.

Originally posted by @nurevaSaeed in #13309 (comment)
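The failure pattern described above can be summarized with a sketch like the one below. This is only an illustration of the reported shape of the problem, not the reporter's code: the connect method and the in-memory telemetry list are hypothetical stand-ins for the Azure Service Bus client and the "AI" telemetry pipeline.

```csharp
// Hypothetical sketch of the reported pattern: an endless retry loop against a
// topic that does not exist, with every failure buffered by a telemetry sink.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class RetryLoopSketch
{
    // Stand-in for a telemetry client that buffers exceptions in memory.
    static readonly List<Exception> PendingTelemetry = new List<Exception>();

    static async Task Main()
    {
        while (true)
        {
            try
            {
                // Stand-in for the Service Bus client connecting to a topic that
                // does not exist; every attempt throws.
                await ConnectToTopicAsync("missing-topic");
            }
            catch (InvalidOperationException ex)
            {
                // Each failed attempt is recorded; if the buffer grows faster than
                // it is flushed, memory climbs until the container is OOM-killed.
                PendingTelemetry.Add(ex);
            }

            await Task.Delay(100); // the client retries indefinitely
        }
    }

    static Task ConnectToTopicAsync(string topic) =>
        throw new InvalidOperationException($"Topic '{topic}' does not exist.");
}
```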

@Dotnet-GitSync-Bot added the untriaged (New issue has not been triaged by the area owner) label on Feb 7, 2020
@josalem
Contributor Author

josalem commented Feb 7, 2020

@nurevaSaeed, could you describe the scenario a little more? Specifically:

  • What applications (including your own) are running inside your container or when you are debugging with VS?
  • Are you tracing your application with dotnet-trace or another .NET diagnostics tool when this happens?
  • Are you using DiagnosticListener in your application? (A sketch of the usage pattern in question follows this comment.)
  • Are you using prometheus-net.DotNetRuntime in your app?
  • What specifically makes you think you're seeing a recurrence of #13309 ([2.2] Potential native memory leak from EventPipe), e.g., are you seeing a leak of buffers of the same size as described there?

Lastly, could you explain this a bit more:

This causes the AI to keep gobbling memory.

Are you talking about Application Insights?

If possible, a minimal repro of the behavior, some output from the !heap -stat command showing the leak, or some other concrete evidence of the issue would be very helpful in triaging this.
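To make the DiagnosticListener question above concrete, here is a minimal sketch of the kind of usage being asked about: a producer writing events through a named listener, and a consumer subscribing via AllListeners. The listener name and event payload are hypothetical; this illustrates the pattern, not code from the reporter's application.

```csharp
// Minimal DiagnosticListener producer/consumer sketch (hypothetical names).
using System;
using System.Collections.Generic;
using System.Diagnostics;

class DiagnosticListenerSketch
{
    // Producer side: a named listener the app writes events to.
    static readonly DiagnosticListener Listener = new DiagnosticListener("MyApp.ServiceBus");

    static void Main()
    {
        // Consumer side: observe every DiagnosticListener created in the process
        // and subscribe to the one we care about.
        DiagnosticListener.AllListeners.Subscribe(new ListenerObserver());

        // Emit an event only if someone is listening.
        if (Listener.IsEnabled("ConnectAttempt"))
            Listener.Write("ConnectAttempt", new { Topic = "missing-topic" });
    }

    sealed class ListenerObserver : IObserver<DiagnosticListener>
    {
        public void OnNext(DiagnosticListener listener)
        {
            if (listener.Name == "MyApp.ServiceBus")
                listener.Subscribe(new EventObserver());
        }
        public void OnCompleted() { }
        public void OnError(Exception error) { }
    }

    sealed class EventObserver : IObserver<KeyValuePair<string, object>>
    {
        public void OnNext(KeyValuePair<string, object> evt) =>
            Console.WriteLine($"{evt.Key}: {evt.Value}");
        public void OnCompleted() { }
        public void OnError(Exception error) { }
    }
}
```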

@nurevaSaeed

Sorry @josalem, I'd deleted my post before seeing your message. I'll follow up soon with the details you're asking for. For now, I don't think my issue applies to this bug anymore.

@tommcdon added the p1 label and removed the untriaged (New issue has not been triaged by the area owner) label on Mar 7, 2020
@tommcdon added this to the 5.0 milestone on Mar 27, 2020
@tommcdon
Member

@semyon2105 are you still experiencing the CPU issues mentioned in djluck/prometheus-net.DotNetRuntime#6?

@semyon2105

@tommcdon yes, I'm still seeing the same CPU increase in the repro using the latest .NET SDK Docker image.
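For context on how prometheus-net.DotNetRuntime hooks into the runtime's event infrastructure (which is why its CPU behavior is relevant to an EventPipe issue), a minimal wiring sketch follows. The builder API names are taken from that project's README as I recall them and should be treated as an assumption, not as a confirmed part of the repro.

```csharp
// Assumed wiring for prometheus-net.DotNetRuntime (sketch, not verified against
// the repro). Internally the collector listens to runtime EventSources, which
// ties its CPU/memory behavior to the runtime's eventing infrastructure.
using System;
using Prometheus.DotNetRuntime;

class MetricsBootstrap
{
    static void Main()
    {
        // Default() enables the standard set of runtime stat collectors;
        // StartCollecting() begins listening to runtime events.
        using IDisposable collector = DotNetRuntimeStatsBuilder.Default().StartCollecting();

        Console.WriteLine("Collecting runtime metrics; press Enter to stop.");
        Console.ReadLine();
    } // disposing the collector detaches the underlying event listeners
}
```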

@tommcdon
Member

@sywhang can you take a look?

@sywhang self-assigned this on Jul 28, 2020
@sywhang
Contributor

sywhang commented Jul 28, 2020

@semyon2105 What version of .NET Core 3.1 are you using?

@semyon2105

@sywhang I'm using version 3.1.6

@sywhang
Contributor

sywhang commented Jul 28, 2020

Thanks for confirming that. Just to make sure, is https://github.com/semyon2105/leak-repro still a valid repro for your issue?

@semyon2105

Yes, and I've just updated it to use the latest .NET image

@sywhang
Contributor

sywhang commented Jul 30, 2020

@semyon2105 Thanks. I tried running your repro and have some findings I'd like to share, but since they are unrelated to any runtime issues (including the one reported here), I'll post them in the issue you originally filed in the prometheus-net repo.

@josalem I believe the recurrence you saw here corresponds to the issue addressed by the mmap fix I made back in 3.1.5, so I'm going to close this issue.

@sywhang closed this as completed on Jul 30, 2020
@ghost locked this issue as resolved and limited conversation to collaborators on Dec 10, 2020