Unusually high CPU use from beam #1062
Comments
One point: one of the modules it is complaining about is basically three macros. beam keeps running even if I turn off build on save.
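For reference, build-on-save can be turned off through the ElixirLS extension settings in VS Code. A minimal sketch of `settings.json`, assuming the setting name `elixirLS.autoBuild` (verify against the settings UI of your installed extension version):

```json
{
  // Assumed setting name; disables automatic builds on save.
  "elixirLS.autoBuild": false
}
```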
Please post a small repo that reproduces the issue.
Is there a way to stop VSCode from retrying endlessly? I think that was the most annoying part.
So did fixing it resolve your perf issue?
I'm not sure I understand what you mean here.
It appears to have. But if you look at the log (https://github.com/nhpip/stuff/blob/master/ElixirLS/elixir-ls.log), it just got stuck in an endless retry loop and there was no way to get out of it.
I can see a few crashes related to the OTP node name. Besides that, a lot of crashes related to macro expansion and invalid YAML. The YAML thing is something on your side. The macro expansion ones are tricky: it is not possible to expand macros when they are not yet compiled.
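To illustrate why expansion fails for not-yet-compiled macros: a macro can only be expanded once its defining module has been compiled and loaded, which is why `require` must force that ordering. A minimal sketch (the module and macro names are hypothetical):

```elixir
defmodule MyMacros do
  # A trivial macro: run the block only when the value is not nil.
  defmacro unless_nil(value, do: block) do
    quote do
      if unquote(value) != nil, do: unquote(block)
    end
  end
end

defmodule Consumer do
  # `require` ensures MyMacros is compiled before this module,
  # so the macro call below can be expanded at compile time.
  require MyMacros

  def describe(x), do: MyMacros.unless_nil(x, do: "got #{inspect(x)}")
end

IO.inspect(Consumer.describe(42))  # "got 42"
IO.inspect(Consumer.describe(nil)) # nil
```

Without the `require` (or with the macro defined in the same file after its use), the compiler has no compiled code to expand, which is the situation a language server hits when it tries to expand macros from modules that have not been built yet.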
Environment
Erlang: 26.2.1
Elixir: 1.16.0-otp-26
VsCode: 1.86.1
ElixirLS: v0.19.0
OS: macOS Ventura 13.5
Current behavior
When an auto-save build runs, beam CPU spikes to over 900%, rendering many tasks unusable. It was running fine until I upgraded VSCode (Erlang and Elixir were recently upgraded too).
Looking at the logs, it appears to be stuck in an endless loop.
One of the files in the logs is basically one big macro, but it compiles fine when using Mix. I ran against Observer.
You will find observer screen shots and ElixirLS logs here:
https://github.com/nhpip/stuff/tree/master/ElixirLS
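For anyone wanting to reproduce the Observer measurements above: Observer can be launched from an IEx session attached to the project (a minimal sketch; requires an Erlang install built with wx support):

```elixir
# Start a shell with the project loaded:
#   $ iex -S mix
# Then launch the Observer GUI to watch process CPU/memory:
:observer.start()
```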
Expected behavior
Not making my CPU fan sound like it's about to take off.