
util: limit util inspect depth to 10 #20627

Closed
wants to merge 3 commits

Conversation

BridgeAR
Member

@BridgeAR BridgeAR commented May 9, 2018

Due to the controversial change about having Infinity as the limit,
this changes the default to 10 instead. That should be a much
safer default and it should still provide the best experience for
most users. Besides that, I added an absolute limit of 1000. So
if a higher limit is passed through, it is just clamped to 1000.

Nevertheless, this is not going to prevent an application from
crashing if the passed-in object is too big. For this problem, there
is another PR open: #19994.

I actually also have an alternative implementation that allows
inspecting very deeply nested objects without hitting a maximum
call stack size exceeded error. I was, for example, able to inspect
the same linked list as used here in the test up to a depth of
98000. The problem is that this takes some time and can
exhaust the heap memory, so the application can still crash.
Since that should not be possible, having a strong limit seems
like the better approach (using an absolute limit there will not
work because we do not know the shape of the object). I can
nevertheless open that PR as an alternative, if requested.

I used the test from #20017. This PR is not semver-major because
it relies on a semver-major commit.
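As a rough illustration of the option handling described above (the names here are hypothetical, not Node's internal code), the new default and the clamp could look like this:

```javascript
// Hypothetical sketch of the depth handling this PR describes:
// default to 10, and clamp anything larger (including Infinity) to 1000.
const DEFAULT_DEPTH = 10;
const MAX_DEPTH_LIMIT = 1000;

function resolveDepth(depth) {
  if (depth === undefined) return DEFAULT_DEPTH;
  if (depth === null) depth = Infinity; // `depth: null` means "no limit"
  return Math.min(depth, MAX_DEPTH_LIMIT);
}
```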

Checklist
  • make -j4 test (UNIX), or vcbuild test (Windows) passes
  • tests and/or benchmarks are included
  • documentation is changed or added
  • commit message follows commit guidelines

BridgeAR and others added 3 commits May 9, 2018 15:27
Due to the controversial change about having Infinity as limit,
this is going to change it to 10 instead. That should be a much
safer default and it should still provide the best experience for
most users.

Nevertheless, this is not going to prevent an application from
crashing if the passed-in object is too big. For this problem, there
is another PR open: nodejs#19994.
Make sure that a long singly-linked list can be passed
to `util.inspect()` without causing a stack overflow.
This makes sure Infinity will only recurse up to 1000 times as it
could otherwise cause more issues.
@BridgeAR BridgeAR added the semver-major PRs that contain breaking changes and should be released in the next major version. label May 9, 2018
@nodejs-github-bot nodejs-github-bot added errors Issues and PRs related to JavaScript errors originated in Node.js core. util Issues and PRs related to the built-in util module. labels May 9, 2018
@@ -363,13 +363,13 @@ stream.write('With ES6');
<!-- YAML
added: v0.3.0
changes:
- version: REPLACEME
pr-url: https://github.com/nodejs/node/pull/REPLACEME
description: The `depth` default changed to 10.
Contributor

Nit: `10`?

Member

@addaleax addaleax left a comment

Still kinda think that we should revert first and work onward from there, but not gonna block this.

// Use a fake "error"
isStackOverflowError({ name: '', message: '' });
}
return maxStackLimit;
Member

I’m not sure, but I wouldn’t expect that stack overflows happen all at the same depths – some stack frames will necessarily take more space than others, so what this would give us is an upper bound rather than something that is necessarily usable.

Member Author

That is correct. That is also why I do not use it directly. In the end I use an absolute limit of 1000, which could be lowered if this function returns a low number. That is all I do with it. But I am perfectly fine with just always using 1000 as the limit and removing this again.

Member

That would be fine with me, yea. (But then again depth: null kinda sounds like opting into accepting stack overflow errors to me 😄)

Member Author

I also pointed out above that I have a working patch that allows inspecting really deeply nested objects without such errors. With that patch applied we could set an absolute depth limit of e.g. 10000 (anything above that is going to cost a lot of CPU time, and at some point the heap memory will not be enough). What I do is catch the error, bubble up to the top, start over with the part that caused the issue, and then assemble everything.
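A minimal sketch of that recover-and-restart idea (all names here are illustrative; this is not the actual patch): catch the RangeError that V8 throws on stack overflow and fall back to an iterative traversal that trades stack space for heap space.

```javascript
// Illustrative only: recover from a stack overflow during deep recursion
// by catching the RangeError and restarting with an iterative walk.
function makeList(length) {
  let head = null;
  for (let i = 0; i < length; i++) head = { next: head };
  return head;
}

function depthRecursive(node, depth = 0) {
  return node === null ? depth : depthRecursive(node.next, depth + 1);
}

function depthIterative(node) {
  let depth = 0;
  while (node !== null) { node = node.next; depth++; }
  return depth;
}

function safeDepth(node) {
  try {
    return depthRecursive(node);
  } catch (err) {
    // V8 reports "Maximum call stack size exceeded" as a RangeError.
    if (err instanceof RangeError) return depthIterative(node);
    throw err;
  }
}
```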

@mscdex
Contributor

mscdex commented May 9, 2018

Why are we capping the depth to 1000? Why not just change the default limit and remove the max limit completely? I don't think we should be silently stopping the user if they really want a larger depth.

@BridgeAR
Member Author

BridgeAR commented May 9, 2018

@mscdex because it is not possible to inspect any object at a deeper level. With the linked list, for example, as done in the test, the current limit is ~1500. If the inspected object is bigger, that limit is going to be even lower. I could recover and start over, as I pointed out, but that could exhaust the heap, and that is probably worse.

@mscdex
Contributor

mscdex commented May 9, 2018

I don't see what is wrong with letting the error happen if it becomes too deep. I think that is better than silently overriding what the user asked for.

@BridgeAR
Member Author

BridgeAR commented May 9, 2018

@mscdex I tend to disagree. But I think it would be fine to print a warning in such a case. In general: I am not sure what benefit it provides if it throws an error compared to returning a value that is most likely what the user wants.

@mscdex
Contributor

mscdex commented May 9, 2018

A warning would be good, but I fear that could get noisy very quickly.

Receiving an error is better because it lets the user know what they requested is not possible. I would not expect a function to return something I didn't ask for in a success case.

@mscdex
Contributor

mscdex commented May 9, 2018

My other concern is: what happens when the computed stack depth limit changes? We will have a stale magic number, and I don't think that is good to have. Who is going to actively maintain that as V8, compilers, etc. change?

// calls that are maximal necessary for each depth level. That does not seem
// to be sufficient though, so let us just divide it by 10 and use 1000 as upper
// limit.
const MAX_DEPTH_LIMIT = Math.min(Math.floor(getMaxStackLimit() / 10), 1000);
Member

This still seems too magical to me - no less magical than the depth: 2 default at least, but that's simpler.

@joyeecheung
Member

I agree with @mscdex , the problem with depth: Infinity as default is that processes might freeze when dealing with large objects, and the user may not be able to know why. Having a hard limit at 1000 does not solve that issue, and in certain cases could make it worse because the process might've been able to error out if it keeps recursing to reach the overflow. The test case in this PR (not crashing with depth: Infinity) may very well take a long time to complete without crashing - which is the issue that we really should fix instead.

Contributor

@mscdex mscdex left a comment

-1 on this as-is for reasons previously stated

@BridgeAR
Member Author

BridgeAR commented May 9, 2018

Having a hard limit at 1000 does not solve that issue, and in certain cases could make it worse because the process might've been able to error out if it keeps recursing to reach the overflow.

I am not sure I understand this correctly. Without the limit it would error if the depth is actually above 1500. So it would actually recurse longer.

The hard limit has nothing to do with the process freezing. Please treat that as a separate concern. I did that to prevent users from running into errors when they have a linked list (I cannot think of any other object in the wild that would use such a deeply nested structure). Nevertheless, I am going to remove the hard limit and open a separate PR for that.

It would be nice to get some feedback about whether people actually want to be able to inspect an even more deeply nested object than is possible right now. Because as I pointed out above: that is something I worked on as well.

@joyeecheung
Member

joyeecheung commented May 9, 2018

I am not sure I understand this correctly. Without the limit it would error if the depth is actually above 1500. So it would actually recurse longer.

If the object consists of multiple subtrees deeper than 1000, then it would error out in the first subtree instead of keeping going. The current limit seems to only favor the case where there is only one deep subtree in the object.
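The multiple-subtree case can be sketched like this (hypothetical names, illustrative only): a naive recursive walk with a hard depth cap throws while still inside the first deep branch, so later branches are never reached at all.

```javascript
// Illustration of the point above: with a hard depth cap, an object whose
// subtrees each exceed the cap errors out inside the first subtree visited.
function makeList(length) {
  let head = null;
  for (let i = 0; i < length; i++) head = { next: head };
  return head;
}

function walk(node, depth = 0, limit = 1000) {
  if (node === null || typeof node !== 'object') return depth;
  if (depth >= limit) throw new RangeError('depth limit exceeded');
  let max = depth;
  for (const key of Object.keys(node)) {
    max = Math.max(max, walk(node[key], depth + 1, limit));
  }
  return max;
}

const obj = { a: makeList(1200), b: makeList(1200) };
// walk(obj) throws while still traversing `a`; `b` is never visited.
```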

I did that to prevent users from running into errors when they have a linked list (I cannot think of any other object in the wild that would use such a deeply nested structure).

In my experience, SSR applications run into this if they somehow throw a virtual DOM tree into util.inspect in some error-handling code. There are also CMS applications that just involve complicated data structures representing nested content. Parsing HTML or other kinds of code could result in such structures as well.

@BridgeAR
Member Author

I am closing this for now and I am going to work on the computation time first.

@BridgeAR BridgeAR closed this May 11, 2018
@BridgeAR BridgeAR deleted the fix-linked-list branch January 20, 2020 11:31