Attempt to fix v2 file overwrite vulnerability #213
Conversation
@@ -250,7 +251,20 @@ Parse.prototype._startEntry = function (c) {
  if (onend) entry.on("end", onend)

  if (entry.type === "File") {
I'm not sure whether it's customary in this project, but I'd love to see a few comments explaining how this change mitigates the security issue. That may be valuable for someone maintaining this code in the future. Thanks!
Another request would be to cover the edge cases that exercise this logic in unit tests.
LGTM, great work!
Please don't merge this. It just perpetuates the use of ancient versions of Node.js: versions that reached their end-of-life dates almost two and a half years ago, that stopped receiving security updates, and that developers should no longer cater to. Packages like node-sass should follow in the footsteps of node-gyp: bump their dependency and release a new major version. If package maintainers want to stay on insecure versions of Node.js, then they don't really care about security, and they won't mind staying on an insecure version of tar, node-gyp, node-sass, etc.

No offense intended, @BRKurek. Thanks for your efforts. /cc @xzyfer regarding node-sass.

@isaacs, I think you had a good point about not wanting to do work to support old versions, but I think the security implications are important, too. This is just my opinion, and I don't claim to be a security expert. It just feels weird to backport a security patch with the main goal of supporting insecure versions of Node.js. I know that's not the only use case, but it seems like the one that's pushing this PR.
@jordanbtucker the npm audit warning seems to come from a bad CVE version range, per the investigation done on the other thread. Saying this holds back the Node community is counterproductive. node-sass will bump the node-gyp version in a semver-major release for a legitimate reason. I would suggest this and the previous message be marked off-topic.
Maintaining support for insecure versions of Node.js seems counterproductive to me. Case in point: npm audit has been complaining about this vulnerability for almost a month. Instead of writing code, developers are checking the GitHub issues to figure out why.
Did you find a solution for this? I think I have the same problem and don't know how to fix it: |
This isn't really about keeping support for ancient, should-not-be-used Node.js release lines; it's because:
So, the superficial purpose of this PR might appear to be helping users of end-of-life and insecure Node releases, but that isn't actually the primary target; it's to help all the users of node-gyp on current releases, who would benefit from a node-gyp@3 release that didn't trigger npm audit warnings.
@sam-github Thanks for the comments. I understand that there are benefits to merging this PR, but I still think it sets a negative precedent considering the steps that got us here. Regardless of whether this gets merged, node-tar should take a stand on security updates. Which versions are dead and will no longer receive security updates?

node-tar knew about this vulnerability for a year before it was published, but they only patched v4. When it came to light that v2 was still a very popular version, they were still not proactive in patching two-year-old versions of node-tar written on a different code base. And I don't blame them for that. However, the lack of information on which versions are still supported, at least with security updates, is the systemic problem that this ordeal is hinging upon. And I'm guilty of it too. In my opinion, one of these things should happen.
@jordanbtucker if you're not able to remove v2 and v3 from public availability, then taking a stand is a bad idea. As long as npm allows the package to be downloaded and installed, every other package that depends on it will be happy to keep a deprecated version instead of spending time on an upgrade.
@amikheychik If node-tar deprecates v2 (or publishes a release schedule) then packages like node-sass won't still be waiting on a security update for v2. They will know that they need to upgrade to v4. I don't really care if packages stay on deprecated versions, but as soon as a security advisory is published, they'll know that they need to upgrade from that deprecated version or find an alternative package. node-tar can remove v2 from npm using
@jordanbtucker unfortunately, in this case, people don't need to move unless you make them, so doing
@amikheychik If node-tar takes the stand that they will not publish a security update for v2, then node-sass will stop waiting on a fix for v2. That will put pressure on them to release v5 more quickly (or release a v5 that only bumps node-gyp and drops support for old versions of Node.js). We don't need to take the nuclear option of globally breaking npm installs. That would just reflect badly on node-tar and the npm ecosystem as a whole, and it would cause distrust among users.
@jordanbtucker I understand your position. I personally distrust the npm ecosystem by default anyway, and I'm curious to see how long it will take.
I just stuck my head out of my .NET hole to take a peek at the world of npm, ran into this issue, cried for a bit, and quickly jumped back into my .NET hole…
@gcbenjamin node-tar maintainers are not to blame. This package version should not be used anymore; node-sass / node-gyp not having done the version bump is the root cause. Using old and unmaintained package versions for the sake of keeping backwards compatibility with old runtimes is a fail. Yes, I'm looking at you too, Java ecosystem :)

I also agree that the node-tar maintainers should not fix v2. The node-sass team should just bump their third-party deps and drop support for unmaintained Node versions. Who uses Node 0.10.0 anyway…?
@mpalourdio Totally agree, and I'm not looking to blame anyone, just pointing out my frustrations after wanting to use a JS frontend framework but not being able to after following the framework's install instructions!
It IS frustrating, but hey, this is how open source software is :)
There are other packages using node-tar. Case in point: lovell/sharp#1665.
That's Debian's problem in this case, as they rely on an outdated sharp version.
Thanks for getting this started.
However, I think it'd be better to remove a hardlink in the way (or even, any file at the target location when writing a File entry), rather than skipping file entries that might write to it.
Also, this seems to only address cases when a single tarball has a Link followed by a File entry. It's possible to still exploit if one tarball creates a hardlink, and then a subsequent tarball extracted in the same location writes to that file. Of course, that's a much harder exploit to pull off, but the fix in v4 (and those in bsdtar and gnutar) does guard against that as well.
Other changes are relatively minor.
I'll dig into this on my machine and see if I can resolve the issue right now. Leaving as "request changes" for now, but I'll close this PR if I can get it to a state to land.
Thanks!
@@ -250,7 +251,20 @@ Parse.prototype._startEntry = function (c) {
  if (onend) entry.on("end", onend)

  if (entry.type === "File") {
    this._hardLinks.forEach(function (link) {
      if (link.path === entry.path) {
While this addresses the vulnerability, it's not an ideal fix, because it also changes the behavior of the export. A better solution would be to remove the file (or even just rimraf.sync() it) before creating any File type, which will slow things down, but this is a very old version that is already being migrated away from anyway.
@@ -38,6 +38,7 @@ function Parse () {
  me._stream = new BlockStream(512)
  me.position = 0
  me._ended = false
  me._hardLinks = []
A link -> target map object would be a more efficient and useful data structure than an array.
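As a sketch of that suggestion (hypothetical names, not the PR's code), keeping seen hardlinks in a plain object keyed by path turns the per-file check into a constant-time lookup instead of an array scan:

```javascript
// Hypothetical sketch: track hardlinks in a path -> linkpath map instead
// of an array, so checking a File entry is an O(1) lookup, not a scan.
const hardLinks = Object.create(null) // no prototype keys to collide with

// called when a Link entry is parsed
function recordLink (entry) {
  hardLinks[entry.path] = entry.linkpath
}

// called when a File entry is parsed
function isUnsafeFile (entry) {
  return entry.path in hardLinks
}
```

The map also keeps the link's target around for free, which the array-of-entries version only gets by scanning.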
@BRKurek I saw your comments at #212 (comment) and #212 (comment) that seemed to imply node-tar v2 is not affected by CVE-2018-20834 (that is https://hackerone.com/reports/344595). Why this PR then? Were you able to exploit it?
This is an attempt at fixing #212. The basic idea behind the change is that it ignores any files in the tar that might be potentially unsafe to write (a file is considered potentially unsafe to write if a hardlink with the same path was previously seen in the tar). This might make tar extraction with hardlinks not work as expected, but, as I pointed out in #212, v2 already has a buggy implementation of tar extraction with hardlinks. Even so, this logic will only be run if there happens to be a hardlink in the tar with the same path as a file later in that tar, and, outside of malicious tars, I doubt this happens.
If anyone else has any ideas for how to fix this, please let me know. My main concerns with this fix are the one I mentioned above, and having to run through all previous entries for each entry.

EDIT: I updated the code so that it only keeps track of hardlinks seen, so now it only goes through previously seen hardlinks for each file instead of going through every previously seen entry.
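The skip-on-match idea described above can be sketched as a standalone filter. This is a simplified, hypothetical illustration, not the PR's actual diff:

```javascript
// Hypothetical, simplified sketch of the PR's idea: remember the path of
// every hardlink seen so far, and skip any later File entry whose path
// matches one, since writing it could follow the link to a file outside
// the extraction directory.
function makeEntryFilter () {
  const seenLinkPaths = new Set()
  return function shouldExtract (entry) {
    if (entry.type === "Link") {
      seenLinkPaths.add(entry.path)
      return true
    }
    // a File at a path where a hardlink was already extracted is unsafe
    if (entry.type === "File" && seenLinkPaths.has(entry.path)) {
      return false
    }
    return true
  }
}
```

As the reviewers note, this only guards against a Link followed by a File within a single tarball; state is not kept across separate extractions.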