backport iojs v1.0.0 streams changes #98
Conversation
Did you do this manually, or did you use build/build.js in some way?
This was done semi-manually (with rewrite-js and bash). I totally missed the build.js file.
Force-pushed 1268b34 to 2bb6a6a
Should be fixed now. The first commit updates build.js, the second updates lib, and the third updates the test dir. These should probably be squashed before merging.
I'm intrigued: does this mean that a writable no longer has an array of outgoing buffers?
It is quite crazy actually, as I had been pondering whether a linked list would improve stream performance sometime before Christmas.
@sonewman Yep, writable is now backed by a queue (based on a linked list), so that repeated small writes no longer thrash the GC quite so badly. Prior to this, for most writable streams backed by async resources (e.g. fs.WriteStream), each fulfilled …
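To make the shape of that change concrete, here is a minimal sketch of a singly linked-list write queue in the spirit described above. The names (`WriteQueue`, `push`, `shift`) are illustrative only, not the actual readable-stream internals:

```javascript
// Illustrative linked-list queue: each entry holds a chunk and a pointer to
// the next entry, so enqueue and dequeue are both O(1) and no backing array
// ever needs to be reallocated or re-indexed.
class WriteQueue {
  constructor() {
    this.head = null;
    this.tail = null;
    this.length = 0;
  }
  push(chunk) {
    const entry = { chunk, next: null };
    if (this.tail) this.tail.next = entry;
    else this.head = entry;
    this.tail = entry;
    this.length++;
  }
  shift() {
    if (!this.head) return undefined;
    const { chunk } = this.head;
    this.head = this.head.next;
    if (!this.head) this.tail = null;
    this.length--;
    return chunk;
  }
}

const q = new WriteQueue();
q.push('a');
q.push('b');
console.log(q.shift()); // 'a'
console.log(q.length);  // 1
```

The real implementation also tracks per-write callbacks alongside each chunk, but the queue discipline is the same.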
That's really awesome, good work dude 😃
I'm wondering if we could do the same thing for the readable buffer. I found it really interesting experimenting with linked lists; after a certain size they were much quicker to iterate over than an array. It must have something to do with the memory allocation and optimisation for smaller arrays (not that I know anything about C++ or V8). My tests showed a considerable difference at 100,000 items, which is an unlikely number of buffers to have waiting to be consumed in a readable stream in process memory. But still, I will have to dig them out and put them in a gist somewhere. If for some reason the stream was destroyed, though, and the first item in the list was set to null, I assume that would GC the remaining items?
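As a rough illustration of the comparison being discussed (not the actual benchmark mentioned above): repeatedly calling `Array.prototype.shift()` re-indexes the array on every call, making a full drain O(n²), while draining a linked list is O(n) overall.

```javascript
// Drain an array front-to-back via shift(); each shift() re-indexes
// the remaining elements.
function drainArray(n) {
  const arr = [];
  for (let i = 0; i < n; i++) arr.push(i);
  let sum = 0;
  while (arr.length) sum += arr.shift();
  return sum;
}

// Drain an equivalent singly linked list; advancing head is O(1) per step.
function drainList(n) {
  let head = null, tail = null;
  for (let i = 0; i < n; i++) {
    const entry = { value: i, next: null };
    if (tail) tail.next = entry;
    else head = entry;
    tail = entry;
  }
  let sum = 0;
  while (head) {
    sum += head.value;
    head = head.next;
  }
  return sum;
}
```

On the GC question: yes, once the last external reference to the head entry is dropped, nothing reachable points at the chain any more, so the whole list of entries becomes collectible at once; setting the head to null is enough.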
@sonewman I implemented a queue on the readable side, but it seems like it didn't move the needle at all in terms of performance.
I am having some difficulty applying the patch, but I am interested in this idea, so I am going to take a look tomorrow (as it is 23:23 in the UK). Happy Friday 🎉
@chrisdickinson I managed to apply your patch manually and I see what you mean. It gives pretty much the exact same bench on paused/readable-led streams (obviously flowing ones aside). I was going to have a fiddle and see if changing the prototype methods to plain functions would make any difference; in any case the speed difference is a matter of milliseconds (suggesting that this is incredibly well optimised either way). The other interesting idea I had (which could be controversial) was whether there was any optimisation to be had by using a linked list in an EventEmitter. I will perhaps have a play with that at some point in the week. I am happy to merge this, but it would be great to get some more eyes on it first @iojs/streams.
Any objections to this? Because otherwise this should just go in, since it's already in io.js.
Rubber-stamp LGTM. Everything matches my expectations of what's released in io.js today. Moving forward, of course, changes should be made here first, and then merged into io.js. Anything already in io.js should certainly be backported here.
Yeah, I agree totally. OK, in it goes...
backport iojs v1.0.0 streams changes
This backports the various streams changes as of iojs v1.0.0, including tests.
R=@rvagg