A faster alternative to pako.js #136
Please understand me right: I have a lot of good (I hope) projects to do and only 24 hours in a day. Every time I need to make a choice: can I add significant value or not? The process of polishing is endless, and at some point we have to say "stop" to ourselves and move on to more valuable things. This project is very stable and battle-tested, so before starting improvements I need very strong proof that the time will not be spent for nothing in the end.

What I mean is that it is not easy to make benchmarks correct. Results depend on many things, even on the node.js version used (the V8 engine version). If you would like me to take part in evaluating your achievements, I strongly recommend applying your patches to a fork of the existing pako and using its existing benchmark. That gives some guarantee that everything else is correct. If your code is completely independent, it will take a huge amount of time to analyze what happens, which is very ineffective. Also, I recommend rechecking benchmarks on node 8.2.1, because later V8 versions have a significant performance regression. This approach will help to understand these things:
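The warning above about environment-sensitive benchmarks can be made concrete. Below is a minimal sketch (my own illustration, not pako's actual benchmark suite) of the kind of harness that reduces the most common distortions: it warms the function up so V8 has a chance to optimize it before measuring, takes many samples, and reports the median rather than a single run, since one GC pause or deoptimization can ruin a single timing.

```javascript
// Minimal micro-benchmark harness sketch for Node.js.
// Warm-up lets V8 JIT-compile the function before timing begins;
// the median of many samples is robust to GC spikes and outliers.
function benchMedian(fn, { warmup = 50, samples = 30 } = {}) {
  for (let i = 0; i < warmup; i++) fn();
  const times = [];
  for (let i = 0; i < samples; i++) {
    const t0 = process.hrtime.bigint();
    fn();
    times.push(Number(process.hrtime.bigint() - t0) / 1e6); // milliseconds
  }
  times.sort((a, b) => a - b);
  return times[Math.floor(times.length / 2)];
}

// Example payload (arbitrary; substitute a real deflate call to compare codecs).
const ms = benchMedian(() => JSON.stringify({ a: [1, 2, 3] }));
console.log(`median: ${ms.toFixed(4)} ms`);
```

Even a harness like this does not remove the cross-version effects mentioned above (different V8 releases optimize differently), which is why running inside pako's own benchmark, on a fixed Node version, is the safer comparison.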
I don't mean that you are doing something useless, but I would like to see deeper and more careful investigation results. It may look as if I am stupid to reject "obvious WOW benchmarks", but I have reasons to do so, based on my experience. Trust me, I ran into a huge number of benchmarking issues and experiments when I developed this package (and spent a lot of time on them). There are many cases where "fantastic" results are caused by the environment, not by the measured code. So I don't reject your results, but I suggest using a more accurate/safe/predictable benchmarking approach. PS. Of course, if you need help promoting your initiative, I will be happy to help, even if I don't participate myself. Just don't forget to close the issue at some point, in 1-2 months for example, or if you lose interest :)
I am not asking to change pako.js in any way. I understand that it is a rewrite of ZLIB, and that it is stable and properly tested. My library is more like new ground for experiments, for people who would like to "get away" from ZLIB and its structure. Actually, I think changing ZLIB itself may be very hard (that is why I did not do it). I will definitely close this at some point :)
Why don't you wish to experiment with the pako sources in a fork?
In theory: less effort for more "added value".
I cannot read or rewrite other people's code; it is extremely uncomfortable for me :( I can only use such code through a described interface, or add new parts that cooperate with it through an interface. I also have no experience with node.js and the tools that you use for processing and testing your code (I run my JS in a browser the same way I wrote it). I am still using pako.js for compression (because UZIP.js is quite unpredictable now), but I use UZIP.js for decompression (because it is always faster; the difference is even bigger in Firefox). I use different algorithms, but I guess I should suggest them to ZLIB, not to you.
Maybe add a "Related Projects" section to the Pako Readme, and mention UZip there? I'm not sure about the benchmarking issue, but UZip does have the advantage of being entirely within one file, which makes it easier to include in browser scripts etc. without having to get browserify to work. I'd like to try using UZip for the Javascript output of my Annotator Generator. One slight problem with UZip though: it's not quite correct to say in its Readme that "The API is the same as the API of pako.js", because currently UZIP's API is not fully compatible.
Following up on my previous comment, I think I picked the wrong example, because
so I expect a string representation to save 20% on average. It's a pity JavaScript doesn't come with Base64 or something similar as standard. (You're more restricted if you can't assume libraries will be there.)
@ssb22 https://github.com/nodeca/pako/tree/master/dist. Please, let's keep this repo for issues about pako only, where my personal participation is absolutely necessary.
@ssb22 We can discuss UZIP-related stuff at https://github.com/photopea/UZIP.js/issues . Also, if you need to store binary data directly inside JS code, there is base64 for JS via btoa and atob: https://developer.mozilla.org/en-US/docs/Web/API/WindowBase64/Base64_encoding_and_decoding
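To make the btoa/atob suggestion above concrete: both functions operate on "binary strings" (one character per byte), so a `Uint8Array` has to be converted byte by byte in each direction. A minimal sketch (helper names are my own):

```javascript
// Encode/decode binary data as base64 using the standard btoa/atob,
// which are available in browsers and in modern Node.js as globals.
function bytesToBase64(bytes) {
  let bin = "";
  for (const b of bytes) bin += String.fromCharCode(b); // one char per byte
  return btoa(bin);
}

function base64ToBytes(b64) {
  const bin = atob(b64);
  return Uint8Array.from(bin, (ch) => ch.charCodeAt(0));
}

const original = new Uint8Array([0, 1, 254, 255, 72, 105]);
const b64 = bytesToBase64(original); // "AAH+/0hp"
const roundTrip = base64ToBytes(b64);
```

The per-byte loop matters: calling `btoa` directly on text that contains code points above 255 throws, so binary payloads must always go through this byte-string conversion first.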
Good point. I have now opened the API-compatibility issue on UZIP's repository, and created Pako pull request #137 for my suggestion to mention UZip in Pako's Readme (and marked the pull request as closing this issue). Hope that was vaguely the right thing to do.
Hi guys, I was curious whether I could make a faster alternative to pako.js, so I made UZIP.js. Actually, it is a whole ZIP parser and generator (an alternative to e.g. JSZip), all inside 28 kB unminified.
I made it from scratch, without rewriting other implementations (like ZLIB).
Here I compress an 11 MB BMP image. It is faster and compresses better than pako, but it usually does not work as well for e.g. plain text.
It is not an issue as such, but could we keep it here for a while, so I can maybe attract some people interested in improving it, independently of ZLIB? We could add Zopfli-like optimizations, etc.