Uploading crashes the application #48
How are you uploading files?
I use Multer (https://github.com/expressjs/multer):

```js
// Custom Multer storage engine that uploads incoming files to MEGA.
class MegaStorage {
  constructor(opts) {
    this.opts = opts;
  }

  _handleFile(req, file, cb) {
    if (!this.opts.email) return cb(new Error('no email'));
    if (!this.opts.password) return cb(new Error('no password'));

    const mega = require('megajs');
    // Log in to MEGA, then pipe the incoming file stream into an upload.
    mega(this.opts, (err, storage) => {
      if (err) return cb(err);
      file.stream.pipe(storage.root.upload(file.originalname, (err, uploadedFile) => {
        if (err) return cb(err);
        // Resolve a public link for the uploaded file and report it to Multer.
        uploadedFile.link((err, link) => {
          if (err) return cb(err);
          cb(null, { url: link });
        });
      }));
    });
  }

  _removeFile(req, file, cb) {
    console.log('remove');
    // TODO: Add remove file
  }
}

module.exports = function (opts) {
  return new MegaStorage(opts);
};
```
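For readers unfamiliar with custom Multer storage engines, here is a minimal wiring sketch (not from the issue, just an illustration). It assumes the class above is exported from a local mega-storage.js and that the upload field is named file.

```js
// Hypothetical wiring example: plug the custom storage engine into Multer.
// Assumes the MegaStorage module above is saved as ./mega-storage.js.
const express = require('express');
const multer = require('multer');
const megaStorage = require('./mega-storage');

const app = express();
const upload = multer({
  storage: megaStorage({ email: 'user@example.com', password: 'secret' })
});

// Multer merges the object passed to the storage engine's callback into
// req.file, so the MEGA link is available as req.file.url here.
app.post('/upload', upload.single('file'), (req, res) => {
  res.json({ url: req.file.url });
});

app.listen(3000);
```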
Is it something like this? https://repl.it/repls/WoefulQuestionableMicrostation
Edit: Now I noticed you're using a Storage Engine. I will edit the repl.
Edit 2: I got Multer working and it's using Buffer streams (not File streams, like gulp object streams). It turns out your code is similar to this test. The second line on your call stack is this line in the source. For some reason the "data" event is being emitted before the input stream is resumed. Try using buffers; streams are hard (but if you can find a better solution, you're welcome).
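As a rough illustration of the "use buffers" suggestion, the storage engine could drain Multer's stream into a single Buffer and hand that to megajs instead of piping the live stream. This is only a sketch: it assumes this.storage is an already logged-in megajs Storage instance and that upload accepts a Buffer as its second argument, as in the original mega module's README.

```js
// Sketch of the buffering approach: collect Multer's stream into one Buffer,
// then upload the Buffer instead of piping the stream directly.
_handleFile(req, file, cb) {
  const chunks = [];
  file.stream.on('data', (chunk) => chunks.push(chunk));
  file.stream.on('error', cb);
  file.stream.on('end', () => {
    const buffer = Buffer.concat(chunks);
    // this.storage is assumed to be a logged-in megajs Storage instance.
    this.storage.root.upload(file.originalname, buffer, (err, uploaded) => {
      if (err) return cb(err);
      uploaded.link((err, link) => {
        if (err) return cb(err);
        cb(null, { url: link, size: buffer.length });
      });
    });
  });
}
```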
Okay, thanks. I will test around a bit.
I remembered something: you're using streams and you have the file size. Pass the file size to the upload function so it will not need to buffer the entire input stream to guess the file size.

```diff
- storage.root.upload(file.originalname, (err, file) => {
+ storage.root.upload({
+   name: file.originalname,
+   size: file.size
+ }, (err, file) => {
```
I noticed some things:
- It can be an issue in the upload chunking code. It's hard: tonistiigi's code uploads files using a single connection, but it seems MEGA doesn't support that anymore. I implemented chunking to fix #13 and #43.
- This code seems to be buggy when handling streams like those from request. The larger the uploaded file, the higher the chance of triggering the bug. I don't know what exactly is causing that. Can someone check this code?
How about replacing request? We can use isomorphic-fetch. As uploading and downloading are now chunked by default, we don't need to use ReadableStreams to reduce memory usage.
I checked other libraries: superagent is 6.2 kB, axios is 4.3 kB, and isomorphic-fetch is 2.7 kB, but those require a promise polyfill, so we need to add an extra 2.4 kB. The current "request-like" module is way lighter than those in the browser. On the other hand, in Node the request library is quite heavy and maybe it is the source of those issues. It seems most people use the Node build, so we can do the following: update the current code to use isomorphic-fetch and add es6-promise to the browser version. If raw fetch gets complicated then we can switch to axios, but I don't think that's a problem as there are already some abstractions in api.js.
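A sketch of what a fetch-based replacement for the request wrapper might look like; the apiUrl value, payload shape, and error handling here are placeholders rather than MEGA's actual API, which lives behind the abstractions in api.js.

```js
// Sketch of an isomorphic-fetch based JSON request helper.
require('es6-promise').polyfill(); // promise polyfill, needed for the browser build
require('isomorphic-fetch');       // provides a global fetch in Node and browsers

function apiRequest(apiUrl, payload) {
  return fetch(apiUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload)
  }).then((response) => {
    if (!response.ok) throw new Error('Request failed: ' + response.status);
    return response.json();
  });
}
```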
Is it working for you now?