Memory leakage #113

Closed
src200 opened this issue Jul 17, 2015 · 14 comments

src200 commented Jul 17, 2015

form.on('part', function(part) {
    if (!part.filename) return; // skip non-file fields

    // UUID, res, req, parseApp, encodingModule and videoQueue are defined
    // elsewhere in the app.
    var size = part.byteCount;
    var fileName = part.filename;
    console.log('filename: ' + fileName);
    console.log('size: ' + size);

    var randomFilename = UUID + '.' + fileName.split('.').pop();
    var path = "./ContentCache/" + randomFilename;
    var inputFile = fs.createWriteStream(path);

    // Earlier attempt with a manual 'data' handler:
    // part.on('data', function(chunk) {
    //     console.log(util.inspect(process.memoryUsage()));
    //     inputFile.write(chunk);
    // });

    part.pipe(inputFile);
    inputFile.on('finish', function() {
        console.log('Local: Upload Successful!');
        res.send({msg: "Local: Successfully Uploaded!", objectId: req.query.objectId});
        encodingModule.getMetadata(path, function(data) {
            parseApp.update('ContentFile', req.query.objectId, {
                'fileId': randomFilename,
                'videoCodec': data.streams[0].codec_name,
                'videoBitrate': data.format.bit_rate,
                'audioCodec': data.streams[1].codec_name,
                'audioBitrate': data.streams[1].bit_rate,
                'resolutionX': data.streams[0].width,
                'resolutionY': data.streams[0].height
            }, function(err, response) {
                if (err) {
                    console.log("metadata: error updating object");
                }
                console.log(response);
            });
        });
        videoQueue.add({video: path, filename: randomFilename, objectId: req.query.objectId});
    });
});

// Parse the incoming Node.js request
form.parse(req);

Here, when I try to upload a file larger than 30 GB, RAM usage increases from 10% to 98%. After reaching 98%, the memory doesn't get freed once the upload completes. Is there a size limit for multiparty?

csvan commented Jul 19, 2015

From the docs:

"The max bytes accepted per request can be specified with maxFilesSize."

Also, while OT, why on Earth would you want to use Node at all for uploading files of that size?
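
For reference, a minimal sketch of passing that option (assuming req is an incoming http.IncomingMessage; the 3 MB limit is illustrative):

var multiparty = require('multiparty');

var form = new multiparty.Form({
    autoFiles: true,                // maxFilesSize only applies when file parts are saved to disk
    maxFilesSize: 3 * 1024 * 1024   // 3 MB cap on total file bytes per request
});

// Exceeding maxFilesSize makes the form emit 'error' instead of accepting more data.
form.on('error', function(err) {
    console.error('upload rejected:', err.message);
});

form.parse(req);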

src200 commented Jul 19, 2015

Oh... so are you saying Node doesn't support such large file uploads? @csvan

csvan commented Jul 19, 2015

@csharathreddy pretty sure it does if you just push your limiters high enough, but it just doesn't make sense. Anyway, the answer you are looking for is in the docs.

src200 commented Jul 20, 2015

No, if I set 'maxFilesSize = 3 * 1024 * 1024', it accepts only 3 MB of data and then raises an error event saying "maximum file length exceeded".
My problem is that when I upload a 30 GB file, RAM usage climbs very fast, and after the upload completes the used memory is never freed. This causes other uploads in the queue to abort due to low memory.
@csvan can we do it like this: whenever I upload a 30 GB file, it should read 3 MB of data, write it to disk, immediately release that 3 MB from RAM, and then read the next 3 MB from the request (see the sketch below).
That would be efficient.
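
A minimal sketch of that chunk-by-chunk behaviour (assuming part is a multiparty part stream; the output path is illustrative):

var fs = require('fs');

var out = fs.createWriteStream('./ContentCache/upload.bin');

part.on('data', function(chunk) {
    // write() returns false when the write buffer is full; pause the
    // source until the disk catches up so chunks don't pile up in RAM.
    if (!out.write(chunk)) {
        part.pause();
        out.once('drain', function() {
            part.resume();
        });
    }
});

part.on('end', function() {
    out.end();
});

This pause/resume handling is what part.pipe(out) already does internally, which is why piping alone should keep memory flat.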

@silverwind

I also happen to be looking for a multipart parser that just streams directly to disk. I've taken the example code and used these options:

{
  maxFieldsSize: Infinity,
  maxFields: Infinity,
  autoFiles: true,
  uploadDir: __dirname
}

Uploading a file bigger than the RAM + swap of the machine eventually results in the process getting killed by the kernel OOM killer. Am I doing this wrong?
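
For context, the whole test is roughly the following (a sketch, not the exact code; the port and response body are illustrative):

var http = require('http');
var multiparty = require('multiparty');

http.createServer(function(req, res) {
    var form = new multiparty.Form({
        maxFieldsSize: Infinity,
        maxFields: Infinity,
        autoFiles: true,      // save file parts to disk instead of emitting them as streams
        uploadDir: __dirname  // temp files land next to the script
    });

    form.parse(req, function(err, fields, files) {
        if (err) {
            res.statusCode = 500;
            return res.end(err.message);
        }
        res.end('uploaded: ' + JSON.stringify(files));
    });
}).listen(8080);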

why on Earth would you want to use Node at all for uploading files of that size?

Could you elaborate on why you think Node is a bad fit? I don't see anything inherently wrong with wanting to upload big files to a Node-based file server.

@silverwind

Tried @dougwilson's suggestion of piping to fs, but unfortunately I'm still seeing huge memory consumption. Here's a simple test case. Uploading a 3.6 GB file through it on my 4 GB RAM machine looks like this (2 runs):

[Screenshot: memory usage graph across the two upload runs, 2015-08-07]

The ramp-ups at the start are the uploads in progress. After the file has been written completely, around 1 GB of memory isn't freed. The short spike afterwards is when I kill the process. I also tried periodic gc calls, but they seem to have no impact at all.
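
The piping variant looks roughly like this (a sketch of the approach, not the exact test case; the target directory and port are illustrative):

var http = require('http');
var fs = require('fs');
var multiparty = require('multiparty');

http.createServer(function(req, res) {
    var form = new multiparty.Form();

    form.on('part', function(part) {
        if (!part.filename) return part.resume(); // drain non-file fields
        // Pipe straight to disk; stream backpressure should cap memory use.
        part.pipe(fs.createWriteStream('/tmp/' + part.filename));
    });

    form.on('close', function() {
        res.end('done');
    });

    form.parse(req);
}).listen(8080);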

@silverwind

Just dug deeper into this, and I think it's an io.js 3.0.0 issue, at least in my case. @csharathreddy, what node/io.js version are you seeing this on?

Related: mscdex/busboy#92 and nodejs/node#2308

src200 commented Aug 8, 2015

Ahh... I have tried different versions of Node.js, from 0.8.0 up to the latest.

@silverwind

Hmm, that's a different issue than mine, then. Maybe try my test case; it shouldn't leak on anything except io.js 3.0.0.

@dougwilson

@csharathreddy we cannot even begin to answer why your code has a memory leak, as the code posted above is not runnable by us; if you are unable to diagnose the memory leak, we need a complete test case where we can run the code and reproduce the issue.

src200 commented Aug 10, 2015

The test case provided by @silverwind and mine are similar. I have posted just the uploading part of my code. Use @silverwind's test case to test.

@dougwilson

Thanks, @csharathreddy, I just tried that, but I get the following error:

ReferenceError: UUID is not defined

Can you provide a complete test case where we can run the code and reproduce the issue? Trying to substitute your code into @silverwind's code results in numerous undefined variables in your code.

src200 commented Aug 10, 2015

Use @silverwind's code and try uploading a large file (>30 GB), @dougwilson. My code has variables which are used as per my use case.

@dougwilson

Use @silverwind's code and try uploading a large file (>30 GB)

Ok, well, @silverwind already confirmed there is no memory leak with his code; it's an io.js 3.0 issue. Closing this issue then, as we confirmed there's no memory leak based on your instructions, thank you! If you want us to try on your code instead of silverwind's code, we'd need a way to run it.
