Large File Upload Issue (still) with MultipartReader in ASP.NET Core -8+ #58233
Comments
Thanks for writing that up, and sorry for the time and frustration that led to it. It's true that huge files don't "just work", at least in MVC, but the particular bottleneck will vary from app to app. The docs you linked pointed to a file upload sample that can handle 5GB files with minor adjustments. I had some trouble following where HeadersLengthLimit came into the problem.
Hi @amcasey, if you read through the code snippet I have posted, it is commented (hopefully clearly) to show exactly what is happening with "where HeadersLengthLimit came into the problem". In a nutshell, it "drains" the _currentStream first using the hard coded (you cannot change it) HeadersLengthLimit = DefaultHeadersLengthLimit passed to LengthLimit, and throws an exception before it ever gets to LengthLimit = BodyLengthLimit. This is using Postman to simply post a multipart file, with nothing special in the Postman setup. It works fine with files up to roughly 2GB, from memory. Even if Postman is sending something strange (which I doubt it is), this should still never happen. Again, as stated above, the header and the body can technically be any size in the RFC, so it shouldn't be being restricted at all here (in firewalls, web servers, etc., then yes, as it is a security risk). I will have a look at your adjustments when I have a little time later.
Hi @amcasey, I had a quick look at your link, but I couldn't fully understand what specific solution you were suggesting (admittedly, it was a quick glance). I attempted to make similar changes in my setup, but unfortunately, they didn't resolve the problem in my simple test case. The existing code is quite complex, and I think it would be very beneficial to have a straightforward example, as I outlined in my original post. Here's what I'm looking for in basic terms:
I've already started this process, as seen in my original comment. If we can get a basic scenario like this to work, I believe it would serve as an essential demonstration for documentation—showing that .NET can handle a large direct file stream to disk in the simplest form possible. From there, developers can decide on best practices or more advanced implementations, but the foundational "walk before you run" approach needs to work first. In conclusion, note the code in the original post under: "Once I got the code running somewhat correctly, it still would not allow the file to progress past 3,854,123 bytes. It just sits there until my own timer runs out."
I agree that the existing sample is inadequate but, as you say, let's walk before we run. If you grab the branch from my PR, you should be able to compile a simple web app that has the basic characteristics you want: from a page, you can post a large file and have it written directly to disk on the server. I happen to have tested 5GB, but the difference between 5GB and (e.g.) 10GB shouldn't matter; the 32-bit boundaries are at 2/4GB. In that branch, you can open, build, and run the sample. The interesting link is the last one. The handler is in UploadPhysicalFile. It doesn't require a custom reader. My best guess is that your app is missing an update to one of the request size limits.
Is there an existing issue for this?
Describe the bug
Large File Upload Issue (still) with MultipartReader in ASP.NET Core -8+
Problem (one of many) Summary
The real problem, even noting the details below, is that .NET / ASP.NET is simply incapable of uploading large files (4GB+), which frankly is rather a joke in 2025. .NET 9 is on the horizon, and yet this fundamental web task still seems to be ignored.
Note: This is running in Visual Studio 2022, using Kestrel only; no other web server is involved. Before anybody starts commenting about the various Kestrel settings that can be changed: I have pretty much tried them all in some form (unless there is some hidden, strange one that is not commonly known).
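For reference, these are the limits usually pointed to for large uploads. The option names below are the real ASP.NET Core ones; the values are illustrative assumptions only, and per this report raising them was not sufficient on its own:

```csharp
// Program.cs sketch: the size limits commonly raised for large uploads.
// Option names are real ASP.NET Core APIs; the values are illustrative.
using Microsoft.AspNetCore.Http.Features;

var builder = WebApplication.CreateBuilder(args);

// Kestrel's global request body cap (default ~30 MB); null removes it.
builder.WebHost.ConfigureKestrel(options =>
{
    options.Limits.MaxRequestBodySize = null;
});

// Form/multipart limits used by model binding and IFormFile.
builder.Services.Configure<FormOptions>(options =>
{
    options.MultipartBodyLengthLimit = long.MaxValue;
    // Note: this does NOT affect a MultipartReader you construct yourself.
    options.MultipartHeadersLengthLimit = 1024 * 1024;
});

var app = builder.Build();
app.Run();
```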
- `LengthLimit` is hard coded to `HeadersLengthLimit` (16KB) during the constructor of `MultipartReader`.
- `HeadersLengthLimit` cannot be changed before it's used, because it is set after the constructor completes.
- The result is an `InvalidDataException`, preventing large file uploads.

Why This Is an Issue

- The inability to modify `HeadersLengthLimit` before it's enforced means developers cannot handle large file uploads. Given the violation of the Single Responsibility Principle discussed below, this limitation shouldn't even exist.
- In the RFC for `multipart/form-data`, there is no specified maximum size for multipart uploads or headers. Therefore, this hard coded limit imposes an unnecessary restriction that is not compliant with the RFC.
- This restriction in the `MultipartReader` class hinders developers from implementing essential functionality.

Previous Reports of This Issue (still after seven years, I mean really)
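As I understand my own repro, the failure mode looks roughly like the following sketch (not the exact code from my post; the boundary parsing is standard, and the limit values are placeholders):

```csharp
// Sketch of the reported behaviour when parsing a large multipart body by hand.
using Microsoft.AspNetCore.WebUtilities;
using Microsoft.Net.Http.Headers;

async Task ReadSectionsAsync(HttpRequest request)
{
    // Pull the boundary out of the Content-Type header.
    var boundary = HeaderUtilities.RemoveQuotes(
        MediaTypeHeaderValue.Parse(request.ContentType).Boundary).Value;

    var reader = new MultipartReader(boundary, request.Body)
    {
        // Raising these AFTER construction is exactly what this report says
        // does not help: the constructor has already wired up the default
        // 16 KB HeadersLengthLimit internally.
        HeadersLengthLimit = 128 * 1024,
        BodyLengthLimit = null, // no body limit
    };

    // Per the report, this eventually throws InvalidDataException for
    // multi-gigabyte files, even with BodyLengthLimit removed.
    while (await reader.ReadNextSectionAsync() is { } section)
    {
        await using var target = File.Create(Path.GetRandomFileName());
        await section.Body.CopyToAsync(target);
    }
}
```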
This issue has been reported to yourselves multiple times over the years:
Violation of the Single Responsibility Principle
Now I always note "Rules are for the obedience of fools and the guidance of wise men." but in this case it seems a good call.
The `MultipartReader` class is responsible for parsing multipart data, but it also imposes hard coded limits on header (and maybe body) sizes without allowing developers to adjust them.

Issue Description:
Struggles and Final Thoughts
After struggling with this issue, I decided to write my own Multipart handler. But first, I wrote some code to directly take the uploaded stream and write it straight to a file without loading everything into memory. I tested it by attempting to upload a 5 GB file using Postman.
The following overly commented and overly verbose (not for production) code exists because if you cancel the upload in Postman or simply close Postman halfway through, Kestrel just keeps chugging along like nothing is wrong. It continues until it empties what I assume is its cache, and then just freezes. I’m sure there must be some timeout configuration to address this, but whatever it is, it’s incredibly long—almost like a DDoS/hacker's dream.
Once I got the code running somewhat correctly, it still would not allow the file to progress past 3,854,123 bytes. It just sits there until my own timer runs out.
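For context, this is the shape of what I was attempting: a minimal sketch of writing the raw request body straight to disk, not the exact code from the post (the route and file path are placeholders):

```csharp
// Minimal-API sketch: stream the raw request body straight to a file,
// without buffering the whole upload in memory. Route/path are placeholders.
var builder = WebApplication.CreateBuilder(args);

// Remove Kestrel's request body cap for this sketch.
builder.WebHost.ConfigureKestrel(o => o.Limits.MaxRequestBodySize = null);

var app = builder.Build();

app.MapPost("/upload-raw", async (HttpRequest request) =>
{
    // CopyToAsync moves data in chunks; only the copy buffer lives in memory.
    await using var file = File.Create(Path.Combine(Path.GetTempPath(), "upload.bin"));
    await request.Body.CopyToAsync(file);
    return Results.Ok(file.Length);
});

app.Run();
```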
Conclusion
After spending a week battling with this, I realised that ASP.NET and Kestrel fall short when it comes to large file uploads. The lack of clear documentation and flexibility makes it evident that they are not well-suited for handling this use case. This isn’t just a small oversight—it’s a fundamental flaw that I think requires a complete redesign, not just a minor fix.
What Did I Do in the End?
Although my Rust programming skills are a bit rough around the edges, I used Axum, and it just works. Hardly any code, really fast, and even though Rust is difficult to program, it didn’t take long to get it working.
I only add this for anyone in 2024 trying to do this with .NET: Don’t waste the week I’ve spent trying to get it to work. Consider alternatives like Axum if you need reliable large file upload support.
If somebody has managed to get this to work in .NET (without loading the whole file into memory, for note), I would love to see a full working example including all the config around it, both .NET and Kestrel.
References
Expected Behavior
To be able to upload large files...
Steps To Reproduce
No response
Exceptions (if any)
No response
.NET Version
.NET 8
Anything else?
No response