Support serving of pre-GZIP encoded files #7
👍 for this one. |
I'd also like to point out that this shouldn't be about "pre-gzipping" specifically, but rather pre-compressing in general, no matter what the encoding format is. For example, Microsoft and Google have both experimented with improved encoding schemes. What this means is that, in practice, I might want to pre-compile a file into multiple formats and have the proper one selected based on the request's Accept-Encoding header. I think of this as very analogous to keeping images in multiple formats and serving whichever one the client supports. |
I found a simple workaround based on HTTP URL redirects:

class Startup
{
    private StaticFileOptions StaticFileOptions
    {
        get
        {
            return new StaticFileOptions
            {
                OnPrepareResponse = OnPrepareResponse
            };
        }
    }

    private void OnPrepareResponse(StaticFileResponseContext context)
    {
        var file = context.File;
        var request = context.Context.Request;
        var response = context.Context.Response;

        // An already-compressed file was requested directly: just mark the encoding.
        if (file.Name.EndsWith(".gz"))
        {
            response.Headers[HeaderNames.ContentEncoding] = "gzip";
            return;
        }

        if (file.Name.IndexOf(".min.", StringComparison.OrdinalIgnoreCase) != -1)
        {
            var requestPath = request.Path.Value;
            var filePath = file.PhysicalPath;

            if (IsDevelopment) // set elsewhere, e.g. from IHostingEnvironment
            {
                // In development, redirect to the unminified file if it exists.
                if (File.Exists(filePath.Replace(".min.", ".")))
                {
                    response.StatusCode = (int)HttpStatusCode.TemporaryRedirect;
                    response.Headers[HeaderNames.Location] = requestPath.Replace(".min.", ".");
                }
            }
            else
            {
                var acceptEncoding = (string)request.Headers[HeaderNames.AcceptEncoding];
                if (acceptEncoding != null &&
                    acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) != -1)
                {
                    // In production, redirect to the pre-gzipped copy if it exists.
                    if (File.Exists(filePath + ".gz"))
                    {
                        response.StatusCode = (int)HttpStatusCode.MovedPermanently;
                        response.Headers[HeaderNames.Location] = requestPath + ".gz";
                    }
                }
            }
        }
    }

    public void Configure(IApplicationBuilder application)
    {
        application
            .UseDefaultFiles()
            .UseStaticFiles(StaticFileOptions);
    }
}

I used Wikipedia as a reference. This approach also allows using unminified files in the development environment with no need to change links on the client side. I found that the current implementation of aspnet/StaticFiles lacks two features:
@davidfowl fyi. |
In support of my comment above, both Firefox and Chrome will be shipping support for Brotli compression soon. |
👍 Actually, even a good story for serving gzipped static files (with a cache, of course, since it's static compression) is currently lacking. I'm saying this in the "conventional", "IIS-kind" of way, as opposed to the build-time gzip generation. For those coming here from Google with dynamic content compression needs, you might try this gist: |
Would it be beyond the scope of this to suggest that StaticFiles should optionally allow cached "on-demand" compression of files? (I.e. gzip on first request and put result to file system cache, serve gzip requests from there) |
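The on-demand idea above could be sketched as custom middleware. This is a hypothetical sketch only, not part of aspnet/StaticFiles; the class name, path handling, and caching-next-to-the-source convention are all invented for illustration:

```csharp
// Hypothetical sketch: gzip a static file on first request, cache the result
// next to it as "<file>.gz", and serve the cached copy on later requests.
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;

public class CachedGzipMiddleware
{
    private readonly RequestDelegate _next;
    private readonly string _webRoot;

    public CachedGzipMiddleware(RequestDelegate next, IHostingEnvironment env)
    {
        _next = next;
        _webRoot = env.WebRootPath;
    }

    public async Task Invoke(HttpContext context)
    {
        var acceptEncoding = (string)context.Request.Headers["Accept-Encoding"] ?? "";
        var relative = context.Request.Path.Value?.TrimStart('/') ?? "";
        var path = Path.Combine(_webRoot, relative.Replace('/', Path.DirectorySeparatorChar));

        if (acceptEncoding.Contains("gzip") && File.Exists(path))
        {
            var gzPath = path + ".gz";

            // Compress on first request, or when the source has changed since.
            if (!File.Exists(gzPath) ||
                File.GetLastWriteTimeUtc(gzPath) < File.GetLastWriteTimeUtc(path))
            {
                using (var source = File.OpenRead(path))
                using (var target = File.Create(gzPath))
                using (var gzip = new GZipStream(target, CompressionLevel.Optimal))
                {
                    await source.CopyToAsync(gzip);
                }
            }

            // A real implementation would also set Content-Type (by extension)
            // and a Vary: Accept-Encoding header.
            context.Response.Headers["Content-Encoding"] = "gzip";
            await context.Response.SendFileAsync(gzPath);
            return;
        }

        await _next(context);
    }
}
```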
Really looking forward to being able to do this. I can go through all the trouble of bundling, minifying, tree shaking and gzipping to get things down to an absolute minimum size... but I can't serve those .js.gz files, and that's problematic. I'm also struggling to get IIS to gzip on its own as well, but that's for another area entirely. |
With Angular (2) creating *.gz files when doing --prod builds, it would be great to have this out of the box. |
Other related use cases are serving pre-brotli compressed (.br) files and also serving WebP (.webp) files instead of PNG/JPG/etc. |
@JohannesRudolph you should now be able to combine ResponseCaching, ResponseCompression, and StaticFiles to achieve dynamic compression and caching of static files. @JunTaoLuo this would be a good combo to test. |
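A minimal sketch of the combination described above. The middleware ordering (caching outermost, then compression, then static files) and the Cache-Control value are assumptions, not a verified recipe:

```csharp
// Sketch of the ResponseCaching + ResponseCompression + StaticFiles combo.
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddResponseCaching();
        services.AddResponseCompression();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseResponseCaching();      // stores cacheable responses, keyed by Vary headers
        app.UseResponseCompression();  // negotiates gzip via Accept-Encoding

        app.UseStaticFiles(new StaticFileOptions
        {
            OnPrepareResponse = ctx =>
            {
                // ResponseCaching only stores responses marked cacheable.
                ctx.Context.Response.Headers["Cache-Control"] = "public,max-age=600";
            }
        });
    }
}
```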
Can folks clarify if they expect the original url to contain the compression extension (e.g. ".gz")? Some of the above samples do and some don't. I assume content negotiation based on accept headers would be the more general case. |
Great! Content-negotiation is (from my experience) the far more often-used
and preferred method.
|
@Tratcher do you have an example of combining those three bits of middleware to suit this scenario? Thank you. |
@JunTaoLuo can you do a sample? |
A small example would be really appreciated here if possible. Thanks in advance. |
You can take a look at the sample I have created at https://github.com/JunTaoLuo/MiddlewaresSample which uses ResponseCaching, ResponseCompression and StaticFiles to create, cache and serve different representations of the same resource. Here's the sample output where I made 6 requests to
|
Hmm, no logs for the compression middleware... |
@muratg why was this closed? |
I came across this because I've started using webpack to pre-gzip my js and css. The solution I came up with was implementing a custom IFileProvider based on the code from CompositeFileProvider. I set it up like this:
It uses a convention: if the requested file ends with .min.js or .min.css, it will look for the same file name with .gz appended and, if found, return that. Then later I got the idea to add logic to try to create the .gz file if it does not exist, and then return that on success, else return the original file. It seems to be working well; I would appreciate any feedback on the implementation, found here: My solution uses standard .min.js and .min.css URLs, but the .gz file is served; I'm not using .gz in my URLs. One known issue is that this solution bypasses content negotiation and just gives you gzip whether you like it or not, but that's really not a big issue in my view for real browsers. |
That approach sounds like it would mess up the content-length and etag headers. |
@Tratcher could you elaborate on how that would get messed up? The IFileProvider is passing up the IFileInfo about the gz file so it would have the correct content length of the gz file, isn't that what it should have? We still have an opportunity to tweak the headers in OnPrepareResponse if there is something messed up, but I'm trying to understand what would be messed up and why. |
Content-Length and gzip are rarely used together because the implications are really confusing. I'll need to verify, but I think the Content-Length is supposed to be the uncompressed length rather than the compressed length. StaticFiles also uses the length to calculate the ETag, so your pre-compressed file will have a different ETag than the uncompressed version, even if the contents are the same. StaticFiles also has built-in support for Range headers, which refer to offsets in the uncompressed file. This won't work with compressed files. Managing pre-compression in the file provider is inadequate; it needs to be built into StaticFiles to make the above scenarios work correctly (or at least be bypassed correctly). |
@Tratcher that makes sense for dynamic compression, but the issue here is pre-compressed static files, and I think we would want Content-Length for any static file, wouldn't we? And it should be used to calculate the ETag, I would think. In my scenario the .gz file is created by the webpack build process; while my FileProvider can be configured to generate the .gz file, that is a secondary concern, since mainly I am serving static files that are already gzipped. In my scenario, without content negotiation, the browser is only going to get the already-compressed static file; the uncompressed file would not be returned unless the compressed file does not exist and could not be created. If the source file has been modified more recently than the compressed file, I regenerate the compressed file. |
@Tratcher so are you saying that for my pre-gzipped files I should remove the Content-Length and Accept-Range headers? |
Never mind on content-length, I re-checked the spec and verified that content-length and transfer-encoding don't mix, but content-encoding is OK. I still think the Range requests will be broken though, those offsets should be for the un-compressed representation. So yes, at least remove the Accept-Ranges header. I haven't found any spec references to confirm this yet. |
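If one did want to follow that advice, the header could be stripped for pre-compressed files via the StaticFiles OnPrepareResponse hook. A sketch, with the extension checks being illustrative:

```csharp
// Sketch: suppress range support for pre-compressed files, per the concern above.
var options = new StaticFileOptions
{
    OnPrepareResponse = ctx =>
    {
        if (ctx.File.Name.EndsWith(".gz") || ctx.File.Name.EndsWith(".br"))
        {
            ctx.Context.Response.Headers.Remove("Accept-Ranges");
        }
    }
};
```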
@Tratcher thanks! I am by no means an expert, but I'm still a little doubtful that I should remove the Accept-Ranges:bytes header. Other samples I have found for pre-gzipped files seem to include that header. I found this which seems to indicate chrome and firefox for example would store the file in its content-encoding format ie .gz and will send range requests for those files to complete a failed partial download. https://lists.w3.org/Archives/Public/ietf-http-wg/2014AprJun/0112.html |
I expect a problem due to the differences between static and dynamic compression. With dynamic compression a range request would be processed against the un-compressed content and then the result would be compressed. With static compression the range request is processed against the compressed content. How is the client supposed to tell the difference? There's no indication in the response that you used static or dynamic compression, except that dynamic compression often uses chunked rather than content-length. Maybe it doesn't matter so long as the behavior is consistent on a per resource basis. |
@Tratcher I found this on SO, which seems to indicate Content-Encoding is a property of the entity whereas Transfer-Encoding is a property of the message, so it would seem logical if dynamic compression used Transfer-Encoding and pre-compressed static files used Content-Encoding. |
The edit on that answer is really telling.. |
Yes, I saw that. Note I said it would be logical, not that that's how it is done. It sounds like in practice dynamic compression does it wrong: they should use Transfer-Encoding but they don't, because of browsers :-D. Nevertheless, I think it will be OK to keep the Accept-Ranges header. |
Adding my implementation into the mix. Use case: client calls

public class CompressionFileProvider : IFileProvider
{
    private readonly IFileProvider _fileProvider;
    private readonly IHttpContextAccessor _httpContextAccessor;
    private readonly string _root;

    public CompressionFileProvider(IHostingEnvironment hostingEnvironment, IHttpContextAccessor httpContextAccessor)
    {
        _fileProvider = hostingEnvironment.WebRootFileProvider;
        _httpContextAccessor = httpContextAccessor;
        _root = hostingEnvironment.WebRootPath;
    }

    public IDirectoryContents GetDirectoryContents(string subpath)
        => _fileProvider.GetDirectoryContents(subpath);

    public IFileInfo GetFileInfo(string subpath)
    {
        if (_httpContextAccessor.HttpContext.Request.Headers.TryGetValue("Accept-Encoding", out var encodings))
        {
            // Prefer Brotli, then gzip, when the client accepts the encoding
            // and a pre-compressed sibling file exists.
            if (encodings.Any(encoding => encoding.Contains("br")))
            {
                var compressedEncoding = _fileProvider.GetFileInfo(subpath + ".br");
                if (compressedEncoding.Exists)
                    return compressedEncoding;
            }
            if (encodings.Any(encoding => encoding.Contains("gzip")))
            {
                var compressedEncoding = _fileProvider.GetFileInfo(subpath + ".gz");
                if (compressedEncoding.Exists)
                    return compressedEncoding;
            }
        }
        return _fileProvider.GetFileInfo(subpath);
    }

    public IChangeToken Watch(string filter)
        => _fileProvider.Watch(filter);
}

public static class ApplicationBuilderExtensions
{
    public static IApplicationBuilder UseCompressedStaticFiles(this IApplicationBuilder applicationBuilder, IHostingEnvironment hostingEnvironment, IHttpContextAccessor httpContextAccessor)
    {
        return applicationBuilder.UseStaticFiles(new StaticFileOptions
        {
            FileProvider = new CompressionFileProvider(hostingEnvironment, httpContextAccessor),
            OnPrepareResponse = ctx =>
            {
                // Advertise the encoding of the substituted file.
                var headers = ctx.Context.Response.Headers;
                if (ctx.File.Name.EndsWith(".br"))
                    headers.Add("Content-Encoding", "br");
                else if (ctx.File.Name.EndsWith(".gz"))
                    headers.Add("Content-Encoding", "gzip");
            }
        });
    }
} |
This issue was moved to dotnet/aspnetcore#2458 |
Pre-GZIPping files is seemingly becoming more popular. This involves running a tool ahead of deployment that creates GZIPped copies of suitable files in the site, e.g. site.js => site.js.gzip. Then the file serving aspect of the web server will serve the GZIPped file when appropriate.
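A build-time step like the one described could be sketched as a small console tool. The root path, extension list, and the ".gz" suffix are illustrative choices, not prescribed by this issue:

```csharp
// Sketch of a pre-deployment tool that writes a gzipped copy of each suitable
// file alongside the original, e.g. site.js -> site.js.gz.
using System.IO;
using System.IO.Compression;
using System.Linq;

class PreCompress
{
    static void Main(string[] args)
    {
        var root = args.Length > 0 ? args[0] : "wwwroot";
        var extensions = new[] { ".js", ".css", ".html", ".svg" };

        foreach (var file in Directory.EnumerateFiles(root, "*", SearchOption.AllDirectories))
        {
            if (!extensions.Contains(Path.GetExtension(file)))
                continue;

            using (var source = File.OpenRead(file))
            using (var target = File.Create(file + ".gz"))
            using (var gzip = new GZipStream(target, CompressionLevel.Optimal))
            {
                source.CopyTo(gzip);
            }
        }
    }
}
```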