
Double compression when using v1.20 in combination with gorilla/mux CompressHandler #1595

Closed
Mattie112 opened this issue Aug 21, 2024 · 10 comments

@Mattie112

To start: I am not exactly sure if this issue belongs here, at https://github.com/gorilla/mux or at https://github.com/gorilla/handlers. I figured I'd start here, as it is possible other people are approaching it from this angle as well.

The problem:
With v1.19 I could access the /metrics endpoint in my browser and it would work. Now with v1.20(.1) I get to see gzipped content.

I have traced it back to the use of the gorilla CompressHandler. So why am I reporting it here then? Because it started happening after a new release here, so even if the bug is somewhere else, perhaps someone here has a clue.

Minimal example:

main.go

package main

import (
	"github.com/gorilla/handlers"
	"github.com/gorilla/mux"
	"github.com/prometheus/client_golang/prometheus/promhttp"
	"net/http"
)

func main() {
	router := mux.NewRouter()
	router.Handle("/metrics", promhttp.Handler())

	http.ListenAndServe(":8080", handlers.CompressHandler(router)) // Breaks
	//http.ListenAndServe(":8080", router)                           // Works

	select {}
}

In your go.mod:

require github.com/prometheus/client_golang v1.20.1
require github.com/gorilla/handlers v1.5.2
require github.com/gorilla/mux v1.8.1

(Tested with go 1.21 and 1.23)

If you open the /metrics endpoint in your browser you will get gzipped output. If you then roll back to v1.19 it works as intended.

[screenshot: gzipped output shown in the browser]

I did find this (old) issue on the gorilla/handlers repo: gorilla/handlers#153, so something similar has been fixed in the past. Does anyone have a clue whether the recent changes could affect this somehow?
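
In case it helps, here is a small Go sketch (assuming the example above is running on :8080) for inspecting the raw response headers; setting Accept-Encoding ourselves keeps Go's transport from transparently decompressing, so the output shows exactly what the server sends:

package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	req, err := http.NewRequest(http.MethodGet, "http://localhost:8080/metrics", nil)
	if err != nil {
		log.Fatal(err)
	}
	// Setting Accept-Encoding explicitly disables the transport's transparent
	// gzip handling, so the headers below are exactly what the server sent.
	req.Header.Set("Accept-Encoding", "gzip")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	fmt.Println("Content-Encoding:", resp.Header.Get("Content-Encoding"))
	fmt.Println("Content-Type:", resp.Header.Get("Content-Type"))
}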


dosubot bot commented Aug 21, 2024

Hey @Mattie112! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this issue together!

The issue you're encountering with double compression when using CompressHandler from the gorilla/handlers package after upgrading to version 1.20.1 of the Prometheus client library is due to the fact that the Prometheus client library now supports compression by default. This means that the /metrics endpoint will automatically compress the response if the client supports it, as indicated by the Accept-Encoding header.

In version 1.20.1, the Prometheus client library uses the HandlerOpts{DisableCompression: false} setting, which enables compression by default. When you wrap the router with CompressHandler, it detects the already compressed response and applies its own compression again, leading to double compression.

Here's a relevant code snippet from the Prometheus client library:

handler := InstrumentMetricHandler(reg, HandlerForTransactional(mReg, HandlerOpts{DisableCompression: false}))

To avoid this double compression, you can either disable the compression in the Prometheus client library or avoid using CompressHandler for the /metrics endpoint. Here are two possible solutions:

  1. Disable Compression in Prometheus Client Library:

    You can disable the default compression by setting DisableCompression to true in the HandlerOpts:

    router.Handle("/metrics", promhttp.HandlerFor(prometheus.DefaultGatherer, promhttp.HandlerOpts{DisableCompression: true}))
  2. Avoid Using CompressHandler for /metrics Endpoint:

    You can selectively apply CompressHandler to other routes but not to the /metrics endpoint:

    router := mux.NewRouter()
    router.Handle("/metrics", promhttp.Handler())
    // Apply CompressHandler to other routes if needed
    http.ListenAndServe(":8080", router)

By implementing one of these solutions, you should be able to avoid the double compression issue and have the /metrics endpoint return plain text as expected [1][2].

To continue talking to Dosu, mention @dosu.

@dosubot dosubot bot added the bug label Aug 21, 2024
@Mattie112
Author

Just to confirm:

router.Handle("/metrics", promhttp.HandlerFor(prometheus.DefaultGatherer, promhttp.HandlerOpts{DisableCompression: true}))

Has no effect for me, it is still double compressed.

And the 2nd option, as far as I know, is not possible: I can only add the CompressHandler to the entire server.

(cool AI service btw!)

@ArthurSens
Member

(cool AI service btw!)

We're giving it a try, looks useful as a quick first responder :)

It seems like dosubot was going in the right direction. DisableCompression should disable the gzip compression, but I'm a bit disappointed that it suggested exactly the same thing after you said it didn't work.

I tried to reproduce the problem and confirmed! I think we have a bug related to #1496, although not clear to me what the bug is at first glance. When compression is disabled, the client defaults to no compression[1][2].

@bwplotka @mrueg, maybe you have an idea?

@ArthurSens
Member

ArthurSens commented Aug 22, 2024

And the 2nd option, as far as I know, is not possible: I can only add the CompressHandler to the entire server.

I was able to spin up two webservers, one compressed and one without compression, with the following snippet:

package main

import (
	"log"
	"net/http"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/collectors"
	"github.com/prometheus/client_golang/prometheus/promhttp"

	"github.com/gorilla/handlers"
	"github.com/gorilla/mux"
)

func main() {
	// Create non-global registry.
	reg := prometheus.NewRegistry()

	// Add go runtime metrics and process collectors.
	reg.MustRegister(
		collectors.NewGoCollector(),
		collectors.NewProcessCollector(collectors.ProcessCollectorOpts{}),
	)

	// Expose handlers in separate goroutine.
	go func() {
		http.Handle("/metrics", promhttp.HandlerFor(reg, promhttp.HandlerOpts{Registry: reg, DisableCompression: true}))
		log.Fatal(http.ListenAndServe(":8080", nil))
	}()

	go func() {
		router := mux.NewRouter()
		router.Handle("/metrics", promhttp.HandlerFor(reg, promhttp.HandlerOpts{Registry: reg}))
		log.Fatal(http.ListenAndServe(":8081", handlers.CompressHandler(router)))
	}()

	select {}
}

@ArthurSens
Member

Wow, dosubot just erased its second comment after I said it was not a good comment 🤯

@bwplotka
Member

bwplotka commented Aug 23, 2024

Thanks for the report and investigation!

With v1.19 I could access the /metrics endpoint in my browser and it would work. Now with v1.20(.1) I get to see gzipped content.

I investigated a bit and it looks like the response is NOT double compressed; the compression itself works fine. Specifically, gorilla/handlers#157 helps.

The only problem is that with the new promhttp version we now set the response Content-Encoding header to identity when we don't compress.

[screenshot: response headers]

This interacts badly with the gorilla compression layer, which compresses to gzip but does NOT update the response Content-Encoding header, resulting in:

content-encoding: identity
content-type: text/plain; version=0.0.4; charset=utf-8; escaping=values
date: Fri, 23 Aug 2024 09:42:08 GMT
transfer-encoding: chunked
vary: Accept-Encoding

...and then you see weird behaviour from browsers or other clients.

Removing rsp.Header().Set(contentEncodingHeader, encodingHeader) results in everything working correctly with gorilla compression:

content-encoding: gzip
content-type: text/plain; version=0.0.4; charset=utf-8; escaping=values
date: Fri, 23 Aug 2024 09:49:15 GMT
transfer-encoding: chunked
vary: Accept-Encoding

Fix

This is actually our bug, not the gorilla lib's, because they cannot set headers AFTER the response has started being written (https://github.com/gorilla/handlers/blob/main/compress.go#L118). As a fix we need to:

  • NOT set "Content-Encoding" when we don't compress, as before (this is the regression), allowing others to cleanly set and change the content encoding for us.
  • Additionally, we could error when "Content-Encoding" is already set on the response writer AND we are compressing our response (failing fast on bugs that would cause double compression); see the sketch below.
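
A rough sketch of what that could look like (hypothetical helper, not the actual promhttp code; the real fix may end up looking different):

package sketch

import (
	"compress/gzip"
	"errors"
	"io"
	"net/http"
	"strings"
)

// negotiateWriter decides whether the handler should gzip its own response.
// It only sets Content-Encoding when it actually compresses, and it errors
// out if some outer layer already claimed an encoding for this response.
func negotiateWriter(w http.ResponseWriter, r *http.Request, disableCompression bool) (io.Writer, func() error, error) {
	if disableCompression || !strings.Contains(r.Header.Get("Accept-Encoding"), "gzip") {
		// Not compressing: leave Content-Encoding untouched so outer
		// middleware such as gorilla's CompressHandler can still gzip
		// the response and set the header itself.
		return w, func() error { return nil }, nil
	}
	if enc := w.Header().Get("Content-Encoding"); enc != "" {
		// Someone upstream already set an encoding; compressing again
		// here would produce a double-compressed body, so fail fast.
		return nil, nil, errors.New("Content-Encoding already set to " + enc)
	}
	w.Header().Set("Content-Encoding", "gzip")
	gz := gzip.NewWriter(w)
	return gz, gz.Close, nil
}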

@ArthurSens, would you like to do the fix? 🤗

@Mattie112
Author

(cool AI service btw!)

We're giving it a try, looks useful as a quick first responder :)

It seems like dosubot was going in the right direction. DisableCompression should disable the gzip compression, but I'm a bit disappointed that it suggested exactly the same thing after you said it didn't work.

I tried to reproduce the problem and confirmed! I think we have a bug related to #1496, although not clear to me what the bug is at first glance. When compression is disabled, the client defaults to no compression[1][2].

@bwplotka @mrueg, maybe you have an idea?

Yeah, I mean the first post was pretty OK. It suggested one thing that I had not tested yet (although I was quite certain that it was a bug somewhere). But I think it is a nice addition. I did indeed see another response (even though I did not tag the bot), but now I don't see it. Perhaps it took your negative response as a cue to delete the post? That is a bit strange, perhaps?

And the 2nd option, as far as I know, is not possible: I can only add the CompressHandler to the entire server.

I was able to spin up two webservers, one compressed and one without compression, with the snippet above.

That indeed is an option but that would require me to make a code change to all my services (and we have quite a few) :)

Thank you for your thorough investigation and explanation. Looking forward to a fix. I am really happy that I don't need to make code changes to all our microservices :)

@mrueg
Contributor

mrueg commented Aug 23, 2024

Thanks for the investigation @bwplotka and apologies for introducing an unintended bug here.

Reading through RFC 2616, it seems like we should not set Content-Encoding: identity at all.

   identity
        The default (identity) encoding; the use of no transformation
        whatsoever. This content-coding is used only in the Accept-
        Encoding header, and SHOULD NOT be used in the Content-Encoding
        header.

mrueg added a commit to mrueg/prometheus-client_golang that referenced this issue Aug 23, 2024
mrueg added a commit to mrueg/prometheus-client_golang that referenced this issue Aug 23, 2024
mrueg added a commit to mrueg/prometheus-client_golang that referenced this issue Aug 23, 2024
@ArthurSens
Member

ArthurSens commented Aug 23, 2024

Fixed by #1596 and v1.20.2 is out with the fix

Thanks @mrueg for fixing it!
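
For anyone following along: bumping the dependency in your go.mod picks up the fix.

require github.com/prometheus/client_golang v1.20.2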

@Mattie112
Author

Hi there,

Just to confirm: this new release fixed the issue for us. Many thanks for the quick response and fix!
