
No brotli in zlib #267

Closed
bmansfie opened this issue Jul 6, 2022 · 58 comments · Fixed by #10722
Labels
bug Something isn't working node.js Compatibility with Node.js APIs

Comments

@bmansfie commented Jul 6, 2022

This issue has been fixed as of Bun v1.1.8 in https://bun.sh/blog/bun-v1.1.8

Original issue below


The available zlib implementation doesn't support brotli. It's also significantly different from the current Node.js structure of the library.

@Jarred-Sumner Jarred-Sumner added bug Something isn't working node.js Compatibility with Node.js APIs labels Jul 7, 2022
@Jarred-Sumner (Collaborator)

Bun doesn't currently include a brotli implementation, but it really should

@Kyza commented Jul 16, 2022

The Node zlib API isn't very well structured or documented. It would be great if Bun included a better-designed API, keeping the Node version only for compatibility.

The helper function definition here shows the options, but then has a params object inside of that whose keys are constants from zlib.constants. And there are a lot of constants defined there.

@Lutymane

Any progress for Brotli support?

@marcospassos

This is the only thing that prevents us from using Bun :/

@bmansfie (Author)

We're in the same boat; this is the last thing missing from the compatibility list.

@jefer94 commented Nov 28, 2023

@bmansfie do you work for Bun? If so, this should be good news for me. I can't even write the binding myself, because even the @axios team hasn't said anything that could help me solve this issue. I'm disappointed with how both teams have been handling this issue for two years.

If you're not a Bun staff member, could someone tell me what requirements need to be met? Or, if a library has already been chosen to be bound to Bun, it would be great to write that down in an issue; otherwise I'll have to deal with it manually.

@hker9527

> (quoting @jefer94's question above)

This line determines whether the current environment supports brotli:

const isBrotliSupported = utils.isFunction(zlib.createBrotliDecompress);

The current (imperfect) workaround is to manually edit this line in node_modules/axios/dist/node/axios.cjs to tell axios not to use brotli.

Not a clean solution, but works for me.

@jefer94 commented Dec 2, 2023

Hey @colinhacks @paperdave @Jarred-Sumner @antongolub, I read that Brotli is implemented but unoptimized. What do you know about this? I need to finish this task.

@jefer94 commented Dec 2, 2023

Is this optimization issue related to #6299?

@buravlev-arthur commented Dec 5, 2023

> (quoting @hker9527's axios workaround above)

You can disable axios's default decompression and set the responseType option to arraybuffer. Also request only gzip compression from the server (header 'Accept-Encoding': 'gzip'). Then decompress the buffer with zlib.gunzip() and parse it as JSON, text, or HTML. Here is code example.

@jefer94 commented Dec 5, 2023

I would prefer any other compression algorithm that is supported according to caniuse.com

import pickle
import requests
import timeit
import sys
import zlib
import zstandard as zstd
import brotli

url = 'https://your.url/here'

# Fetch sample data and serialize it to bytes for compression
response = requests.get(url)
data = response.json()
serialized_data = pickle.dumps(data)


# Benchmarking function
def benchmark(function, iterations=10):
    r = timeit.repeat(function, repeat=3, number=iterations)
    return sum(r) / len(r)  # Average time per iteration


# Benchmark compression for "Gzip" across different levels
# (note: zlib.compress produces raw zlib/deflate output, not a gzip
# container, so these numbers match the Deflate section below)
gzip_results = {}
for level in range(1, 10):
    compressed_data = zlib.compress(serialized_data, level)
    compression_ratio = (sys.getsizeof(compressed_data) / sys.getsizeof(serialized_data)) * 100
    compressed_size = sys.getsizeof(compressed_data)
    gzip_results[level] = {
        'compression_time': benchmark(lambda: zlib.compress(serialized_data, level), iterations=5),
        'decompression_time': benchmark(lambda: zlib.decompress(compressed_data), iterations=5),
        'compression_ratio': compression_ratio,
        'compressed_size': compressed_size
    }

# Benchmark serialization and compression for Deflate across different levels
deflate_results = {}
for level in range(1, 10):
    compressed_data = zlib.compress(serialized_data, level)
    compression_ratio = (sys.getsizeof(compressed_data) / sys.getsizeof(serialized_data)) * 100
    compressed_size = sys.getsizeof(compressed_data)
    deflate_results[level] = {
        'compression_time': benchmark(lambda: zlib.compress(serialized_data, level), iterations=5),
        'decompression_time': benchmark(lambda: zlib.decompress(compressed_data), iterations=5),
        'compression_ratio': compression_ratio,
        'compressed_size': compressed_size
    }

# Benchmark serialization and compression for Zstandard across different levels
zstd_results = {}
for level in range(1, 23):
    cctx = zstd.ZstdCompressor(level=level)
    compressed_data = cctx.compress(serialized_data)
    compression_ratio = (sys.getsizeof(compressed_data) / sys.getsizeof(serialized_data)) * 100
    compressed_size = sys.getsizeof(compressed_data)
    zstd_results[level] = {
        'compression_time': benchmark(lambda: cctx.compress(serialized_data), iterations=5),
        'decompression_time': benchmark(lambda: zstd.ZstdDecompressor().decompress(compressed_data),
                                        iterations=5),
        'compression_ratio': compression_ratio,
        'compressed_size': compressed_size
    }

# Benchmark serialization and compression for Brotli across different quality levels
brotli_results = {}
for quality in range(1, 12):
    compressed_data = brotli.compress(serialized_data, quality=quality)
    compression_ratio = (sys.getsizeof(compressed_data) / sys.getsizeof(serialized_data)) * 100
    compressed_size = sys.getsizeof(compressed_data)
    brotli_results[quality] = {
        'compression_time': benchmark(lambda: brotli.compress(serialized_data, quality=quality),
                                      iterations=5),
        'decompression_time': benchmark(lambda: brotli.decompress(compressed_data), iterations=5),
        'compression_ratio': compression_ratio,
        'compressed_size': compressed_size
    }

# Print benchmark results for Gzip
print("Gzip Serialization (compression/decompression) results:")
for level, result in gzip_results.items():
    print(f"Gzip (Level {level}):")
    print(f"  Compression Time: {result['compression_time']:.6f} seconds")
    print(f"  Decompression Time: {result['decompression_time']:.6f} seconds")
    print(f"  Compression Ratio: {result['compression_ratio']:.2f}%")
    print(f"  Compressed Size: {result['compressed_size']} bytes\n")

# Print benchmark results for Deflate
print("\nDeflate Serialization (compression/decompression) results:")
for level, result in deflate_results.items():
    print(f"Deflate (Level {level}):")
    print(f"  Compression Time: {result['compression_time']:.6f} seconds")
    print(f"  Decompression Time: {result['decompression_time']:.6f} seconds")
    print(f"  Compression Ratio: {result['compression_ratio']:.2f}%")
    print(f"  Compressed Size: {result['compressed_size']} bytes\n")

# Print benchmark results for Zstandard
print("\nZstandard Serialization (compression/decompression) results:")
for level, result in zstd_results.items():
    print(f"Zstandard (Level {level}):")
    print(f"  Compression Time: {result['compression_time']:.6f} seconds")
    print(f"  Decompression Time: {result['decompression_time']:.6f} seconds")
    print(f"  Compression Ratio: {result['compression_ratio']:.2f}%")
    print(f"  Compressed Size: {result['compressed_size']} bytes\n")

# Print benchmark results for Brotli
print("\nBrotli Serialization (compression/decompression) results:")
for quality, result in brotli_results.items():
    print(f"Brotli (Quality {quality}):")
    print(f"  Compression Time: {result['compression_time']:.6f} seconds")
    print(f"  Decompression Time: {result['decompression_time']:.6f} seconds")
    print(f"  Compression Ratio: {result['compression_ratio']:.2f}%")
    print(f"  Compressed Size: {result['compressed_size']} bytes\n")
Gzip Serialization (compression/decompression) results:
Gzip (Level 1):
  Compression Time: 0.131746 seconds
  Decompression Time: 0.046795 seconds
  Compression Ratio: 36.89%
  Compressed Size: 1167235 bytes

Gzip (Level 2):
  Compression Time: 0.148521 seconds
  Decompression Time: 0.044622 seconds
  Compression Ratio: 35.26%
  Compressed Size: 1115833 bytes

Gzip (Level 3):
  Compression Time: 0.192732 seconds
  Decompression Time: 0.046630 seconds
  Compression Ratio: 33.97%
  Compressed Size: 1074807 bytes

Gzip (Level 4):
  Compression Time: 0.213282 seconds
  Decompression Time: 0.047072 seconds
  Compression Ratio: 32.40%
  Compressed Size: 1025204 bytes

Gzip (Level 5):
  Compression Time: 0.348723 seconds
  Decompression Time: 0.046816 seconds
  Compression Ratio: 31.36%
  Compressed Size: 992237 bytes

Gzip (Level 6):
  Compression Time: 0.493173 seconds
  Decompression Time: 0.045413 seconds
  Compression Ratio: 30.97%
  Compressed Size: 980071 bytes

Gzip (Level 7):
  Compression Time: 0.541998 seconds
  Decompression Time: 0.043903 seconds
  Compression Ratio: 30.92%
  Compressed Size: 978442 bytes

Gzip (Level 8):
  Compression Time: 0.609642 seconds
  Decompression Time: 0.044365 seconds
  Compression Ratio: 30.90%
  Compressed Size: 977862 bytes

Gzip (Level 9):
  Compression Time: 0.589004 seconds
  Decompression Time: 0.042475 seconds
  Compression Ratio: 30.90%
  Compressed Size: 977862 bytes


Deflate Serialization (compression/decompression) results:
Deflate (Level 1):
  Compression Time: 0.128636 seconds
  Decompression Time: 0.046544 seconds
  Compression Ratio: 36.89%
  Compressed Size: 1167235 bytes

Deflate (Level 2):
  Compression Time: 0.148510 seconds
  Decompression Time: 0.043960 seconds
  Compression Ratio: 35.26%
  Compressed Size: 1115833 bytes

Deflate (Level 3):
  Compression Time: 0.191041 seconds
  Decompression Time: 0.043908 seconds
  Compression Ratio: 33.97%
  Compressed Size: 1074807 bytes

Deflate (Level 4):
  Compression Time: 0.197751 seconds
  Decompression Time: 0.044402 seconds
  Compression Ratio: 32.40%
  Compressed Size: 1025204 bytes

Deflate (Level 5):
  Compression Time: 0.309715 seconds
  Decompression Time: 0.043225 seconds
  Compression Ratio: 31.36%
  Compressed Size: 992237 bytes

Deflate (Level 6):
  Compression Time: 0.478999 seconds
  Decompression Time: 0.041243 seconds
  Compression Ratio: 30.97%
  Compressed Size: 980071 bytes

Deflate (Level 7):
  Compression Time: 0.523563 seconds
  Decompression Time: 0.041796 seconds
  Compression Ratio: 30.92%
  Compressed Size: 978442 bytes

Deflate (Level 8):
  Compression Time: 0.591571 seconds
  Decompression Time: 0.044189 seconds
  Compression Ratio: 30.90%
  Compressed Size: 977862 bytes

Deflate (Level 9):
  Compression Time: 0.591415 seconds
  Decompression Time: 0.043805 seconds
  Compression Ratio: 30.90%
  Compressed Size: 977862 bytes


Zstandard Serialization (compression/decompression) results:
Zstandard (Level 1):
  Compression Time: 0.014630 seconds
  Decompression Time: 0.004104 seconds
  Compression Ratio: 13.06%
  Compressed Size: 413334 bytes

Zstandard (Level 2):
  Compression Time: 0.011026 seconds
  Decompression Time: 0.002880 seconds
  Compression Ratio: 7.55%
  Compressed Size: 238832 bytes

Zstandard (Level 3):
  Compression Time: 0.010607 seconds
  Decompression Time: 0.002330 seconds
  Compression Ratio: 5.35%
  Compressed Size: 169132 bytes

Zstandard (Level 4):
  Compression Time: 0.010927 seconds
  Decompression Time: 0.002252 seconds
  Compression Ratio: 5.30%
  Compressed Size: 167705 bytes

Zstandard (Level 5):
  Compression Time: 0.022500 seconds
  Decompression Time: 0.002075 seconds
  Compression Ratio: 4.75%
  Compressed Size: 150148 bytes

Zstandard (Level 6):
  Compression Time: 0.030029 seconds
  Decompression Time: 0.002032 seconds
  Compression Ratio: 4.44%
  Compressed Size: 140424 bytes

Zstandard (Level 7):
  Compression Time: 0.035606 seconds
  Decompression Time: 0.001970 seconds
  Compression Ratio: 4.38%
  Compressed Size: 138676 bytes

Zstandard (Level 8):
  Compression Time: 0.042404 seconds
  Decompression Time: 0.001982 seconds
  Compression Ratio: 4.30%
  Compressed Size: 135926 bytes

Zstandard (Level 9):
  Compression Time: 0.041453 seconds
  Decompression Time: 0.001819 seconds
  Compression Ratio: 3.97%
  Compressed Size: 125507 bytes

Zstandard (Level 10):
  Compression Time: 0.055232 seconds
  Decompression Time: 0.001803 seconds
  Compression Ratio: 3.91%
  Compressed Size: 123756 bytes

Zstandard (Level 11):
  Compression Time: 0.067030 seconds
  Decompression Time: 0.001763 seconds
  Compression Ratio: 3.88%
  Compressed Size: 122782 bytes

Zstandard (Level 12):
  Compression Time: 0.074708 seconds
  Decompression Time: 0.001779 seconds
  Compression Ratio: 3.88%
  Compressed Size: 122773 bytes

Zstandard (Level 13):
  Compression Time: 0.320558 seconds
  Decompression Time: 0.001800 seconds
  Compression Ratio: 3.86%
  Compressed Size: 122225 bytes

Zstandard (Level 14):
  Compression Time: 0.514192 seconds
  Decompression Time: 0.001885 seconds
  Compression Ratio: 3.82%
  Compressed Size: 120726 bytes

Zstandard (Level 15):
  Compression Time: 0.776974 seconds
  Decompression Time: 0.001669 seconds
  Compression Ratio: 3.78%
  Compressed Size: 119665 bytes

Zstandard (Level 16):
  Compression Time: 0.745292 seconds
  Decompression Time: 0.001817 seconds
  Compression Ratio: 3.72%
  Compressed Size: 117706 bytes

Zstandard (Level 17):
  Compression Time: 0.882279 seconds
  Decompression Time: 0.001811 seconds
  Compression Ratio: 3.68%
  Compressed Size: 116327 bytes

Zstandard (Level 18):
  Compression Time: 0.785868 seconds
  Decompression Time: 0.001999 seconds
  Compression Ratio: 3.65%
  Compressed Size: 115567 bytes

Zstandard (Level 19):
  Compression Time: 1.235092 seconds
  Decompression Time: 0.001685 seconds
  Compression Ratio: 3.62%
  Compressed Size: 114656 bytes

Zstandard (Level 20):
  Compression Time: 1.140070 seconds
  Decompression Time: 0.001782 seconds
  Compression Ratio: 3.62%
  Compressed Size: 114656 bytes

Zstandard (Level 21):
  Compression Time: 1.689877 seconds
  Decompression Time: 0.001728 seconds
  Compression Ratio: 3.62%
  Compressed Size: 114409 bytes

Zstandard (Level 22):
  Compression Time: 2.818275 seconds
  Decompression Time: 0.001952 seconds
  Compression Ratio: 3.61%
  Compressed Size: 114149 bytes


Brotli Serialization (compression/decompression) results:
Brotli (Quality 1):
  Compression Time: 0.031643 seconds
  Decompression Time: 0.018773 seconds
  Compression Ratio: 17.61%
  Compressed Size: 557157 bytes

Brotli (Quality 2):
  Compression Time: 0.025418 seconds
  Decompression Time: 0.007376 seconds
  Compression Ratio: 5.78%
  Compressed Size: 183002 bytes

Brotli (Quality 3):
  Compression Time: 0.028109 seconds
  Decompression Time: 0.006453 seconds
  Compression Ratio: 5.55%
  Compressed Size: 175494 bytes

Brotli (Quality 4):
  Compression Time: 0.040346 seconds
  Decompression Time: 0.005264 seconds
  Compression Ratio: 4.38%
  Compressed Size: 138659 bytes

Brotli (Quality 5):
  Compression Time: 0.071548 seconds
  Decompression Time: 0.006076 seconds
  Compression Ratio: 4.03%
  Compressed Size: 127586 bytes

Brotli (Quality 6):
  Compression Time: 0.080487 seconds
  Decompression Time: 0.005724 seconds
  Compression Ratio: 3.89%
  Compressed Size: 123166 bytes

Brotli (Quality 7):
  Compression Time: 0.090228 seconds
  Decompression Time: 0.005666 seconds
  Compression Ratio: 3.82%
  Compressed Size: 120802 bytes

Brotli (Quality 8):
  Compression Time: 0.100304 seconds
  Decompression Time: 0.005359 seconds
  Compression Ratio: 3.77%
  Compressed Size: 119300 bytes

Brotli (Quality 9):
  Compression Time: 0.158134 seconds
  Decompression Time: 0.005358 seconds
  Compression Ratio: 3.74%
  Compressed Size: 118328 bytes

Brotli (Quality 10):
  Compression Time: 1.688359 seconds
  Decompression Time: 0.005695 seconds
  Compression Ratio: 3.51%
  Compressed Size: 111109 bytes

Brotli (Quality 11):
  Compression Time: 5.849162 seconds
  Decompression Time: 0.006769 seconds
  Compression Ratio: 3.74%
  Compressed Size: 118307 bytes

I'm surprised that Zstandard at level 22 comes close to Brotli at quality 11. I ran other benchmarks with algorithms that in theory should reach a 3.xx% compression ratio, but for some reason I can't replicate those results. Bun already supports gzip, but with those steps you could implement another algorithm, or let Bun and Axios handle this responsibility.

@M-Gonzalo

How is Bun announcing v1.x.x given that its stated purpose is to be a Node.js drop-in replacement and there are still hundreds of breaking issues like this one? It feels misleading at best.

@jefer94 commented Dec 11, 2023

Actually, the only problem here is that they haven't documented which goals need to be met before re-enabling Brotli.

@itsjavi commented Dec 13, 2023

For anyone here having issues with axios, here is a quick workaround to make axios use gzip instead of Brotli:

axios.defaults.headers.common["Accept-Encoding"] = "gzip";

Then all your axios instances will use gzip.

Not ideal, but that's the only way I found to make it work with Bun.

@naderslr

> (quoting @itsjavi's gzip workaround above)

You are the best!!! Been struggling for a while.

@xiaoxiunique

So for this problem that has existed for almost 2 years, there is no solution?

@Jarred-Sumner (Collaborator)

> So for this problem that has existed for almost 2 years, there is no solution?

You should expect this issue to be fixed in the next 2 months. We are really focused on Windows right now. I have a branch that mostly implements this already, but we cannot prioritize finishing it until Windows + more bugs are fixed.

@xiaoxiunique

> So for this problem that has existed for almost 2 years, there is no solution?

> You should expect this issue to be fixed in the next 2 months. We are really focused on Windows right now. I have a branch that mostly implements this already, but we cannot prioritize finishing it until Windows + more bugs are fixed.

got it. thanks

@zolero commented Apr 16, 2024

@paulGeoghegan let me give you the solution! We got stuck here as well but managed to just patch the axios JS files to remove Brotli.

What you should do is install custompatch. More information here: https://github.com/tmcdos/custompatch

Go to axios in node_modules and apply this patch. Search for the matching lines from our patch file:

Index: /axios/dist/node/axios.cjs
===================================================================
--- /axios/dist/node/axios.cjs
+++ /axios/dist/node/axios.cjs
@@ -2873,9 +2873,9 @@
     }
 
     headers.set(
       'Accept-Encoding',
-      'gzip, compress, deflate' + (isBrotliSupported ? ', br' : ''), false
+      'gzip, compress, deflate', false
       );
 
     const options = {
       path,
Index: /axios/lib/adapters/http.js
===================================================================
--- /axios/lib/adapters/http.js
+++ /axios/lib/adapters/http.js
@@ -34,9 +34,9 @@
   flush: zlib.constants.BROTLI_OPERATION_FLUSH,
   finishFlush: zlib.constants.BROTLI_OPERATION_FLUSH
 }
 
-const isBrotliSupported = utils.isFunction(zlib.createBrotliDecompress);
+const isBrotliSupported = false;
 
 const {http: httpFollow, https: httpsFollow} = followRedirects;
 
 const isHttps = /https:?/;

Once you've made these changes, run:
npx custompatch axios

This will create the patch files.

Finally, add this to your package.json so that the patch is applied again every time the package is installed:

"scripts": {
   ...others
    "postinstall": "bunx custompatch"
  },

@paulGeoghegan

@zolero you have saved me! Thank you so much. I was preparing to switch back to Node when I got your email. That's a fantastic package and I can't believe I haven't heard of it before. So many times I have wanted to apply a patch to a package, but it is usually a lot of work. Again, thank you; I love using Bun and it would have been a shame.

@Grandnainconnu commented Apr 22, 2024

> (quoting @zolero's custompatch instructions above)

This works only if you have Node installed, though. Otherwise you end up with:

root@176c5bb96fbf:~/.bun/install/global/node_modules# bunx custompatch
2 | const MiniPass = require('minipass')
3 | const EE = require('events').EventEmitter
4 | const fs = require('fs')
5 | 
6 | // for writev
7 | const binding = process.binding('fs')
                    ^
error: process.binding("fs") is not implemented in Bun. Track the status & thumbs up the issue: https://github.com/oven-sh/bun/issues/3546

When running from the Bun Docker image, custompatch always fails.

PS: I was able to achieve the same result with the Docker image by following #2336

@paulGeoghegan

@Grandnainconnu yeah, I know. I had to stop using my dev container and just install Bun for Windows. I'm actually having trouble with the patches, but it has fixed my issue for now. It's not ideal, but at least now I'm not stuck.

@iamacup commented Apr 28, 2024

Just for visibility: #7248 - it looks like something changed in a recent version of Bun that has made this issue more problematic, in case this helps anyone coming from Google.

@kingkong404

> Just for visibility: #7248 - looks like something may have changed in a recent version of bunjs that has made this issue become more problematic now, in case this helps anyone from google.

Can confirm something in recent versions of Bun has caused this to break

@samyarkd commented May 5, 2024

axios is used in a lot of dependencies; this should be fixed soon.

@shimizudev

There's a pull request for it; I hope it gets merged soon so that we'll have Brotli in the next version of Bun.

@Jarred-Sumner (Collaborator) commented May 7, 2024

Bun v1.1.8 will include support for Brotli in node:zlib, thanks to @nektro. Bun v1.1.8 will release Wednesday.

In about an hour, you can try it out via

bun upgrade --canary

@shimizudev

HUGEEEE

@touhidurrr

does this include zstd too?

@M-Gonzalo

> does this include zstd too?

It shouldn't. They're separate issues. Also, zstd is not a de facto web standard like brotli is, no matter how nice the algorithm / default implementation might be.

This was referenced May 8, 2024
@touhidurrr commented May 8, 2024

> it shouldn't. They're separate issues. Also, zstd is not a de facto web standard like brotli is, no matter how nice the algorithm / default implementation might be.

@M-Gonzalo I wanted to know if I still have to apply the header-modification patch on axios for safety, as mentioned in #267 (comment), since axios's default Accept-Encoding header includes zstd.

@Jarred-Sumner (Collaborator)

> zstd

Where do you see this in Axios' codebase?

@M-Gonzalo

> (quoting @touhidurrr's question above)

I don't know what to say about that. There isn't a single mention of zstd in axios' source code.

[screenshot: search for "zstd" in the axios repository, no results]

@touhidurrr commented May 8, 2024

> zstd

> Where do you see this in Axios' codebase?

in #267 (comment):

> Normally, this would be
>
> Accept-Encoding: gzip, deflate, br, zstd

So, I thought it was the default behaviour. However, you are right. It seems like there is no default Accept-Encoding header. I couldn’t find any such thing in the docs either.

axios default headers test

@nektro (Member) commented May 8, 2024

// server (Bun)
Bun.serve({
  fetch(req) {
    return new Response(JSON.stringify(req.headers));
  },
});

// client (Node REPL)
> axios.get("http://localhost:3000").then(x => console.log(x.data))
> {
  accept: 'application/json, text/plain, */*',
  'user-agent': 'axios/1.6.8',
  'accept-encoding': 'gzip, compress, deflate, br',
  host: 'localhost:3000',
  connection: 'keep-alive'
}

jinzishuai pushed a commit to aitok-ai/LibreChat that referenced this issue May 20, 2024
* fix(bun): fix bun compatibility to allow gzip header: oven-sh/bun#267 (comment)

* chore: update custom config examples

* fix(OpenAIClient.chatCompletion): remove redundant call of stream.controller.abort() as `break` aborts the request and prevents abort errors when not called redundantly

* chore: bump bun.lockb

* fix: remove result-thinking class when message is no longer streaming

* fix(bun): improve Bun support by forcing use of old method in bun env, also update old methods with new customizable params

* fix(ci): pass tests
@parwiwisaw

This shows up in Google search, so posting this: it was fixed in https://bun.sh/blog/bun-v1.1.8. I am using 1.1.17 and I don't see the issue after upgrading from 1.1.

BertKiv pushed a commit to BertKiv/LibreChat that referenced this issue Dec 10, 2024
@ivanfernandez2646

Hi @Jarred-Sumner, could you please add support for decompressing Brotli from streams? The createBrotliDecompress method from zlib returns an object that extends stream.Transform in Node.js:

**interface BrotliDecompress extends stream.Transform, Zlib {}**

Since Bun does not support stream.Transform and relies on its own streams API, could you provide support for this method and return a TransformStream instance?

Many thanks!!

klenovova added a commit to klenovova/TS_Librechat that referenced this issue Dec 24, 2024
Mycolapro added a commit to Mycolapro/TS_Chat that referenced this issue Dec 24, 2024
Maxtop-pro added a commit to Maxtop-pro/TypeScript_Chat that referenced this issue Dec 24, 2024