When running s3cmd to add headers, the operation fails if the object already has an existing header containing Unicode characters.
We store user uploads under a hashed object name in a bucket, and during upload we add a Content-Disposition: filename="originalFileName.jpg" header, so when the user later downloads the image they get the original file name instead of the ugly hashed object name.
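For context, the upload step looks roughly like this (bucket, key, and file names here are made up; --add-header is the standard s3cmd option for attaching extra headers):

```sh
# Hypothetical example: upload under a hashed key, attaching the original
# file name in a Content-Disposition header so downloads get a nice name.
s3cmd put \
  --add-header='Content-Disposition: attachment; filename="originalFileName.jpg"' \
  ./originalFileName.jpg \
  s3://my-uploads-bucket/3f8a9c1d7e.jpg
```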
I am attempting to add Cache-Control to all objects in the bucket and discovered that if a Unicode character is already present in the headers (in our case in the originalFileName.jpg), it fails with a decode error.
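The failing step is roughly the following (again with made-up names, and assuming an s3cmd version whose modify command accepts --recursive):

```sh
# Hypothetical example: add Cache-Control to every object in the bucket.
# This is the operation that hits the decode error when an object already
# carries a Content-Disposition header with Unicode in the file name.
s3cmd modify --recursive \
  --add-header='Cache-Control: max-age=31536000' \
  s3://my-uploads-bucket/
```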
Is there a workaround or is this a known issue?
@dzollinger Can you test the latest master? I just pushed some fixes that I think will solve your issue.
With that, having Unicode in custom headers will no longer trigger errors.
By the way, I did notice that s3cmd has a "content-disposition" argument that is not really used except by the command that generates a "signed url". As an enhancement, it would be good to improve this option so it can build a proper Content-Disposition value with "filename*=", but that is not the case currently.
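In the meantime, a header of that shape can be spelled out by hand with --add-header; the value below is only an illustration of the RFC 6266/5987 "filename*=" form, with é percent-encoded as UTF-8:

```sh
# Hypothetical example: write the RFC 5987 encoding yourself until the
# --content-disposition option can build it for you.
s3cmd put \
  --add-header="Content-Disposition: attachment; filename*=UTF-8''r%C3%A9sum%C3%A9.jpg" \
  ./resume.jpg \
  s3://my-uploads-bucket/7b2e0d4a9c.jpg
```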