When I upload a file to S3 with, e.g., `Content-Type: application/json;charset=utf-8` and `Content-Encoding: gzip`, and then later download it with `aws s3 cp`, I want the downloaded version to be automatically decompressed based on the `Content-Encoding` field. This would match what happens in the AWS web interface. Since this would change existing behavior, it could be gated behind a flag. Otherwise, not only do I have to decompress the file myself based on its content encoding (which I may have to do extra work to determine), but I also end up with the confusing situation of a file with the extension `.json` that really should be `.json.gz` (but if I name it `.json.gz`, then I'm tying the file name to the encoding, which isn't what I want for transparent decompression). As for having the decompression utility available, `gzip` is installed on macOS and I'd imagine it is on Linux by default as well. Perhaps the CLI could decompress gzipped files as long as that utility is available, or else fail entirely.
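Until something like this exists in the CLI, the requested behavior can be approximated client-side. The sketch below is an assumption about how transparent decoding might work, not existing `aws s3 cp` behavior; the helper name `decode_body` and the commented boto3 wiring are hypothetical:

```python
import gzip
from typing import Optional

def decode_body(raw: bytes, content_encoding: Optional[str]) -> bytes:
    """Hypothetical helper: decompress the downloaded object body based on
    its Content-Encoding metadata, mirroring what browsers (and the AWS
    web console) do transparently."""
    if content_encoding == "gzip":
        return gzip.decompress(raw)
    # Absent or unrecognized encodings pass through unchanged.
    return raw

# Assumed usage with boto3 (GetObject returns ContentEncoding alongside
# the body, so the file name never has to encode the compression):
# obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
# data = decode_body(obj["Body"].read(), obj.get("ContentEncoding"))
```

Keying the decision off the object's metadata rather than a `.gz` suffix is the point of the request: the local file can keep its `.json` name regardless of how it was stored.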
Hi @michaeleisel, thanks for reaching out. This request overlaps with #1131 and boto/boto3#1257. It is also being tracked in the botocore repository: boto/botocore#1255. We are tracking the request in botocore because it would apply to both the CLI and boto3. You can 👍 that issue to show your support, or add a comment if you want to share more information about your use case or proposed implementation.
Comments on closed issues are hard for our team to see.
If you need more assistance, please open a new issue that references this one. If you wish to keep having a conversation with other community members under this issue, feel free to do so.