s3_object - fix MemoryError when downloading large files (#2108) (#2109)
[PR #2108/48292178 backport][stable-8] s3_object - fix MemoryError when downloading large files

This is a backport of PR #2108 as merged into main (4829217).
SUMMARY
fixes: #2107
The refactor in #1139 causes the whole file to be read into memory during downloads; that downloaded content was then being thrown away.
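For context on why this matters: reading the response body of get_object materializes the entire object in process memory, while download_file streams it to disk in bounded chunks. A minimal standalone boto3 sketch of the two patterns (the bucket and key names are placeholders; this illustrates the general pattern, not the module's actual _get_object_content helper):

import boto3

s3 = boto3.client("s3")

# Problematic pattern: get_object returns a streaming body, and calling
# .read() on it pulls the whole object into memory at once. For a
# multi-gigabyte object this is what raises MemoryError.
content = s3.get_object(Bucket="example-bucket", Key="large-file.bin")["Body"].read()

# Streaming pattern: download_file hands the transfer to boto3's transfer
# manager, which writes to disk in chunks, so memory use stays bounded
# regardless of object size.
s3.download_file("example-bucket", "large-file.bin", "/tmp/large-file.bin")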
ISSUE TYPE

Bugfix Pull Request

COMPONENT NAME
s3_object
ADDITIONAL INFORMATION

Reviewed-by: Mark Chappell
patchback[bot] authored May 29, 2024
1 parent f139aa4 commit cc45aaa
Showing 2 changed files with 2 additions and 3 deletions.
changelogs/fragments/2107-s3_download.yml (2 additions, 0 deletions)
@@ -0,0 +1,2 @@
+bugfixes:
+- s3_object - fixed issue which was causing ``MemoryError`` exceptions when downloading large files (https://github.com/ansible-collections/amazon.aws/issues/2107).
plugins/modules/s3_object.py (0 additions, 3 deletions)
@@ -783,9 +783,6 @@ def upload_s3file(
 def download_s3file(module, s3, bucket, obj, dest, retries, version=None):
     if module.check_mode:
         module.exit_json(msg="GET operation skipped - running in check mode", changed=True)
-    # retries is the number of loops; range/xrange needs to be one
-    # more to get that count of loops.
-    _get_object_content(module, s3, bucket, obj, version)
 
     optional_kwargs = {"ExtraArgs": {"VersionId": version}} if version else {}
     for x in range(0, retries + 1):
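With the stray call removed, download_s3file falls straight through to the retry loop visible in the context lines above, which already streams via download_file. A simplified sketch of how that loop likely proceeds (assumption: the real module's exception handling and backoff are more involved than shown here):

# Simplified sketch; module, s3, bucket, obj, dest, retries and version
# are the parameters of download_s3file above. The real module catches
# specific botocore error codes rather than a bare Exception.
optional_kwargs = {"ExtraArgs": {"VersionId": version}} if version else {}
for x in range(0, retries + 1):
    try:
        # download_file streams the object to `dest` in chunks; no full
        # in-memory copy of the object is ever created.
        s3.download_file(bucket, obj, dest, **optional_kwargs)
        module.exit_json(msg="GET operation complete", changed=True)
    except Exception as e:  # placeholder for the module's real error handling
        if x >= retries:
            module.fail_json(msg=f"Failed to download {obj} after {retries} retries: {e}")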
