RuntimeError: dictionary changed size during iteration #945
Comments
Hello, do you have any news, please?
@LocutusOfBorg sorry for the lack of updates. I've been a little behind lately because solving one specific hard issue slowed me down for some time, but I see the origin of this one and I hope to have it fixed, along with the others, by this weekend.
Thanks a lot! Don't worry!
I'm seeing the same thing now on Fedora 27. It looks like the initial sync worked fine, but subsequent syncs with the same file cache generate this error.

One workaround might be to do a deepcopy of the output dictionary. My guess is that, since this is data backed up from a live server, some file was touched while the cached data was being verified. Copying the dictionary data should hopefully prevent this fs/iterator race condition.
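The deepcopy workaround described above could be sketched roughly as follows. This is a minimal illustration with made-up data, not s3cmd's actual cache structure: the idea is to iterate over a snapshot so that concurrent changes to the live dictionary can't invalidate the iterator mid-loop.

```python
import copy

# Hypothetical cache data; s3cmd's real HashCache layout differs.
live = {"f1": {"mtime": 100}, "f2": {"mtime": 200}}

# Snapshot the cache before verifying it.
snapshot = copy.deepcopy(live)

for name, meta in snapshot.items():
    # Simulate a file being touched on the live server during verification;
    # mutating `live` here is safe because we iterate over `snapshot`.
    live["f3"] = {"mtime": 300}

print(sorted(snapshot))  # the snapshot is unaffected by the live mutation
```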
Same error here, FC27 too. I'm backing up static files; nothing changed in the tree.
The workaround was to remove the file pointed to by --cache-file; the 2nd and 3rd syncs (after the cache is rebuilt) seem fine.
Didn't stay fixed. Guess s3cmd sync is a goner then, given that the aws cli has this feature now (minus the cache).
Same problem on macOS too.
It seems to me that the problem is a Python 3 behavior which is documented in this Stack Overflow post, along with the solution. It works reliably for me after changing HashCache.py's purge function so that, for example:
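The original comment's code snippet did not survive the page capture, but the Python 3 behavior it refers to is that deleting keys from a dict while iterating over it raises "RuntimeError: dictionary changed size during iteration". A minimal sketch of the fix pattern (with hypothetical data, not s3cmd's actual purge() code) is to snapshot the keys with tuple() before looping:

```python
# Hypothetical cache contents for illustration only.
cache = {"a.txt": 1, "b.txt": 2, "stale.txt": 3}

# Before (raises RuntimeError in Python 3, because the dict
# changes size while its iterator is live):
# for key in cache:
#     if key.startswith("stale"):
#         del cache[key]

# After: tuple() takes an immutable snapshot of the keys,
# so deleting from the dict inside the loop is safe.
for key in tuple(cache.keys()):
    if key.startswith("stale"):
        del cache[key]

print(sorted(cache))  # stale entry removed
```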
And likewise on each subsequent "for" loop in purge(). I would guess that the mark_all_for_purge function has the same issue. I tested on Python 3, but not on 2. I'll try to get a pull request in with the changes, and some more testing.
I also have this problem on Arch LTS. I've tried both the s3cmd in the Arch repos (version 2.0.2-2) and the latest version via pip (version 2.0.2). I have just switched from Debian Buster to Arch LTS (for other reasons), and s3cmd via pip was working just fine on Debian (if I recall, Debian supports both Python 2 and 3). Arch is currently at Python 3.7.4.
Doing the tuple() trick from NotTheDr01ds worked for me; I just made that change in HashCache.py where the backtrace showed it crashing. It's really gross, and pip will break it the next time I update, but it worked. I have 13,000+ files I want to sync, so it takes a few minutes if I can't use the cache file.
…with --cache-file (Issue s3tools#945)
Fix merged, thanks to @NotTheDr01ds!
Hello, I'm forwarding from bugs.debian.org/885914