[FR] Add a check for max_num
#29
Comments
> There is currently no way to clean it up (other than manually)
Thanks, so at least I have not overlooked it then. The relevant code would be in `_update_jsons`:

```python
max_num = self._config.max_num
if track.max_num is not None:
    max_num = track.max_num

if len(update_json.versions) > max_num:
    # over the limit: drop the oldest entry and delete its zip(s)
    old_item = update_json.versions.pop(0)
    for path in module_folder.glob(f"*{old_item.versionCode}*"):
        if not path.is_file():
            continue
        self._log.d(f"_update_jsons: [{track.id}] -> remove {path.name}")
        path.unlink()

track.last_update = timestamp
track.versions = len(update_json.versions)

update_json.write(update_json_file)
track.write(track_json_file)
```

I see you already marked it "not planned", unfortunately; but this could probably be addressed by moving this part to its own …
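A minimal sketch of what such a standalone pruning helper could look like. The function name, the `log` parameter, and the assumption that `versions` is ordered oldest-first with a `versionCode` attribute mirror the snippet above or are invented for illustration; this is not the project's actual API:

```python
from pathlib import Path


def prune_old_versions(module_folder: Path, versions: list, max_num: int, log=print) -> list:
    """Drop the oldest entries until at most max_num remain, deleting any
    zip in module_folder whose file name contains the removed versionCode."""
    versions = list(versions)  # work on a copy; assumed oldest entry first, as in the snippet above
    while len(versions) > max_num:
        old_item = versions.pop(0)
        for path in module_folder.glob(f"*{old_item.versionCode}*"):
            if not path.is_file():
                continue
            log(f"prune_old_versions: remove {path.name}")
            path.unlink()
    return versions
```

Calling something like this from `sync` on every run would also prune when `max_num` is lowered, without having to wait for a new upstream release.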
I will add this to the …
`max_num`
Thanks! Tested, works as expected 🤩
After having run `cli.py sync` multiple times now, there are still 3 zips in this module's directory. Maybe older versions only get removed once an update was fetched (I will have to wait for that), but I'd expect `sync` to take care of that when the value has been changed (especially when `versions > max_num`). If you consider that "works as designed", what would be the proper way to clean up (apart from manually changing the files, with the risk of "messing up")? Sometimes "waiting for a new update" might be in vain, or take half a year or more 😉
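For the manual route, a rough sketch of how orphaned zips could be cleaned up, assuming the module folder holds an `update.json` with a `versions` list whose entries carry a `versionCode`, and that zip file names contain that code. This layout is inferred from the snippet above, not a documented format, so treat it as an assumption:

```python
import json
from pathlib import Path


def remove_unlisted_zips(module_folder: Path, keep: int) -> None:
    """Delete zips whose versionCode is not among the `keep` newest
    entries listed in the module's update.json (assumed layout)."""
    data = json.loads((module_folder / "update.json").read_text())
    versions = data.get("versions", [])  # assumed: a list, oldest entry first
    kept_codes = {str(v["versionCode"]) for v in versions[-keep:]}
    for path in module_folder.glob("*.zip"):
        if not any(code in path.name for code in kept_codes):
            print(f"remove {path.name}")
            path.unlink()


# hypothetical usage:
# remove_unlisted_zips(Path("modules/some_module"), keep=2)
```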