Support lock-wait with --lock-retry in restic 0.16 #240

Merged

Conversation

@jkellerer (Collaborator) commented on Aug 5, 2023:

This PR leverages the newly added restic 0.16 flag --lock-retry to offload some of the remote lock retry logic to restic (in sequences of up to 10 minutes).

This should lower resource usage (no startup overhead) and allow faster retry cycles (managed by restic itself).

I also relaxed the stale lock age (the time a remote lock must have gone without a refresh before resticprofile attempts to unlock it). Restic itself treats a lock as stale after roughly 10 minutes (possibly less when the lock was created on the same host); resticprofile's new minimum is 15m, with a default of 1h, before it unlocks automatically.
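
For context, the automatic unlock path is only taken when force-inactive-lock is enabled in the profile (the full configuration used for testing appears later in this thread). A minimal sketch combining that setting with the --lock-wait budget from the example below; the profile name is illustrative:

    profile:
      repository: test-repo
      password-file: test-repo.key
      force-inactive-lock: true   # allow resticprofile to remove locks it considers stale

    # give the whole run a lock-wait budget on the command line:
    # resticprofile --lock-wait=10m profile.check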

Example:

➜ resticprofile --lock-wait=10m test-backup.check --read-data
2023/08/06 17:32:38 using configuration file: profiles.yaml
2023/08/06 17:32:38 profile 'test-backup': initializing repository (if not existing)
2023/08/06 17:32:39 profile 'test-backup': starting 'check'
using temporary cache in /tmp/restic-check-cache-1566484652
repository bb429813 opened (version 2, compression level auto)
created new cache in /tmp/restic-check-cache-1566484652
create exclusive lock for repository
repo already locked, waiting up to 8m0s for the lock
unable to create lock in backend: repository is already locked exclusively by PID 46959 on MYHOST by MYUSER (UID 502, GID 20)
lock was created at 2023-08-06 16:14:23 (1h26m16.339556s ago)
storage ID aeccdab0
the `unlock` command can be used to remove stale locks
2023/08/06 17:40:40 lock wait (remaining 1m59s / waited 8m1s / elapsed 8m1s): /resticprofile/test-repo locked by PID 46959 on MYHOST by MYUSER (UID 502, GID 20)
using temporary cache in /tmp/restic-check-cache-428519430
repository bb429813 opened (version 2, compression level auto)
created new cache in /tmp/restic-check-cache-428519430
create exclusive lock for repository
repo already locked, waiting up to 0s for the lock
unable to create lock in backend: repository is already locked exclusively by PID 46959 on MYHOST by MYUSER (UID 502, GID 20)
lock was created at 2023-08-06 16:14:23 (1h27m17.241516s ago)
storage ID aeccdab0
the `unlock` command can be used to remove stale locks
2023/08/06 17:41:41 restic: possible stale lock detected (lock age 1h27m17.242s >= 1h27m0s). Trying to unlock.
2023/08/06 17:41:41 profile 'test-backup': unlock stale locks
repository bb429813 opened (version 2, compression level auto)
successfully removed 1 locks
using temporary cache in /tmp/restic-check-cache-113952085
repository bb429813 opened (version 2, compression level auto)
created new cache in /tmp/restic-check-cache-113952085
create exclusive lock for repository
load indexes
...

@codecov (bot) commented on Aug 5, 2023:

Codecov Report

Patch coverage: 72.73% and project coverage change: +0.07% 🎉

Comparison is base (6cc332d) 77.39% compared to head (1e96861) 77.46%.
Report is 3 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master     #240      +/-   ##
==========================================
+ Coverage   77.39%   77.46%   +0.07%     
==========================================
  Files          93       93              
  Lines       10034    10120      +86     
==========================================
+ Hits         7765     7839      +74     
- Misses       2004     2015      +11     
- Partials      265      266       +1     
Flag      | Coverage Δ
unittests | 77.46% <72.73%> (+0.07%) ⬆️

Flags with carried forward coverage won't be shown.

Files Changed    | Coverage Δ
config/global.go | 86.36% <ø> (ø)
wrapper.go       | 85.01% <70.83%> (-0.49%) ⬇️
shell/analyser.go | 92.48% <77.78%> (-2.60%) ⬇️

... and 2 files with indirect coverage changes


jkellerer force-pushed the ft-use-lock-retry-in-restic16 branch from 4146b08 to bec6b32 on August 5, 2023 at 20:07
Commit message: "Also supporting custom overridden 'lock-retry' values by getting the time restic waited on the lock from restic itself."
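
In other words, resticprofile recovers how long restic already waited from restic's own output (the "waiting up to …" lines in the example above) and can deduct that from its remaining lock-wait budget. A minimal Go sketch of that parsing idea, assuming the line format shown above; the names are illustrative, not the actual shell/analyser.go implementation:

    package main

    import (
        "fmt"
        "regexp"
        "time"
    )

    // Matches restic's status line, e.g.
    // "repo already locked, waiting up to 8m0s for the lock".
    var lockWaitPattern = regexp.MustCompile(`waiting up to ([0-9hms.]+) for the lock`)

    // parseLockWait extracts how long restic itself waited on the
    // repository lock, so the caller can subtract it from its own
    // lock-wait budget. Returns false when the line doesn't match.
    func parseLockWait(line string) (time.Duration, bool) {
        m := lockWaitPattern.FindStringSubmatch(line)
        if m == nil {
            return 0, false
        }
        d, err := time.ParseDuration(m[1])
        if err != nil {
            return 0, false
        }
        return d, true
    }

    func main() {
        line := "repo already locked, waiting up to 8m0s for the lock"
        if d, ok := parseLockWait(line); ok {
            fmt.Println("restic already waited up to", d) // prints 8m0s
        }
    }
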
jkellerer force-pushed the ft-use-lock-retry-in-restic16 branch from bec6b32 to ebb5df9 on August 6, 2023 at 13:54
jkellerer marked this pull request as ready for review on August 6, 2023 at 14:54
jkellerer force-pushed the ft-use-lock-retry-in-restic16 branch from 02ba3f3 to b4086c8 on August 6, 2023 at 15:07
@creativeprojects (Owner) commented:
What is your configuration to activate the --lock-retry?

My brain is a bit foggy because of a cold, but still, I haven't managed to generate the --lock-retry flag.

@jkellerer (Collaborator, Author) replied:
Get well soon!

My config was nothing more than this:

  profile:
    initialize: true
    repository: test-repo
    password-file: test-repo.key
    force-inactive-lock: true
➜ resticprofile --lock-wait=10m profile.check --read-data

(... and restic 0.16 in the path)

(Review thread on wrapper.go: outdated, resolved.)
@creativeprojects (Owner) left a review:
Works all fine now 👍🏻

I agree the arguments layer is getting messy and needs a bit of refactoring

Thanks 😉

creativeprojects added this to the v0.23.0 milestone on Aug 9, 2023
creativeprojects added the 'enhancement' (New feature or request) label on Aug 9, 2023
jkellerer merged commit 9c05157 into creativeprojects:master on Aug 9, 2023
jkellerer deleted the ft-use-lock-retry-in-restic16 branch on August 9, 2023 at 15:39