Appended date to backup file and keeps only the last five backups #62
Conversation
Appended date stamp to backup file, YYYY-MM-DD so they appear in correct order. Delete all backup files except the five most recent. To do: Dropbox currently gets a copy of all files, look at how to remove older files if possible.
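The approach described above can be sketched as a small shell function (the function name, directory layout, and the `.` archive target are illustrative assumptions, not the actual script from this PR):

```shell
#!/bin/bash
# Hedged sketch of the PR's approach: date-stamped archive names sort
# chronologically, and everything but the five newest backups is deleted.

backup_and_prune() {
    local srcdir="$1" backupdir="$2"
    mkdir -p "$backupdir"
    # Date-stamped name (YYYY-MM-DD) so files appear in correct order.
    tar -czf "$backupdir/backup-$(date +%Y-%m-%d).tar.gz" -C "$srcdir" .
    # Keep only the five most recent backups, delete the rest.
    ls -1t "$backupdir"/backup-*.tar.gz | tail -n +6 | xargs -r rm --
}
```

`ls -1t` lists newest first, so `tail -n +6` selects everything past the fifth entry for deletion; `xargs -r` avoids running `rm` when nothing matches.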
Thanks, great work. I've pulled your modifications and started testing on my side. Now just looking at how to remove the old backups from the cloud.
Thank you, I know it's not much but I learned a few things. I've realised since that if the date & time were assigned to a variable it would be simpler, and the script would be able to write numerous backups on the same day should that be required. There's a page here I was looking at for managing backups on the cloud side; I'm still trying to understand it all, but it might be of use. Thanks :)
Added a DATE variable, which allows a timestamp of "YYYY-MM-DD-HHMM". This means multiple backups per day can be stored.
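A minimal sketch of the DATE variable idea; the exact format string (year-month-day-hour-minute) is an assumption, chosen so several backups taken on the same day still sort chronologically by name:

```shell
#!/bin/bash
# Sketch: a single DATE variable used to build the backup filename.
# The format string is an assumption, not necessarily the PR's exact one.
DATE=$(date +"%Y-%m-%d-%H%M")
backupfile="backup-$DATE.tar.gz"
echo "$backupfile"
```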
I went one step further and did this
and used it like this
Can you just add a log file, something along these lines (just with your variable)?
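The snippet that followed this comment didn't survive the page capture; a plausible shape, assuming the $logfile and $backupfile variable names adopted later in the thread and an illustrative backups/ path, is:

```shell
#!/bin/bash
# Hypothetical reconstruction of the suggested logging step; variable
# names match the thread, but the paths are assumptions.
logfile="backups/log.txt"
backupfile="backup-$(date +"%Y-%m-%d-%H%M").tar.gz"
mkdir -p "$(dirname "$logfile")"
# Record each backup's filename so old ones can be found and pruned later.
echo "$backupfile" >> "$logfile"
```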
Side note: if you use VS Code for shell scripts, there is a really nice add-on for formatting shell scripts (it's based on shfmt, written in Go; I can send you the details).
Added $logfile and $backupfile variables as recommended. The script now adds the latest backup filename to backups/log.txt.
Hi there, thank you for the suggestions, I have committed my latest changes. I hope this process is not too slow for you, but I am enjoying the challenge :) I could not get … Which raises a question I've had for a few days: why is the backups folder 'root' access? I am having to use sudo in the script a few times, and I'm not sure if that is bad. Not sure about the next step; I guess you are thinking about using the log.txt file with some command to delete the remaining files, then mimic this command for the Dropbox/Google Drive uploads. PS - I've only been using the Atom editor until now; I just installed VS Code, so yes, anything useful you have would be appreciated. Thanks!!
The issue with the rights on the backup folder is related to the contents of the volumes folder. Docker creates some of them with root privileges, and therefore the tar command needs to be run with sudo, which in turn creates the tar.gz as root. "Fortunately" Raspbian has been configured not to ask for a password for sudo ... that alone gives me the creeps, but it does allow you to execute scripts like this.
I think the reason you had issues with the logfile was that you had somehow created it as root. I've modified the script slightly to remove the additional sudos by doing a chown on backup*. Great work, now just to get the cloud backups trimmed.
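The chown fix described here can be sketched like this; the run_backup helper and the paths are illustrative names for the sake of a testable example, not the actual commit:

```shell
#!/bin/bash
# Sketch of the ownership fix: tar runs under sudo because Docker creates
# some volume contents as root, then a single chown hands the archives
# back to the invoking user so later steps (logging, pruning) need no sudo.

run_backup() {
    local sudo_cmd="$1"   # "sudo" on the Pi; pass "" in a throwaway test dir
    mkdir -p backups
    $sudo_cmd tar -czf "backups/backup-$(date +%Y-%m-%d-%H%M).tar.gz" volumes
    # Hand ownership back to the invoking user.
    $sudo_cmd chown "$(id -un)" backups/backup-*
}
```

On the Pi this would be called as `run_backup sudo`; because the chown happens once, right after the archive is created, the rest of the script can touch the backups folder as a normal user.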
Fixed bug where yaml merging was overwriting deconz services
Hi,
Hopefully this is of some use; if you have a better solution I would be interested.
I have done the following:
To do: Dropbox currently gets a copy of all files, look at how to remove older files if possible.