Conversation

@877dev 877dev commented Nov 10, 2019

Hi,

Hopefully this is of some use, if you have a better solution I would be interested.

I have done the following:

  • Appended a YYYY-MM-DD date stamp to each backup file name, so the files sort in chronological order.
  • The script now deletes all backup files except the five most recent.

To do: Dropbox currently keeps a copy of every backup; look at how to remove the older files there too, if possible.

@gcgarner
Owner

Thanks, great work. I've pulled your modifications and started testing on my side.
I've just done a bit of tidying up: set the file name to a variable and added a logfile.

Now I'm just looking at how to remove the old backups from the cloud.

@877dev
Contributor Author

877dev commented Nov 11, 2019

Thank you, I know it's not much but I learned a few things.

I realised since then that if the date and time were assigned to a variable it would be simpler, and the script could write several backups on the same day should that be required:
DATE=$(date +"%Y-%m-%d_%H%M")
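For example, the variable could feed the file name like this (a minimal sketch; "backupfile" is just an illustrative name, not part of the original script):

```shell
#!/bin/bash
# One DATE variable feeds the backup file name, so two runs on the
# same day produce distinct files.
DATE=$(date +"%Y-%m-%d_%H%M")
backupfile="backup-$DATE.tar.gz"
echo "$backupfile"   # e.g. backup-2019-11-11_0930.tar.gz
```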

There's a page I was looking at for managing backups on the cloud side; I'm still trying to understand it all, but it might be of use:
LINK UPDATED

Thanks :)

Added a DATE variable, which allows timestamps of the form "YYYY-MM-DD_HHMM".
This means multiple backups per day can be stored.
@gcgarner
Owner

I went one step further and did this

logfile=./backups/log.txt
backupfile="backup-$(date +"%Y-%m-%d").tar.gz"

and used it like this

sudo tar -czf \
        "./backups/$backupfile" \
...

Can you just add a log file, something along these lines (just with your variable)?

echo "backup saved to ./backups/$backupfile"
touch "$logfile"
echo "$backupfile" >>"$logfile"
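Put together, the pieces above might look like this (a sketch; it assumes ./backups already exists and is writable by the current user):

```shell
#!/bin/bash
# Sketch combining the snippets above: a dated backup file name plus a
# running log of every backup written.
logfile=./backups/log.txt
backupfile="backup-$(date +"%Y-%m-%d").tar.gz"

echo "backup saved to ./backups/$backupfile"
touch "$logfile"
echo "$backupfile" >>"$logfile"
```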

Side note: if you use VS Code for shell scripts, there's a really nice add-on for formatting shell scripts (it's based on shfmt, written in Go; I can send you the details).

Added $logfile and $backupfile variables as recommended.
The script now adds the latest backup filename to backups/log.txt.
@877dev
Contributor Author

877dev commented Nov 11, 2019

Hi there, thank you for the suggestions, I have committed my latest changes. I hope this process is not too slow for you, but I am enjoying the challenge :)

I could not get echo $backupfile >> $logfile to work for me: permission denied.
So I googled and settled on echo $backupfile | sudo tee -a $logfile, which works fine in my limited tests. It adds the new backup filename to log.txt each time.

Which raises a question I've had for a few days: why is the backups folder owned by root? I'm having to use sudo in the script a few times, and I'm not sure if that is bad.

I'm not sure about the next step; I guess you are thinking of using the log.txt file with some command to delete the older files, then mimicking that command for the Dropbox/Google Drive uploads.
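A log-driven trim might look roughly like this (purely a hypothetical sketch; it only prints candidates, since which cloud command would delete the remote copies is still the open question):

```shell
#!/bin/bash
# Treat log.txt as an oldest-first history and print every entry except
# the five newest -- the candidates to delete locally and to mirror on
# the Dropbox/Google Drive side.
logfile=./backups/log.txt
head -n -5 "$logfile" | while read -r old; do
    echo "would delete: $old"
done
```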

PS - I've only been using the Atom editor until now; I just installed VS Code, so yes, anything useful you have would be appreciated.

Thanks!!

@gcgarner
Owner

The issue with the rights on the backup folder is related to the contents of the volumes folder. Docker creates some of them with root privileges, so the tar command needs to be run with sudo, which in turn creates the tar.gz as root.

"Fortunately" Raspbian has been configured not to ask for a password for sudo... that alone gives me the creeps, but it does allow you to execute scripts like this.

@gcgarner
Copy link
Owner

I think the reason you had issues with the logfile was that you had somehow created it as root. I've modified the script slightly to remove the additional sudos by doing a chown on backup*.
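The ownership fix might look something like this (an assumed form of the change, not the exact commit): after the root-run tar, hand the backup files back to the invoking user so later appends to log.txt need no sudo.

```shell
#!/bin/bash
# Assumed sketch: give the backup files back to the current user
# (trailing colon means "that user's login group" in GNU chown).
sudo chown "$(id -un):" ./backups/backup*
```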

Great work, now just to get the cloud backups trimmed

@gcgarner gcgarner merged commit ea18e3b into gcgarner:master Nov 11, 2019
Willem-Dekker pushed a commit to robertcsakany/IOTstack that referenced this pull request Jun 14, 2020
Fixed bug where yaml merging was overwriting deconz services