Backing up a file based on its timestamp?
A piece of software generates daily backups named after the date they were generated (e.g. daily-backup-2007_07_31.zip).
What I need to do is upload the latest backup to a remote server, so first I need to get the filename of the newest backup. My idea is to get the value from an "ls -lht" parsed with awk/sed. Is there a more elegant solution? TIA.
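Something like this is what I have in mind (the directory is just an example, and it breaks on filenames with spaces, which these don't have):
Code:
ls -lht /path/to/backups | awk 'NR==2 {print $NF}'
|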
Assuming there's only one backup per day, you could use find to search for files modified within the last 24 hours and use the -exec flag to copy the file over. It should be a one-liner...
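For example (the backup directory and remote destination here are made up, and this assumes scp for the transfer):
Code:
find /var/backups -name 'daily-backup-*.zip' -mtime -1 -exec scp {} user@remote:/backups/ \;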
|
I'm sure this is no less elegant... but why not:
Code:
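# loop over the listing oldest-to-newest; $i is left holding the newest name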
for i in $(ls -tr); do sleep 0; done; cp $i /new/location; |
ls -t|head -1| do_file_txfr
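If do_file_txfr is a stand-in for something like scp, command substitution does the same job (paths and remote host made up):
Code:
scp "$(ls -t /var/backups/daily-backup-*.zip | head -n 1)" user@remote:/backups/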
BTW, sleep 0 still starts a sleep process for each file it finds; that's (potentially) a fair bit of overhead, and it's unnecessary. |
If you have to do this regularly (e.g. on a daily basis, as the backups are being created), I suggest building the filename you expect to find and looking for it on the local server. For example, if you want to upload the backup created the day before the current one, you can easily do
Code:
filename=daily-backup-$(date -d yesterday +%Y_%m_%d).zip
That way you can also:
- retrieve the backup of the previous day if the last one has not been created yet
- send an e-mail notification if something goes wrong
- and so on...
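A minimal sketch putting that together (the directory, remote destination, and notification address are all made up; date -d is GNU date, as above):
Code:
#!/bin/sh
backup_dir=/var/backups       # hypothetical location
remote=user@remote:/backups/  # hypothetical destination

# expect yesterday's backup first
filename=$backup_dir/daily-backup-$(date -d yesterday +%Y_%m_%d).zip

# fall back one more day if it has not been created yet
if [ ! -f "$filename" ]; then
    filename=$backup_dir/daily-backup-$(date -d '2 days ago' +%Y_%m_%d).zip
fi

if [ -f "$filename" ]; then
    scp "$filename" "$remote"
else
    echo "no recent backup found in $backup_dir" \
        | mail -s "backup upload failed" admin@example.com
fi
|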
Guys, thanks for your replies.
Actually, I've ended up with
Code:
ls -lhtr /var/opt/confluence/backups | tail -n -1 | cut -d" " -f8
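One caveat: cut -d" " on ls -l output is a bit fragile, since the columns are padded with variable runs of spaces. Skipping the long listing avoids the parsing entirely:
Code:
ls -tr /var/opt/confluence/backups | tail -n 1
|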