moving a file sequentially to different machines nightly
I'm trying to think of a way to rsync one file each day to a different machine, and before I start I was wondering what some of you might do or suggest.
I have a Linux cluster with 43 nodes in it. About 2 weeks ago the hard drive on the head node broke beyond any rescue attempt. I did have a backup from a few weeks earlier, so I was okay, but I figure there must be a better way to keep a current backup with the resources I already have.
What I'd like to do is take a nightly snapshot of my /home and *.conf files, compress it into one file, and move it to a different node each night, starting with node1, ending with node43, and then starting over again continually. That would give me a 43-day window, which would be perfect.
I have a script that already runs weekly, but I'm not sure how to tell the job to use the next node each day (i.e., node1, node2, node3, etc.). How would some of you suggest I do this?
well if you do want 43 copies, then that's not going to fit over anything nicely, so i'd say it's easiest to keep track of the value in a config file: just read the file at the start, and write the contents plus one back to it afterwards... What i'd lean towards doing instead though is using days of the month, and leaving 12 of those nodes out of the picture. that way you just automatically send the backup to host$(date +%d) or something.
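The counter-file idea above could be sketched roughly like this. The counter file path, the `next_node` helper name, and the node count parameter are all illustrative, not anything from the thread:

```shell
#!/bin/sh
# Sketch: pick the next backup node in a node1..node43 rotation,
# persisting state in a small counter file between nightly runs.
# COUNTER_FILE path and hostname pattern are assumptions.

next_node() {
    counter_file=$1
    nodes=${2:-43}

    # Read the last-used node number; default to 0 on the first run.
    last=$(cat "$counter_file" 2>/dev/null || echo 0)

    # Advance by one, wrapping node43 back around to node1.
    next=$(( last % nodes + 1 ))

    # Write the new value back for tomorrow's cron run.
    echo "$next" > "$counter_file"

    echo "node$next"
}

# A nightly cron job might then do something like:
#   target=$(next_node /var/local/backup-node.counter)
#   rsync -az /backups/snapshot.tar.gz "$target":/backups/
```

Calling it each night yields node1, node2, ... node43, node1, and so on.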
Thanks for the advice, Acid. I don't mind having 43 copies of my backup, because each node has 350 GB and my script clears out the directories on the remote node before copying in another job.
So are you kind of hinting at maybe making a daemon or something that reads/writes to a file to keep track of where it's at? Thanks for the quick response!
i don't mean that it's too many copies, just that if you use the day of the month, you'll have really simple access to a number between 1 and 31 which changes daily. it does the "hard" stuff for you... you have a cron job already that runs every day, i assume? just tag in the right date command and instantly it goes to a different machine every day of the month.
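The day-of-month trick could look roughly like this. The `target_for_day` helper, hostname pattern, and paths are illustrative assumptions; note that `date +%d` is zero-padded, so the leading zero has to be stripped to get node7 rather than node07:

```shell
#!/bin/sh
# Sketch: map the day of the month (01..31) to a backup host node1..node31.
# Hostname pattern and the rsync paths below are assumptions.

target_for_day() {
    # $1 is a zero-padded day from `date +%d`, e.g. "07"
    echo "node${1#0}"   # ${1#0} strips one leading zero: 07 -> 7
}

# A nightly crontab entry might be:
#   0 2 * * *  /usr/local/bin/nightly-backup.sh
target=$(target_for_day "$(date +%d)")
# rsync -az /backups/snapshot.tar.gz "$target":/backups/
```

This gives a rolling window of up to 31 copies with no state to track, at the cost of leaving 12 of the 43 nodes out of the rotation.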