Every day I need to find files generated for the previous day in three different directories, copy them over to a staging directory, then bzip2 them.
Afterwards, I scp all those bzip2'ed files onto another machine.
I do this process at 5 minutes past midnight every weekday, and it's getting tedious doing it by hand for 21 machines. I'm very new to scripting (read: I've only looked at other people's scripts), so any help on how I'd go about scripting all of this would be very helpful.
I know that after it's done, adding it to the crontab file will be a breeze.
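Something like this is what I have in mind for the crontab entry (the script name and path are just placeholders):

# run at five past midnight, Monday to Friday
5 0 * * 1-5 /home/om01/bin/nightly_logs.sh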
Here's basically the process that I use:
Quote:
[achi@air3 home]# ssh om01@om01
om01@om01's password:
[om01@om01 om01]$ find audit/*06-07.log -exec cp {} /home/om01/staging/ \;
[om01@om01 om01]$ find state/*060704.log -exec cp {} /home/om01/staging/ \;
[om01@om01 om01]$ find *0607* -exec cp {} /home/om01/staging/ \;
[om01@om01 om01]$ find *06-07* -exec cp {} /home/om01/staging/ \;
[om01@om01 om01]$ cp om01/state/*.ser /home/om01/staging
[om01@om01 om01]$ cp om01/state/positions /home/te07/staging
[om01@om01 om01]$ cd om01/staging
[om01@om01 staging]$ pwd
/home/om01/staging
[om01@om01 staging]$ bzip2 *.log
[om01@om01 staging]$ scp *.bz2 chronos@beast:/u1/logfiles/2004/06-jun/7/om01/
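Just to show where I'm trying to get to, here's a very rough sketch of how I imagine the script on each machine could look. I haven't tested it; it assumes GNU date (for "-d yesterday"), the paths and filename formats are just my best guess from what I type by hand, and the scp at the end would need ssh keys set up so it doesn't stop and ask for a password (right now I type it in, as above).

#!/bin/sh
# Rough sketch: gather yesterday's files into staging, bzip2 them, ship to beast.
# Assumes GNU date and that the layout matches my manual steps above.

STAGING=/home/om01/staging

# Yesterday's date in the formats the filenames seem to use
DASHED=`date -d yesterday +%m-%d`       # e.g. 06-07
MMDD=`date -d yesterday +%m%d`          # e.g. 0607
MMDDYY=`date -d yesterday +%m%d%y`      # e.g. 060704

# Pieces of the destination path on beast, e.g. 2004/06-jun/7
YEAR=`date -d yesterday +%Y`
MONTH=`date -d yesterday +%m-%b | tr A-Z a-z`   # e.g. 06-jun
DAY=`date -d yesterday +%-d`                    # e.g. 7 (GNU date, no leading zero)

cd /home/om01 || exit 1

# Same copies I do by hand; plain cp is enough since the shell expands the globs
# (some of these patterns may not match anything on a given day)
cp audit/*"$DASHED".log   "$STAGING"/
cp state/*"$MMDDYY".log   "$STAGING"/
cp *"$MMDD"*              "$STAGING"/
cp *"$DASHED"*            "$STAGING"/
cp state/*.ser            "$STAGING"/
cp state/positions        "$STAGING"/

cd "$STAGING" || exit 1
bzip2 *.log

# Would need ssh keys so scp doesn't prompt for a password
scp *.bz2 chronos@beast:/u1/logfiles/"$YEAR"/"$MONTH"/"$DAY"/om01/

For the 21 machines I'm guessing I'd either drop a copy of this on each one and cron it there, or loop over them with ssh from one central box, but I haven't got that far yet.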
I also tried using -mtime to grab the previous day's files, like so:
[om01@om01 home]$ find om01/audit -name *.log -mtime +1 -exec cp {} /home/om01/staging/ \;
I can't figure out what I'm doing wrong; it doesn't give me files for the right dates. I also tried "-mtime -1", and that comes back two days behind.
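From what I can make of the man page, -mtime counts in 24-hour blocks from the time find runs rather than by calendar day, which would explain the odd results. If I'm reading the GNU find docs right, adding -daystart makes it count from the start of the day instead, so something like this ought to pick up yesterday's logs (untested):

find om01/audit -daystart -mtime 1 -name '*.log' -exec cp {} /home/om01/staging/ \;

(I've also quoted the '*.log' so the shell doesn't expand it before find gets to see it.)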
Anyway, thanks to anyone who read this far for bearing with me! I appreciate any help.