Newbie Cronjob Question
Hello all,
I am trying to archive log files from my RabbitMQ server. The plan is to use logrotate to rotate the logs daily (or whenever they grow past 100M), compress them once they are five days old, and keep them in the same directory. I was able to set up logrotate without issue. However, when the plan progressed to sending those compressed archives to another server, I could not work out how to do the transfer from crontab.

Currently the cron job is scheduled for every Friday at 8PM and should move files older than 5 days out of the rabbitmq directory. My problem is that I don't know how to copy those files to the new server. I searched Google for ways to combine find and scp in a cron job, but without success. Here is the command I have so far:

0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5

I am not sure how to finish the command so that it scps those files to the other server. Any advice is appreciated. Thanks, |
Your best bet is to write a script with all of the necessary commands in it, and then run that from cron.
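For example, a minimal sketch of such a script (untested here; the directories are placeholders taken from this thread, and the remote copy is left as a comment since the destination is site-specific):

```shell
#!/bin/sh
# Sketch of a script cron could run. The paths below are placeholders
# from the thread -- adjust and test before using in production.

# Move compressed logs older than 5 days into an archive directory.
# "mv -t DIR" takes the target first, so find can append many files at once.
archive_old_logs() {
    logdir=$1
    archive=$2
    find "$logdir" -type f -name '*.gz' -mtime +5 -exec mv -t "$archive" {} +
}

# Example cron-driven invocation (uncomment and adapt):
#   archive_old_logs /var/log/rabbitmq /archive_log_file/my_other_directory
# A remote copy (scp/rsync) of the archive directory would follow here.
```

Then the crontab entry just becomes `0 20 * * 5 /usr/local/bin/archive_logs.sh` (or wherever you put the script).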
|
Is scp in the script the way to go? I'm trying to figure out the best approach and have also considered rsync.
I'm not sure if it matters, but I plan on doing something similar for other services such as Tomcat. Am I wrong in thinking that will require a script per application? |
Either scp or rsync would work fine; they both run over SSH, so just set up your SSH keys and you're golden.
Any 1-2 line job can go straight in the crontab; anything much longer and you're better off making a script. |
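The key setup is a one-time step, roughly like this (a sketch: the key filename and `archive@backuphost` are made-up placeholders, not names from this thread):

```shell
# One-time setup so the cron job can authenticate without a password.
# The key path and "archive@backuphost" below are placeholders.
mkdir -p "$HOME/.ssh"
ssh-keygen -t ed25519 -N '' -f "$HOME/.ssh/log_archive_key"

# Install the public key on the receiving server (run once, interactively):
#   ssh-copy-id -i "$HOME/.ssh/log_archive_key.pub" archive@backuphost
# Then point scp/rsync at that key, e.g.:
#   rsync -av -e "ssh -i $HOME/.ssh/log_archive_key" \
#       /var/log/rabbitmq/*.gz archive@backuphost:/logs/rabbitmq
```

Using a dedicated passwordless key for the cron job (rather than your personal key) keeps the automation separate and easy to revoke.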
Not tested; try it with dummy data before rolling it out to production.

0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5 | mv $(xargs) /archive_log_file/my_other_directory

Try it and let us know. Cheers! |
Isn't the $(xargs) run before the pipe is set up, so that mv never reads from the pipe?
Also, mv has a -t option, so it can be used with find -exec: Code:
0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5 -exec mv -t /archive_log_file/my_other_directory {} + |
There are a lot of ways to do a task. I actually did test this command (0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5 | mv $(xargs) /archive_log_file/my_other_directory) on a production server, and it works fine.
But of course, I don't want to be responsible for any mistakes if the command behaves differently in another environment. Test and test again before applying the command, to make sure it works as expected. |
Pardon, I thought that the shell substitutes the $(xargs) before it creates the pipe.
Now I have tested it, and I was wrong: it works, at least in principle. (What happens if there are too many arguments?) Still, for mv the -t option is preferable (it has no problems with too many arguments or with special characters in file names). |
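To make the special-character point concrete, here is a throwaway demo (temporary files only, nothing from the thread's real paths):

```shell
# Demo: a filename containing a space survives find -exec,
# whereas word-splitting approaches would break it apart.
tmp=$(mktemp -d)
mkdir "$tmp/src" "$tmp/dst"
touch "$tmp/src/a b.log"

# A naive pipe would split "a b.log" into two words before mv sees it:
#   find "$tmp/src" -type f | mv $(xargs) "$tmp/dst"    # unreliable

# -exec passes each name as one argument, whatever characters it contains:
find "$tmp/src" -type f -exec mv -t "$tmp/dst" {} +
ls "$tmp/dst"    # a b.log
```

With `{} +`, find also batches the arguments itself and starts additional mv processes when the argument list would get too long, so the "too many arguments" worry goes away as well. |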
Quote:
I have logrotate set up to rotate the file daily and compress (.gz) files older than 5 days. Below is the cron job I plan on testing to move the compressed files to a different server for analysis at a later time:

0 20 * * 5 rsync -av /var/log/rabbitmq/*.gz <remote_server>:/logs/rabbitmq

I also plan on having a cron job on the same machine to delete the compressed log files once they have been rsync'd to the remote server:

0 20 * * 6 find /var/log/rabbitmq/ -type f -mtime +7 -name "*.gz" -delete

Thoughts? |
You can solve part of the problem with a single cron job, so the delete only runs if the rsync succeeded:
Code:
0 20 * * 5 rsync -av /var/log/rabbitmq/*.gz <remote_server>:/logs/rabbitmq && find /var/log/rabbitmq/ -type f -mtime +7 -name "*.gz" -delete |
Perhaps your rsync has the wonderful --remove-source-files option; then one command solves the whole problem:
Code:
0 20 * * 5 rsync --remove-source-files -av /var/log/rabbitmq/*.gz <remote_server>:/logs/rabbitmq |
... and if you really want to archive these log files, you should rename them by adding a timestamp or similar to the filename. Otherwise you will overwrite your archived log files with the new version every day.
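A minimal sketch of that renaming step (the directory would be the thread's /var/log/rabbitmq; since the rotation date isn't recorded in the name, the current date is used as the stamp):

```shell
# Sketch: stamp each compressed log with today's date before shipping,
# so repeated uploads don't overwrite one another. Untested advice --
# try it on copies first.
stamp_gz_files() {
    dir=$1
    stamp=$(date +%Y%m%d)
    for f in "$dir"/*.gz; do
        [ -e "$f" ] || continue            # skip when the glob matches nothing
        mv "$f" "${f%.gz}.$stamp.gz"       # app.log.gz -> app.log.20240101.gz
    done
}

# e.g. stamp_gz_files /var/log/rabbitmq  (before the rsync step)
```

(logrotate can also do this for you at rotation time via its dateext option.)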
|