LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   Newbie Cronjob Question (https://www.linuxquestions.org/questions/linux-newbie-8/newbie-cronjob-question-4175576752/)

blktux 04-05-2016 02:02 PM

Newbie Cronjob Question
 
Hello all,

I am trying to archive log files from my RabbitMQ server. The current plan is to use logrotate to rotate the logs daily (or whenever they grow over 100M), then compress them once they are five days old, keeping them in the same directory.

I was able to set up logrotate without issue. However, as the plan progressed to sending those compressed log archives to another server, I could not find out how to use crontab to transfer the files. Currently the cron job is scheduled for every Friday at 8 PM and is supposed to move files older than 5 days out of the rabbitmq directory. My problem is that I don't know how to copy those files to the new server. I have searched Google for ways to combine the find command and scp in a cron job, but without success.

Here is the command I have so far:
0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5

I am not sure how to finish the command to scp those files to the other server.

Any advice is appreciated.

Thanks,

suicidaleggroll 04-05-2016 02:21 PM

Your best bet is to write a script with all of the necessary commands in it, and then run that from cron.
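
Something like this, for example (an untested sketch; the paths, remote host, and schedule are placeholders to adapt):

Code:

#!/bin/sh
# archive_logs.sh - hypothetical sketch: copy rabbitmq logs older than
# 5 days to another host, then extend it with whatever else the job needs
SRC=/var/log/rabbitmq
DEST=user@remote_server:/logs/rabbitmq

# ssh keys are needed so scp can run unattended from cron
find "$SRC" -type f -mtime +5 -exec scp {} "$DEST" \;

and then in the crontab:

Code:

0 20 * * 5 /usr/local/bin/archive_logs.sh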

blktux 04-05-2016 02:34 PM

Is scp in the script the way to go? I'm trying to figure out how to do that best and have considered rsync.

I'm not sure if it matters but I plan on doing something similar for other services such as tomcat and other applications. Am I wrong in thinking that will require a script per application?

suicidaleggroll 04-05-2016 02:43 PM

Either scp or rsync would work fine; they both run over ssh, so just set up your ssh keys and you're golden.

Any 1-2 line job can go straight in the crontab; much more than that and you're better off writing a script.
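
For example, key-based login is usually set up along these lines (user and host are placeholders):

Code:

ssh-keygen -t ed25519            # accept the defaults; empty passphrase for unattended cron use
ssh-copy-id user@remote_server   # installs the public key on the remote side
ssh user@remote_server true      # verify it now logs in without a password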

JJJCR 04-05-2016 10:58 PM

Not tested; before rolling out to production, test with dummy data.

0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5 | mv $(xargs) /archive_log_file/my_other_directory

Try it and let us know. Cheers!!!

MadeInGermany 04-06-2016 01:21 AM

Isn't the $(xargs) run before the pipe is set up? And mv doesn't read from the pipe.
Also, mv has a -t option, so it can be used with find -exec:
Code:

0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5 -exec mv -t /archive_log_file/my_other_directory {} +
But, wasn't the question about scp?

JJJCR 04-06-2016 04:11 AM

There are a lot of ways to do a task. I actually did test this command (0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5 | mv $(xargs) /archive_log_file/my_other_directory) on a production server, and it works fine.

But of course, I don't want to be responsible for any mistakes if the command behaves differently in another environment.

Test and test before applying the command to make sure it works as expected.

MadeInGermany 04-06-2016 05:41 AM

Pardon, I thought that the shell substitutes the $(xargs) before it creates the pipe.
Now I have tested it - and I was wrong: it works, at least in principle. (What happens if there are too many arguments?)

Still, for mv the -t option is preferable (e.g. there are no problems with too many arguments or with special characters in file names).
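
A quick hypothetical demonstration with a file name containing a space:

Code:

touch -d '10 days ago' '/var/log/rabbitmq/a b.log'
# $(xargs) reads the pipe, but word splitting breaks the name into two arguments, so mv fails:
find /var/log/rabbitmq/ -type f -mtime +5 | mv $(xargs) /archive_log_file/my_other_directory
# -exec hands each name over as a single argument and batches long lists itself:
find /var/log/rabbitmq/ -type f -mtime +5 -exec mv -t /archive_log_file/my_other_directory {} +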

blktux 04-06-2016 07:15 AM

Quote:

Originally Posted by JJJCR (Post 5526856)
Not tested; before rolling out to production, test with dummy data.

0 20 * * 5 find /var/log/rabbitmq/ -type f -mtime +5 | mv $(xargs) /archive_log_file/my_other_directory

Try it and let us know. Cheers!!!

I'm confident that would work; however, I don't think I can use the mv command to get the files to a different machine.

I have logrotate set up to rotate the files daily and compress (.gz) files older than 5 days. Below is the cron job I plan to test for copying the compressed files to a different server for later analysis:
0 20 * * 5 rsync -av /var/log/rabbitmq/*.gz <remote_server>:/logs/rabbitmq

I also plan on having a cron job on the same machine to delete the compressed log files once they have been rsync'd to the remote server:
0 20 * * 6 find /var/log/rabbitmq/ -type f -mtime +7 -name "*.gz" -delete

Thoughts?

MadeInGermany 04-06-2016 11:51 AM

One problem you can solve by combining the two steps into a single cron job:
Code:

0 20 * * 5 rsync -av /var/log/rabbitmq/*.gz <remote_server>:/logs/rabbitmq && find /var/log/rabbitmq/ -type f -mtime +7 -name "*.gz" -delete
I.e. the delete part runs only after the transfer has ended successfully.

MadeInGermany 04-06-2016 12:02 PM

Perhaps your rsync has the wonderful --remove-source-files option; then you can solve all the potential problems at once:
Code:

0 20 * * 5 rsync --remove-source-files -av /var/log/rabbitmq/*.gz <remote_server>:/logs/rabbitmq

hortageno 04-06-2016 12:48 PM

... and if you really want to archive these log files, you should rename them by adding a time stamp or similar to the filename. Otherwise you will overwrite your logfiles with the new version every day.
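
With logrotate, the dateext option does this; a sketch, assuming a reasonably recent logrotate (maxsize needs 3.8 or newer) and a made-up retention count:

Code:

/var/log/rabbitmq/*.log {
    daily
    maxsize 100M      # rotate early once the log exceeds 100M
    rotate 30         # hypothetical retention count
    dateext           # appends -YYYYMMDD to every rotated file
    compress
    delaycompress     # keep the most recent rotated file uncompressed
    missingok
    notifempty
}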

JJJCR 04-06-2016 09:25 PM

you can try this:

Code:

tar cvzf - /path_to/local/folder | ssh <user_name>@<remote_server> "dd of=/path_on/remote/server/file_name_data.tar.gz"
But if you set it up in cron, you need to use ssh keys or sshpass to supply the password without user intervention.
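
For example, the sshpass variant could look like this (hypothetical paths; the password sits in a plain-text file, so ssh keys are the safer choice):

Code:

tar cvzf - /path_to/local/folder | sshpass -f /root/.ssh_pass ssh <user_name>@<remote_server> "dd of=/path_on/remote/server/file_name_data.tar.gz"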

blktux 04-07-2016 07:41 AM

Quote:

Originally Posted by hortageno (Post 5527169)
... and if you really want to archive these log files, you should rename them by adding a time stamp or similar to the filename. Otherwise you will overwrite your logfiles with the new version every day.

I realized this during my test. Unfortunately, I am unable to get the size limit to work with logrotate, so I am working on that before I can copy the log archives to another machine.

