09-18-2012, 04:55 AM | #1
Member
Registered: Jun 2012
Posts: 122
Rep:
How to compress and move logs older than 30 days from one folder to another
Hi,
How can I compress and move logs older than 30 days from one folder to another?
09-18-2012, 05:17 AM | #2
LQ Newbie
Registered: Sep 2008
Posts: 4
Rep:
Find command, or logrotate utility
That is easy.
Google for find command examples on searching for files older than 30 days:
find /storage/current/dbdumps/ -type f -mtime +30 -print | xargs -I {} mv {} /storage/archive/dbdumps
Or use logrotate (a built-in Linux utility), which does all of it for you, including clean-up :-)
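If logrotate is an option, a minimal sketch of a drop-in config is below. The file name /etc/logrotate.d/dbdumps and the paths/extension are assumptions to adapt, and note that olddir normally has to be on the same filesystem as the logs being rotated:
Code:
# Hypothetical /etc/logrotate.d/dbdumps -- rotate monthly, keep 12 compressed copies
/storage/current/dbdumps/*.log {
    monthly
    rotate 12
    compress
    missingok
    notifempty
    olddir /storage/archive/dbdumps
}
Keep in mind logrotate works on a schedule per log file rather than picking files "older than 30 days", so it fits best when it manages the logs from the start.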
1 member found this post helpful.
09-19-2012, 11:33 PM | #3
Member
Registered: Jun 2012
Posts: 122
Original Poster
Rep:
Those commands find and move the complete logs, but I'm looking for a command to compress the logs and move them from source to destination in tar.gz format.
find /storage/current/dbdumps/*.logs -type f -mtime +30 -print | xargs -I {} mv {} /storage/archive/dbdumps/*.logs
1 member found this post helpful.
09-19-2012, 11:49 PM | #4
Senior Member
Registered: May 2010
Location: Palm Island
Distribution: RHEL, CentOS, Debian, Oracle Solaris 10
Posts: 1,420
Hi LittleMaster,
Here is a script I use to back up my servers' logs monthly. You can take a look at it and configure it according to your needs:
Code:
#!/bin/bash
ssh <dest. server ip> 'mkdir /logs/Linux_Server_Logs/proxy_logs/$(date -d yesterday +%b-%Y)'
#### No need for exit since the SSH session terminates after the command completion ####
rsync -avze ssh /var/log/* root@<dest. server ip>:/logs/Linux_Server_Logs/proxy_logs/$(date -d yesterday +%b-%Y)/
#### cd /root/ eliminated and put in the tar command ####
#### Two commands combined into one so they execute one after another if the first one is successful. ####
ssh <dest. server ip> 'cd /logs/Linux_Server_Logs/proxy_logs && tar -zvcf $(date -d yesterday +%b-%Y).tar.gz $(date -d yesterday +%b-%Y) && rm -rf $(date -d yesterday +%b-%Y)'
This script first logs in to the destination server and creates a directory named after the previous month.
After that, the rsync command copies the mentioned logs from the source to the destination path.
Then the client SSHes to the server again to compress the logs directory and remove the uncompressed copy.
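Since this is meant to run monthly, it would normally be driven from cron. A minimal sketch, assuming the script is saved as /root/log_backup.sh (a hypothetical path) and made executable:
Code:
# crontab -e entry: run at 00:30 on the 1st of every month
30 0 1 * * /root/log_backup.sh >> /var/log/log_backup_cron.log 2>&1
Running it on the 1st also keeps the $(date -d yesterday +%b-%Y) naming pointing at the month that just ended.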
Good Luck! 
09-19-2012, 11:59 PM | #5
Member
Registered: Jun 2012
Posts: 122
Original Poster
Rep:
Hi,
In your script you are backing up the data to a remote server. On my server the /data filesystem is 100% utilised and the root partition is only 5% utilised, so I'm trying to find the logs under /data and compress them in tar.gz format onto the /root filesystem.
find /data/*.logs -type f -mtime +30 -print | xargs -I {} mv {} -exec .tar.gz /root/logbkup/$date.tar.gz
I'm trying the above command, but it never works. Thanks for sharing your script; it will really help when I need to take a log backup to a remote destination server.
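For reference, one way to do the compress-and-move in a single pipeline is to feed the find results straight into tar. This is only a sketch, assuming GNU find and GNU tar (for -print0, --null and --remove-files) and the *.logs naming used above:
Code:
# Sketch only: archive *.logs files under /data older than 30 days into a dated
# tar.gz under /root/logbkup, then delete the originals that were archived.
mkdir -p /root/logbkup
find /data/ -type f -name '*.logs' -mtime +30 -print0 \
    | tar -czvf /root/logbkup/$(date +%Y.%m.%d).tar.gz --null -T - --remove-files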
09-20-2012, 12:16 AM | #6
Member
Registered: Aug 2011
Location: Chennai, India
Distribution: Red Hat, CentOS, Ubuntu, Debian
Posts: 558
Rep:
Hi LittleMaster,
This is the script I have been using to back up my logs to a remote NAS server. I have modified it to fit your need:
Code:
#!/bin/bash
# Script purpose: log backup
# Exits with zero if no error.
# Step 1: timestamp used to name the backup folder and archive
date=$(/bin/date "+%Y.%m.%d.%H.%M.%S")
# Step 2: create the staging folder under /root/logs and the archive folder
mkdir -p /root/logs/$date /root/logarchive
# Step 3: find logs older than 30 days under /data and move them to the staging folder
find /data/ -type f -iname '*.logs' -mtime +30 -print | xargs -I {} mv {} /root/logs/$date
# Step 4: compress the staged logs into a tar.gz archive
tar -cvzf /root/logarchive/$date.tar.gz /root/logs/$date
# Step 5: remove the uncompressed staging folder
rm -rf /root/logs/$date
# Step 6: record the backup status
echo "$(date) log backup completed successfully" >> /root/logarchive/logbackup-status-$date.log
# Step 7: mail address to notify (replace the placeholder with the real address)
mailid="<user>@gmail.com"
# Step 8: send a notification mail with the backup status as the body
mail -s "$(date) Logs Moved Successfully on $(hostname)" "$mailid" < /root/logarchive/logbackup-status-$date.log
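One cautious variation (my own suggestion, not part of the script above): remove the staging folder only after the archive has been written and can be read back, by chaining steps 4 and 5:
Code:
# Only delete the staging copy if tar succeeded and the archive lists cleanly
tar -cvzf /root/logarchive/$date.tar.gz /root/logs/$date \
    && tar -tzf /root/logarchive/$date.tar.gz > /dev/null \
    && rm -rf /root/logs/$date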
Last edited by jsaravana87; 09-20-2012 at 12:36 AM.
09-20-2012, 12:30 AM | #7
Senior Member
Registered: May 2010
Location: Palm Island
Distribution: RHEL, CentOS, Debian, Oracle Solaris 10
Posts: 1,420
LittleMaster,
The script given by arun is the correct one; you can try it out.