LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   script need (https://www.linuxquestions.org/questions/linux-newbie-8/script-need-903151/)

saran_redhat 09-15-2011 07:00 AM

script need
 
Hi,

Can anyone tell me how to write a script that creates a dated tar.gz archive of each folder?
For example: I have a folder on my server at /home/data/. Inside this folder there are lots of folders. I want to turn each folder inside /home/data into a tar.gz archive and store it in some other path, like /root/data, with the date, using a Linux script.
This is for backup purposes. Please excuse my English.
Thanks

resolv_25 09-15-2011 08:20 AM

Here is an example (if I understand you properly).
1. Open a text editor and create a script; for example, save it as mybackup.sh (inside your /home/data/).
2. Put this inside:

Code:

#!/bin/sh

# make a date string in dd-mm-yyyy format
NOW="$(date +"%d-%m-%Y")"
# create a directory named with today's date:
mkdir /home/data/$NOW

# create an archive of /root/data as backup_date.tar.gz inside the newly created directory for today
tar cvzf /home/data/$NOW/backup_$NOW.tar.gz /root/data

3. If this has to back up a root-owned directory, make root the owner of the script; in a terminal, as root, run:
Code:

$ chown -R root:root mybackup.sh
4. Make it executable:
Code:

$ chmod +x mybackup.sh
5. Run it manually (as root):
Code:

$ ./mybackup.sh
Or put it in cron (as root) so it is executed regularly, let's say every day. Open the crontab editor as root:
Code:

$ crontab -e
Let's say every day at 23:10; paste this into the opened cron editor:

Code:

10 23 * * * /home/data/mybackup.sh
(Explanation of this line:
10 (minute) 23 (hour) * (every day of the month) * (every month) * (every day of the week)
The fields are:
minute (0-59), hour (0-23, 0 = midnight), day of month (1-31), month (1-12), weekday (0-6, 0 = Sunday), command)
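
To illustrate the field order, here are a few more example schedule lines (the script path is just the one used above; adjust it to your own):

Code:

# every day at 02:30
30 2 * * * /home/data/mybackup.sh
# every Sunday at midnight
0 0 * * 0 /home/data/mybackup.sh
# at 06:15 on the 1st of every month
15 6 1 * * /home/data/mybackup.sh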

saran_redhat 09-15-2011 08:52 AM

Quote:

Originally Posted by resolv_25 (Post 4472417)


Hi

Thanks for the reply.

What I need is this: a directory like /home/sites/ contains many website directories. I want to back up each of those directories, keeping the same name, to some other path, like a backup folder, in tar.gz format. That is what I want. I think you understand. Thanks once again

tronayne 09-15-2011 09:20 AM

If I understand what you're trying to do -- in /home/data you have sub-directories, say
Code:

/home/data
/home/data/dir01
/home/data/dir02
...
/home/data/dir99

You want to make individual archives of dir01, dir02, ..., dir99 and store them in /root/data, so you would have
Code:

/root/data
/root/data/yyyy-mm-dd-dir01.tar.gz
/root/data/yyyy-mm-dd-dir02.tar.gz
...
/root/data/yyyy-mm-dd-dir99.tar.gz

You could do this manually (where yyyy-mm-dd is the year, month and day; e.g., 2011-09-15)
Code:

cd /home/data
tar cf yyyy-mm-dd.tar *
<wait a while>
gzip yyyy-mm-dd.tar                        < compress the archive with gzip >

That archive would contain the entire tree (all the directories and files found in /home/data); it would not have the path name /home/data, only dir01, dir02, ..., dir99, so it could be extracted in another directory, /root/data in this case.

That would be the simplest way; you will have one archive file containing every sub-directory. Keep in mind that you can extract only what you want from a tar archive; i.e., you could extract dir01 from a complete archive as above without extracting anything else.
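
For example, to look inside such an archive and pull a single directory back out later (assuming the compressed archive ended up in /root/data with the name below; substitute your real file name):
Code:

cd /root/data
tar tzf 2011-09-15.tar.gz | less        < list what the archive contains >
tar xzf 2011-09-15.tar.gz dir01        < extract only dir01 >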

A little more complicated way is to archive the individual directories semi-manually
Code:

#!/bin/sh
#    we need a date stamp of the form yyyy-mm-dd
DATESTAMP=`date +%F`
#    get into the parent directory
cd /home/data
#    for each directory on this list...
for DIRNAME in dir01 dir02 dir03
do
    #    create the tar archive
    tar cf ${DATESTAMP}-${DIRNAME}.tar ${DIRNAME}
    #    compress it
    gzip ${DATESTAMP}-${DIRNAME}.tar
    #    move it to the backup directory
    mv ${DATESTAMP}-${DIRNAME}.tar.gz /root/data
done
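
After a run of that script (hypothetical names, assuming it ran on 2011-09-15 with the three directories listed above), /root/data would contain something like
Code:

/root/data/2011-09-15-dir01.tar.gz
/root/data/2011-09-15-dir02.tar.gz
/root/data/2011-09-15-dir03.tar.gz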

Now let's get a little more sophisticated (not too sophisticated, just a little).

We'd really like to do this without having to manually list the individual directories, so let's use the ls utility to list them and a little shell magic to get only what we need
Code:

#!/bin/sh
#      we need a time stamp for the archive(s) yyyy-mm-dd
TIMESTAMP=`date +%F`
cd /home/data
#      we'll use the ls utility to get a list (including files)
for DIRNAME in `ls`
do
        #      if it's a directory...
        if [ -d ${DIRNAME} ]
        then
                #      create the tar archive
                tar cf ${TIMESTAMP}-${DIRNAME}.tar ${DIRNAME}
                #      compress the archive with gzip
                gzip ${TIMESTAMP}-${DIRNAME}.tar
                #      move the compressed archive to /root/data
                mv ${TIMESTAMP}-${DIRNAME}.tar.gz /root/data
        fi
done

That's it.
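
As a side note, here is a sketch of a small variation (just an alternative, the script above works fine): instead of parsing the output of ls, the loop can iterate over the */ glob, which matches only directories, and tar's z option creates and compresses in one step:
Code:

#!/bin/sh
#      date stamp of the form yyyy-mm-dd
TIMESTAMP=`date +%F`
cd /home/data || exit 1
#      */ expands to the directory names only (each with a trailing slash)
for DIRNAME in */
do
        #      skip the literal pattern if there are no directories at all
        [ -d "${DIRNAME}" ] || continue
        #      strip the trailing slash
        DIRNAME=${DIRNAME%/}
        #      create and compress in one step, writing straight into /root/data
        tar czf "/root/data/${TIMESTAMP}-${DIRNAME}.tar.gz" "${DIRNAME}"
done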

I would recommend that you use the first method -- you'll only have one dated archive to deal with and you'll be able to extract all of it or only parts that you want. The other methods give you a large number of archive files to deal with and that may not be desirable as time goes on.

Hope this helps some.

resolv_25 09-15-2011 02:37 PM

Yes, good scripts here.

Another way to accomplish the same thing is to use a tool such as backup-manager.
After installation there is a config file, /etc/backup-manager.conf (if I remember correctly),
where you can easily choose which directories to back up and the time to live (how many days/weeks each backup will exist).
That is a good way to save storage space; otherwise, in a couple of weeks or months you may end up with a lot of data.
The same tool can also back up databases such as MySQL, and you can pipe some command through it if needed, etc.
backup-manager is run at the desired time, again as a cron job.
It can also copy the data to another server via ssh or ftp.
Very easy to customize; I think it could be suitable for this purpose.
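
From memory (so treat the exact variable names as an assumption and check the example config that ships with the package), the relevant lines in /etc/backup-manager.conf look roughly like this:
Code:

# where the archives are written (could point at /root/data)
export BM_REPOSITORY_ROOT="/root/data"
# which directories to archive
export BM_TARBALL_DIRECTORIES="/home/sites"
# how many days an archive is kept before it is purged
export BM_ARCHIVE_TTL="7"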

