LinuxQuestions.org
Old 08-06-2003, 09:18 PM   #1
klfreese
Member
 
Registered: Jul 2003
Distribution: Suse 10
Posts: 55

Rep: Reputation: 15
for loop


I am trying to write a shell script in bash that will back up, compress, and restore critical system and user files. I'm just a newbie trying to learn Linux.
 
Old 08-06-2003, 09:47 PM   #2
slapNUT
Member
 
Registered: Jun 2001
Location: Recycle Bin
Distribution: Linux & Everything else on VirtualBox
Posts: 144

Rep: Reputation: 15
You could try reading The Rute Users Tutorial.

Here's the section on for loops.
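A bare-bones for loop in bash looks something like this (the file names are only placeholders, just to show the shape of it):

for f in /etc/services /etc/hosts
do
    echo "would back up $f"
done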
 
Old 08-06-2003, 10:27 PM   #3
klfreese
Member
 
Registered: Jul 2003
Distribution: Suse 10
Posts: 55

Original Poster
Rep: Reputation: 15
I'm sorry if my post brought tears to your eyes.
 
Old 08-07-2003, 01:43 AM   #4
cIx
Member
 
Registered: Aug 2003
Location: /dev/null
Posts: 40

Rep: Reputation: 15
You can also read this guide:
http://www.tldp.org/LDP/abs/html/index.html
 
Old 08-07-2003, 08:49 PM   #5
klfreese
Member
 
Registered: Jul 2003
Distribution: Suse 10
Posts: 55

Original Poster
Rep: Reputation: 15
Thanks for the help.

I am not understanding this, which is why I'm asking. I have books and web resources; I just thought I could get some help.
 
Old 08-07-2003, 11:01 PM   #6
klfreese
Member
 
Registered: Jul 2003
Distribution: Suse 10
Posts: 55

Original Poster
Rep: Reputation: 15
Question Please

OK, I want to use a for loop to process the file by reading each line of the file. Use tar to collect these files:
/etc/httpd/conf/*
/home/*
/etc/services
/etc/xinetd/*

Pipe the files to gzip and write the output file into root's home dir as backup.{PID}, where PID = process ID of the running shell script.

Send all the output from the script to a log file in root's home dir called backup.log.

When that completes, send root an email listing the name of the backup file and reporting that the backup is complete.

And that's all.

I am really stuck; it's not making sense to me.


I really need the help!
 
Old 08-08-2003, 03:15 AM   #7
DIYLinux
Member
 
Registered: Jul 2003
Location: NL
Distribution: My own
Posts: 92

Rep: Reputation: 18
I'm still puzzled by what you want to do. Read what file, and what do you want to do with each line?

To do the actual backup, try

cd /
tar czf backup.tar.gz etc/httpd/conf home etc/services etc/xinetd/ 2> backup.log
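If you also want the archive named with the PID of the running script and all the output captured in a log, as you described, something along these lines should do it. This is only a sketch using the paths from your post:

cd /
tar czf /root/backup.$$.tar.gz etc/httpd/conf home etc/services etc/xinetd > /root/backup.log 2>&1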
 
Old 08-08-2003, 04:22 PM   #8
Electric_Blue
Member
 
Registered: Aug 2003
Distribution: RedHat
Posts: 35

Rep: Reputation: 15
I wrote a shell script that does this same thing, minus the PID in the name, but that is an interesting idea.

Instead of cutting and pasting the whole thing here, I could give you a few ideas, and if that doesn't do the trick I'll stick the whole file up for download.

This one works with .bz2 compression, which I found compresses much better (although it is more time consuming). You can use any compression you like, of course.

#*******************************
#!/bin/bash
#*******************************
#
#the day of month must be zero-padded (01), not blank-padded ( 1);
#%d below gives that, and %m gives the zero-padded month
#You can read about the options for "date" in the man pages
#and customize it for your app
#
#now call date variables for file naming
#******************************
cyear=`date +'%y'`
fyear=`date +'%Y'`
wday=`date +'%a'`
cdate=`date +'%d'`
month=`date +'%m'`
company=dba
sco=dba
dirpath=/mnt/nt/autobackup
backuppath=/mnt/central/Default
logpath=/var/log/hh
file=$company-$wday-$month-$cdate-$fyear.tar.bz2
#variables all set...now back up
#
echo Creating date file...
#this begins the log file building
#
cd $logpath/$sco
echo $backuppath/$company > datefile
echo This File created by automated Linux backup program running in cron >> datefile
echo -n "Time started: " >> datefile
date >> datefile
date
#
echo " *********Starting $sco Backup now*********** "
#
echo changing directories
cd $dirpath/$sco
echo Starting backup attempt on $backuppath/$company
# -j selects bzip2 compression (some older GNU tar releases used -I for this)
tar -cjf $file $backuppath/*
#
# Finish building the time stamp in the log file
echo -n "Time finished: " >> $logpath/$sco/datefile
date >> $logpath/$sco/datefile
# this ends log file building
# log files are overwritten every time it runs

# This last check goes to see if the file/folder we
# "think" we backed up is actually there
test -e $dirpath/$sco/$file
# $? holds the exit status of the test above: 0 means the file exists
new=$?

if [ "$new" = "0" ]
then
cat $logpath/$sco/datefile | /bin/mail -s "$sco backed up" you@your.com
echo Hooray $dirpath/$sco/$file Successfully Created!
else
echo Backup NOT completed: Go Fix it!
echo "Backup file $file was not created" | /bin/mail -s "Something is wrong with $sco" you@your.com
echo Exiting backup program for $sco
exit 1
fi
echo Exiting backup program for $sco.....
#####################

This will give you dated .tar.bz2 files in a directory.
Looks like:
dba-Fri-08-08-2003.tar.bz2
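
I run mine from cron, as the comments mention. The crontab entry is just something like the line below (the time and the path are only examples; point it at wherever you keep the script):

0 2 * * * /root/bin/backup.sh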

Hope this helps!
 
Old 08-08-2003, 05:06 PM   #9
klfreese
Member
 
Registered: Jul 2003
Distribution: Suse 10
Posts: 55

Original Poster
Rep: Reputation: 15
Here is the scenario (I'm very confused).
I've only had 6 Linux classes and the teacher is not that helpful.

I need to write a bash script called rootbackup that will back up, compress, and restore critical system and user files.

I need to create a file in root's home dir called critical-files using vi. In this file, store the fully qualified path and file names of the files to be backed up. The files are:

/etc/httpd/conf/*
/home/*
/etc/services
/etc/xinetd/*

Each entry should be on its own line.

In the script, use a for loop to process the file by reading each line of the file. Use the tar(1) or cpio(1L) command to collect the files. Pipe the files into gzip(1) to compress the archive. Write the output file into root's home dir as backup.{PID}, where PID = the process ID of the running shell script. Send all output from the shell script to a log file in root's home dir called backup.log. When the process completes successfully, send root an email reporting that the backup was taken. In the email, list the name of the backup file.

The script should take the name of the configuration file (critical-files) as a command-line parameter. If the file is not given on the command line, display a help message.
 
Old 08-08-2003, 09:18 PM   #10
slapNUT
Member
 
Registered: Jun 2001
Location: Recycle Bin
Distribution: Linux & Everything else on VirtualBox
Posts: 144

Rep: Reputation: 15
Well here's a hint.

If you had a file called list that contained this:

/etc/httpd/conf/*
/home/*
/etc/services
/etc/xinetd/*

You could get the list into a string with cat, like this:
list=`cat list`
Now the variable $list holds the same contents as the file list.

Then a basic for loop would be:
for i in $list; do
  # Now do whatever you want with the file or directory, for example:
  tar -czvf $i.tar.gz $i    # archive and compress it
  # or: gzip $i             # compress a single file in place
  # or: mv $i $i.bak        # rename it
  # or: rm -f $i            # remove it
done
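If you want a single archive named with the script's PID, like the assignment asks for, another way (just an untested sketch; it assumes root's home is /root and the list is in critical-files) is to hand the whole list to tar in one go instead of archiving each item separately:

list=`cat critical-files`
tar cf - $list 2>> /root/backup.log | gzip > /root/backup.$$
echo "backup written to /root/backup.$$" >> /root/backup.log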
 
Old 08-09-2003, 11:51 PM   #11
klfreese
Member
 
Registered: Jul 2003
Distribution: Suse 10
Posts: 55

Original Poster
Rep: Reputation: 15
Am I getting close?
Any suggestions?

#!/bin/bash
#rootbackup

#This program backs up critical files

[ "$#" -ne 1 ] && echo "Usage: rootbackup <input file list>" && exit 1
[ -f backup.[0-9]* ] && rm backup.[0-9]*
[ -f /tmp/backupfiles ] && cat /dev/null > /tmp/backupfiles

#
for i in `cat $1`
do
find $i -depth -print >> /tmp/backupfiles
done
#
cat /tmp/backupfiles | cpio -o 2>backup.log | gzip > backup.$$
msg=`echo \`ls backup.[0-9]*\``
echo "back-up file name is $msg" > junk
mail root <junk
rm junk
 
Old 08-10-2003, 03:23 PM   #12
klfreese
Member
 
Registered: Jul 2003
Distribution: Suse 10
Posts: 55

Original Poster
Rep: Reputation: 15
(Reposting the scenario from post #9 and my script from post #11 above.)
 
Old 08-10-2003, 06:56 PM   #13
klfreese
Member
 
Registered: Jul 2003
Distribution: Suse 10
Posts: 55

Original Poster
Rep: Reputation: 15
I really need some help here

Last edited by klfreese; 08-10-2003 at 11:11 PM.
 
Old 08-11-2003, 04:51 AM   #14
/bin/bash
Senior Member
 
Registered: Jul 2003
Location: Indiana
Distribution: Mandrake Slackware-current QNX4.25
Posts: 1,802

Rep: Reputation: 47
I'm trying to do this in the Cygwin bash shell, and sometimes there is stuff missing (like cpio), but it looks to me like it should work. Are you getting errors, or what?
 
Old 08-11-2003, 10:16 AM   #15
Looking_Lost
Senior Member
 
Registered: Apr 2003
Location: Eire
Distribution: Slackware 12.0, OpenSuse 10.3
Posts: 1,120

Rep: Reputation: 45
You can use the cat command, as suggested before, to read in the file. The file can contain paths to individual files or whole directories to be passed to tar, which seems to work fine when used as below, whether it's given a file or a whole directory.

files_to_be_archived=`cat $1`;

tar -cvzf backup.tar.gz $files_to_be_archived &>backup_log;

tar/gzip seems to work fine without a for loop to read each line, and &>backup_log sends all output from the tar command to backup_log.
That's basically the way I'd do it, of course with added error checking and all the other stuff... blah blah.
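If you also need the backup.{PID} name and the email from the original spec, a rough addition would be something like this (untested sketch; it assumes mail is set up and root's home is /root):

file=/root/backup.$$
files_to_be_archived=`cat $1`
tar -cvzf $file $files_to_be_archived &> /root/backup.log
echo "backup file name is $file" | mail -s "backup complete" root
# to restore later: tar -xvzf $file -C /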
 
  

