Help I need help tarring multiple files in multiple directories
I can't seem to figure out how to write a script that will tar all my bash files, in all my directories in and under HOME, but not the directories themselves and not this particular script. How can I accomplish this? Any help would be greatly appreciated.
[edit]
If your scripts don't end in .sh it'd be a bit more
complicated ... you'd have to skip the -iname bit
in find, and run -exec file on all files instead, like:
find ~ -exec file {} \; | grep -i shell | cut -d : -f 1 > ~/back_us_up.1
[/edit]
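Assembled into one runnable sketch, on a scratch directory standing in for $HOME (the archiver name and back_us_up.1 list file are the ones used above; tar's -T option, which reads filenames from a list, is GNU tar):

```shell
#!/bin/sh
# Sketch of the whole approach on a throwaway directory:
# file(1) spots the shell scripts, grep -v drops the archiver itself,
# and tar -T (GNU tar) reads the resulting file list.
TOP=$(mktemp -d)                                   # stand-in for $HOME
printf '#!/bin/sh\necho hi\n'   > "$TOP/hello"     # a script to back up
printf '#!/bin/sh\n: archive\n' > "$TOP/archiver"  # the script itself: skip it
printf 'just notes\n'           > "$TOP/notes.txt" # not a script: skip it

LIST="$TOP/back_us_up.1"
find "$TOP" -type f -exec file {} \; \
    | grep -i 'shell' \
    | cut -d : -f 1 \
    | grep -v "$TOP/archiver" > "$LIST"

tar czf "$TOP/myscripts.tar.gz" -T "$LIST" 2>/dev/null
tar tzf "$TOP/myscripts.tar.gz"   # hello is in; archiver and notes.txt are not
```

Filenames containing spaces or colons would need more care than this, but for ordinary script names it does the job.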
I am new to programming but I thought they all had to start like
#!/bin/bash
I figured the tarring would go something like
for files in $HOME | ls -l
-tar zxf backups.tar.z
then listing all the subfolders by full path name? Maybe I'm simplifying this too much. I'm just struggling with how to do this and how to tar everything but this script. This may be a dumb question, but what is the #!/bin/sh (vs. bash) line? I use the Cygwin terminal for Windows, and this is my first programming class. Any further help for this newb would be greatly appreciated.
Originally posted by VisionZ
I am new to programming but I thought they all had to start like
#!/bin/bash
On most systems sh will be a symlink to
bash, but sh is "more portable" ... and if
you keep it simple like in my example it
will work with the old Bourne (and most likely
Korn) shell as well ...
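As for what that first line is: the "#!" ("shebang") line just tells the kernel which interpreter to run the file with. A quick scratch-file demo:

```shell
# The first line of a script (the "shebang") names its interpreter.
# /bin/sh may really be bash, dash, ksh... so portable scripts
# stick to POSIX syntax.
s=$(mktemp)
cat > "$s" <<'EOF'
#!/bin/sh
echo "hello from $0"
EOF
chmod +x "$s"
"$s"   # prints: hello from <the temp file's path>
```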
Quote:
I figured the tarring would go something like
for files in $HOME | ls -l
-tar zxf backups.tar.z
then listing all the subfolders by full path name?
You could do that, but I think my approach
is more elegant, and probably faster ;)
When I tried it, I got the following errors
(archiver = my script name):
./archiver: cannot create ~/backup.1: directory nonexistent
(it repeats twice)
tar: You must specify one of the '-Acdtrux' options
Try 'tar --help' for more info
I'm using the bash shell, and like I said I am new to programming. Some of your code made sense to me; some of it I was totally lost looking at =). I am assuming that in your code, where you put grep -v <myscriptname>, that is where I put my script's name (it's called archiver) so it will not get tarred? I truly appreciate the help, as this particular problem is giving me difficulty. I'm also trying to write a script that will delete my temporary internet files, but that's a different story.
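The grep -v <myscriptname> part does exactly that: it removes any line matching the script's own name before the list reaches tar. A toy illustration with made-up paths:

```shell
# grep -v prints every line that does NOT match the pattern,
# so the archiver's own path drops out of the file list.
printf '%s\n' /home/owner/hello /home/owner/archiver /home/owner/clean \
    | grep -v 'archiver'
# prints /home/owner/hello and /home/owner/clean only
```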
Maybe something along these lines - although it keeps the directory structure:
Code:
#!/bin/sh
SCRIPTNAME=$HOME/archiver
ARCHIVE=${1:-$HOME/scripts.tar}
find "$HOME" -type f | while read FILE ; do
    BASHSCRIPT=`file "$FILE" | grep Bourne | grep -c "shell script text executable"`
    if [ "$BASHSCRIPT" -eq 1 ] && [ "$FILE" != "$SCRIPTNAME" ] ; then
        echo "Archiving $FILE in $ARCHIVE"
        tar rf "$ARCHIVE" "$FILE"
    fi
done
This uses the file command to find the (ba)sh scripts.
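Note that file(1)'s wording varies between versions (e.g. "Bourne-Again shell script" vs. "POSIX shell script"), so the grep patterns are worth checking on your own system. A quick sanity check on scratch files:

```shell
# Quick check of what file(1) reports (scratch files, names arbitrary):
tmp=$(mktemp -d)
printf '#!/bin/sh\necho hi\n' > "$tmp/demo"       # a shell script
printf 'hello world\n'        > "$tmp/plain.txt"  # ordinary text
file "$tmp/demo"        # e.g. ".../demo: POSIX shell script, ASCII text executable"
file "$tmp/plain.txt"   # e.g. ".../plain.txt: ASCII text"
file "$tmp/demo" | grep -c 'shell'   # 1 means file(1) sees it as a shell script
```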
After trying your script, it ran, but nothing happened, or I didn't see it. All my scripts that I wrote are still there, and I don't see a tar archive anywhere. I am running Cygwin for Windows; would this cause me any problems?
Well, your scripts should be left where they are, as tar only makes a tar archive and doesn't remove anything. However, I find it strange that you can't find the ~/scripts.tar archive... I don't know what cygwin does, so I can't help you with that... Can you post the output here? And your command?
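To see that appending with tar r leaves the sources untouched, here is a small scratch demo (paths made up):

```shell
# tar rf appends members to an archive; the files on disk stay put.
d=$(mktemp -d)
echo one > "$d/a"
echo two > "$d/b"
tar cf "$d/scripts.tar" -C "$d" a   # create the archive with one member
tar rf "$d/scripts.tar" -C "$d" b   # append a second member, like the loop above
tar tf "$d/scripts.tar"             # lists both: a, b
ls "$d/a" "$d/b"                    # the originals are still there
```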
I used the exact code that you had given me above, and after I ran the script it just jumps to the next prompt in my terminal. No output in the terminal at all, no echoing of
Code:
echo "Archiving $FILE in $ARCHIVE"
The way my directories are set up: Cygwin is under my C: drive, then it goes something like this:
cygwin/home/owner (which is my ~ directory), and /emacs-21.3/bin (where my scripts are saved before I move them into owner).
I tried changing the structure in the script to something like /~/home/owner/,
but that didn't work either.
I appreciate the patience; if I'm not making any sense, just let me know. Like I said before, I'm still learning.
HOLY CRAP, I don't know what happened, but here's the error message I received:
cut: not found
tar (child): ~/myscripts.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
find: 'file' terminated by signal 13 (this never stops and keeps going and going and going)
until I hit Ctrl-C.
Everything looks spot on, so am I doing something wrong? Could Cygwin for Windows be causing this?
grep: ~/back_us_up.1: No such file or directory
tar: ~/back_us_up.txt: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
If I may ask, what is this txt file? I use emacs as a text editor, not WordPad.
Also, what is back_us_up.1? I was wondering what the .1 stood for.