Linux - Software: This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
Now I want to run a command in /home/user that recursively extracts all rar archives below that directory.
I have figured out this much so far: rar e -r -inul *.rar
... but ...
I would like the extracted files to stay in their respective directories. For example, files extracted from /home/user/archive2/CD1/ should end up in that same directory. With this command, all files are extracted to /home/user instead.
I also couldn't figure out how rar or unrar could delete the *.rar files after a successful extraction, and maybe also delete the .sfv file.
Distribution: Debian Etch (w/ dual-boot XP for gaming)
Posts: 282
Rep:
Sounds like the kind of thing that rar won't be able to support on its own - but a shell script should have no problem doing it. What you essentially want to do is run the 'rar e -inul *.rar' command in every directory below (and including) the current one. My first thought was to get the shell to list all these subdirectories, and run the command in each one. However, it turns out it's easier to just list every rar file below the current and perform the task directly on that.
The shell for loop performs a task on each of the lines of output in turn - so if we can generate output that lists all the files that you wish to unrar, you can simply run the following script to do what you want:
Code:
for f in <command to list the files>
do
rar e -inul "$f"
done
I leave the working out of <command to list the files> as an exercise. Hint: man find.
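For reference, a sketch of the kind of find invocation that hint points at (the pattern is quoted so the shell doesn't expand it before find sees it):

```shell
# List every .rar file below the current directory.
# Quoting '*.rar' is important: an unquoted glob would be expanded
# by the shell first if any .rar files sit in the current directory.
find . -name '*.rar'
```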
The directories can contain split .rar files (.r00 .r01 .r02 ... .rar), so what you suggested would unpack the same thing 50 times if a directory contained 50 rar volumes of the same split archive.
So your first suggestion, the one you said wasn't so easy, would be better.
Not necessarily... if you make the output only list *.rar files you'll be OK!
Although with very minor modifications you can get a list of all the directories and apply your original command to them - look at the '-type' option to find...
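A sketch of that directory-based variant (rar itself isn't run here, only listed in a comment, since it may not be installed; the subshell would keep each cd local to one loop iteration):

```shell
# List every directory below (and including) the current one;
# the original 'rar e -inul *.rar' could then be run inside each.
for d in $(find . -type d)
do
    echo "would extract in: $d"
    # ( cd "$d" && rar e -inul *.rar )   # uncomment once rar is available
done
```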
for f in `find /home/hdd1/download/ -wholename '*.rar'`
do
rar e -inul "$f"
done
It extracts the files to the directory where I ran my shell script, not to the directory where the .rar files were. So I get the same result as if I had used rar's -r option.
Ah yes, of course. In which case there are two approaches - tell the rar command where to extract the files (into the relevant directories), or change into each directory recursively and run the command as written.
Firstly, we'd have to get the subdirectory where the returned file is stored. The dirname command does this simply and effectively, and it should also be simple to append that directory to the rar command to tell it to extract there.
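To illustrate dirname's effect (the file name is just an example, using the directory from the original question):

```shell
# dirname strips the final path component, leaving the directory part:
dirname /home/user/archive2/CD1/movie.rar    # prints /home/user/archive2/CD1
```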
Recursion is trickier, and thus more interesting. Ignoring the question of whether to run it depth-first or breadth-first (since it probably doesn't matter in this case) we have the simple task of unrarring all files in the current directory, then the more complex task of running the script recursively in all subdirectories. More importantly, we only want to run in each directory once (again, we could get away with it here, but it's wasteful), so we can only run the script in each immediate subdirectory. If we ran it in all of them then a directory ./foo/bar, for example, would be processed both by the script running in . and the one running in ./foo . Bear in mind the -maxdepth option to 'find'.
The first option is definitely more practical, but the second is more elegant, if you have the time and curiosity to get it working. It's also more of a framework, since the steps involved are the same for running any command of this type recursively.
for f in `find $1 -wholename *.rar`
do
for g in `dirname $f`
do
cd $g
echo "Working in directory: $g"
`rar e -r -inul *.rar`
done
done
But it doesn't work; it does nothing. Where did I go wrong? If I run the commands separately, e.g. the find or the dirname, I get what I expect, but as a whole it does nothing :S
For the first option, I'd have gone for something like:
Code:
for f in `find . -wholename '*.rar'`
do
rar e -inul "$f" `dirname "$f"`
done
I can't guarantee that works, since I don't have a man page for rar, but I'd imagine you can specify the location to extract to after the file to extract, if it's anything like other similar commands (e.g. tar).
As for the recursion... I'd have done it simply by recursing over the directories and extracting just the rar files in each directory each time. Probably the trickiest thing about this is that you can't use relative paths for the directories (since we'll be cd-ing all over the place), so you need to construct the absolute path by saving the working directory on entry into each method. That sounds a bit complicated... it might make more sense in the script:
Code:
# Assume this file is called recurse.sh, and that it's in the path:
# We'll save the current directory, so we can assemble absolute paths
currdir=`pwd`
# We first extract all the *.rar files in this directory
echo "Working in directory: $currdir"
rar e -inul *.rar
# Now we recurse
for dir in `find . -mindepth 1 -maxdepth 1 -type d`
do
cd $currdir/`basename $dir`
recurse.sh
done
Note how basename returns just the name of the directory, which is appended to the working directory when the script was run. Each time this script is run, the currdir variable is different and specific to that script, so we can effectively work through the directory structure.
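To make the basename behaviour concrete, using the ./foo/bar example from earlier:

```shell
# basename keeps only the last path component; compare with dirname:
basename ./foo/bar    # prints bar
dirname ./foo/bar     # prints ./foo
```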
Oh, and in case there's any confusion, either of these scripts should achieve the desired result; they demonstrate different approaches to the problem.
Note: Unfortunately I can't test these scripts out at the moment, so there might be some syntax errors, etc., in them. They should give you an idea of what I had in mind, though.
#!/bin/bash
if [ $# -ne 1 ]
then
echo "You forgot to enter the directory where I should work!"
exit
fi
while
mesg="\n==============================================\n
1.. Check .sfv files.\n
2.. Unrar all files.\n
3.. Delete rar and sfv files.\n
4.. Exit
\n==============================================\n
Select: \c"
do
echo -e $mesg
read selection
case $selection in
1)
cd $1
cfv -r ;;
2)
for f in `find "$1" -wholename '*.r01'`
do
echo "Unpacking in directory: "`dirname "$f"`
rar e -inul "$f" `dirname "$f"`
done ;;
3)
for g in `find "$1" -wholename '*.r01'`
do
cd `dirname "$g"`
echo "Deleting in directory: "`dirname "$g"`
rm *.r?? *.url *.sfv imdb.nfo
rm -r Sample/
done ;;
4)
exit;;
esac
done
Thanks for all the help. I have developed it to this. Now I've got the basics of bash, and I think I can do a lot of useful things. What I wrote works flawlessly. Maybe some improvements could be made, but I think this is quite OK.
Here's my crack at it. While the previous ones were great, they missed one thing: files with spaces in the name, and also the new file-name structure some archives use, where every volume matches .*\.part[0-9]+\.rar$ instead of a single .rar followed by .r[0-9]+ volumes. The earlier scripts were therefore trying to extract every archive file in a split set rather than just the first one. Could this be optimized further?
Code:
#!/bin/bash
#####################
# filename: reunrar #
#####################
if [ -n "$1" ]; then
find "$1" -iname '*.rar' | while read -r f
do
file=`basename "$f"`
dir=`dirname "$f"`
if [[ $file =~ .*part0*1\.rar$ ]]; then
echo "$file to $dir"
nice -n 19 unrar e -o- -inul "$f" "$dir"
elif ! [[ $file =~ .*part[0-9]+\.rar$ ]]; then
echo "$file to $dir"
nice -n 19 unrar e -o- -inul "$f" "$dir"
fi
done
else
echo "reunrar [FILES DIR]"
fi
Last edited by mikelidman; 07-20-2011 at 12:48 AM.
mikelidman, thanks, it works great. A couple of things: delete the rar files from each dir as you go. Also, does it work with e.g. Alex, Robert.part1.rar, Alex, Robert.part2.rar files in dirs? Maybe put all the extracted files into a separate folder.
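For the deletion request, a minimal sketch bolted onto mikelidman's script (assumptions: unrar exits with status 0 on success, and the archive uses the classic foo.rar plus foo.r00, foo.r01, ... naming; the ${f%.rar} expansion strips the .rar suffix so the sibling volumes can be matched):

```shell
#!/bin/bash
# Extract each archive, and remove its volumes only if unrar succeeded.
find "$1" -iname '*.rar' | while read -r f
do
    dir=`dirname "$f"`
    if nice -n 19 unrar e -o- -inul "$f" "$dir"; then
        # delete this archive's volumes: foo.rar plus foo.r00, foo.r01, ...
        rm -f "$f" "${f%.rar}".r??
    fi
done
```

For .partNN.rar sets the rm pattern would need adjusting; this only covers the older split-volume naming.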