Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
The command to move is mv (move). The command to copy is cp (copy). The flag for recursive copying is -R (or -r). The command to delete is rm (remove). To delete a directory and all its subdirectories, add the recursive flag (-r); -f (force) suppresses the prompts.
Originally posted by Thoreau The command to move is mv (move). The command to copy is cp (copy). The flag for recursive copying is -R (or -r). The command to delete is rm (remove). To delete a directory and all its subdirectories, add the recursive flag (-r); -f (force) suppresses the prompts.
Thanks, but I already knew that much... you missed the bit about "selective".
Say, for example, I have 10,000 files and only 1,000 end in .txt. How do I *selectively* move just those files that end in .txt?
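For what it's worth, the selective move being asked about can be sketched with a single find command (a sketch of my own - src/ and dst/ are made-up example directories, not anything from this thread):

```shell
# Move only the files ending in .txt from src/ to dst/
# (src/ and dst/ are hypothetical directories for demonstration).
mkdir -p src dst
touch src/a.txt src/b.txt src/c.log
# find does the selecting, mv does the moving; c.log stays behind
find src -maxdepth 1 -type f -name '*.txt' -exec mv {} dst/ \;
```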
if [ $# -ne 3 ] ; then
echo " $0 <sourcedir> <targetdir> <condition>"
echo " Make sure to put the condition into apostrophes to avoid it"
echo " being interpreted by your shell"
echo " $0 /usr/src/foo /usr/src/bar 'baz*'"
exit 1
fi
source="$1" ; target="$2" ; condition="$3"
if [ ! -d "$source" ] ; then
echo "ERROR: $source is not a directory!"
exit 1
fi
if [ ! -d "$target" ] ; then
echo "ERROR: $target is not a directory!"
exit 1
fi
find "$source" -name "$condition" -exec mv {} "$target" \;
Well, close but no cigar so far. I had to test xcopy to see that the 'globbing' (incorrectly, from a logic standpoint) does what the original post suggested. The delete command in Windows does the same logically fallacious thing.
To selectively copy, recursively, in Unix there is NOT a single hybrid command that accomplishes this. BUT the beauty of Unix is that the toolkit is robust. Break the job into smaller parts.
First, identify the files that need copying.
Then select those files (that is what 'globbing' short-cuts - it combines these two steps).
Then MOVE them, rather than copying, so that no subsequent 'del' is required.
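Those three steps, spelled out as separate commands (a sketch of my own - srcdir/, dstdir/ and the /tmp list-file names are invented for the example):

```shell
mkdir -p srcdir dstdir
touch srcdir/one.txt srcdir/two.txt srcdir/skip.log
# 1. identify: list every file under the source tree
find srcdir -type f > /tmp/all.list
# 2. select: keep only the names matching the pattern
grep '\.txt$' /tmp/all.list > /tmp/sel.list
# 3. move: no subsequent 'del' needed
while read f; do mv "$f" dstdir/; done < /tmp/sel.list
rm /tmp/all.list /tmp/sel.list
```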
HOWEVER: The most robust manner to copy files from location A to location B in Unix is the old standby 'tar'.
The most robust manner of identifying files, recursively, is find.
The easiest way of simulating globbing (i.e. wildcard matching) is to use pattern matching - e.g. patterns in grep/egrep/fgrep.
(Pattern matching is far more powerful than globbing.)
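A small illustration of that extra power (my own example, not from the post): egrep alternation selects two different extensions in one expression, something a single plain glob cannot do:

```shell
# Alternation in an egrep pattern matches .mp3 OR .ogg at once;
# a single standard glob wildcard cannot express this.
printf '%s\n' song.mp3 tune.ogg notes.txt > /tmp/names.list
egrep -c '\.(mp3|ogg)$' /tmp/names.list   # prints 2
rm /tmp/names.list
```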
Avoid globbing with 'thousands of files' and 'hundreds of directories'. Your shell can go belly up. Csh used to have a limit of around 500,000 bytes matched by globbing; bsh was well over 1,000,000 bytes. It is a difficult limitation to identify unless one is familiar with it. Not to mention that globbing that much is extremely slow. Base-level Unix tools are much more efficient, and tool pattern matching is both more efficient and more flexible than shell globbing.
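One way around both the slowness and the size limit (my suggestion, not from the post - note the backtick substitution used below has the same argument-length ceiling) is to stream the file list through xargs, which batches arguments under the system limit:

```shell
mkdir -p bigdir
touch bigdir/a.mp3 bigdir/b.mp3 bigdir/keep.txt
# rm bigdir/*.mp3 expands in the shell and can overflow the
# argument limit; find | xargs never builds one giant command line
find bigdir -name '*.mp3' | xargs rm
```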
find . > /tmp/t0
egrep '\.mp3$' /tmp/t0 > /tmp/t1
tar -cf - `cat /tmp/t1` | ( cd d:/newdirectory ; tar -xf - )
rm -i /tmp/t0 /tmp/t1 # clean up one's cruft
# Finished - note - the above is 'overkill' - and ONLY covers the 'xcopy' equiv.
# The detail was provided above so that the following equivs might make sense, as to method.
On one line - the 'xcopy' equiv:
musicfiend> cd c:/mp3 ; tar -cf - `find . | grep '\.[Mm][Pp]3$'` | ( cd d:/newdirectory ; tar -xf - )
On one line - the 'del' equiv (complete with verification of every single file to be removed *s* ):
musicfiend> cd c:/mp3 ; rm -i `find . | grep '\.[Mm][Pp]3$'`
AVOID LIKE THE PLAGUE EVER, EVER, EVER using the 'recursive' flag on rm.
It will eventually nail you, big time. (The -i flag is overkill - use Method 1 with the file-list cat for sure-fire safe removal of the files. It gives you a file that can be examined first.)
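Method 1's file-list removal, spelled out (a sketch following the /tmp/t0 - /tmp/t1 pattern above; the music/ directory is invented for the example, and -i is dropped here so the sketch runs unattended):

```shell
mkdir -p music
touch music/a.mp3 music/keep.txt
find music > /tmp/t0
egrep '\.mp3$' /tmp/t0 > /tmp/t1
# examine /tmp/t1 by hand first; then remove exactly what it lists
rm `cat /tmp/t1`
rm /tmp/t0 /tmp/t1
```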
Safer, faster, more precise control in Unix.
Yes, SEEMINGLY more complex, but not really. The 'combining principles' are the same for ALL shells and ALL tools - allowing the same basic 'custom' design to be used for a multitude of solutions and combinations via different tools.
Attitude difference: Tiny tools to build complex, extremely flexible solutions (tar excluded from tiny and uncomplex),
or complex tools with zillions of options, and very specific, inflexible usage.
I just put a short perl wrapper around File::Xcopy in 'xcopy' in /usr/local/bin, and now have xcopy for bash.
Originally posted by DConfusion I had to test xcopy to see that the 'globbing' (incorrectly, from a logic standpoint) does what the original post suggested. The delete command in Windows does the same logically fallacious thing.
Hmm... I guess I could stick File::DosGlob in there.
There is no single right way to do anything in Unix.
I prefer the 'basic' tool kit approach, and really never, ever wish to 'duplicate' dos tools. *w*
Although a perl twinkie myself, some tasks are better done with simplicity. I use perl (it started as a bsh script) for an extensive, exhaustive, robust clean-up between two semi-duplicated directories. I have never found anything that really comes close to it in dos - and I have looked at several dozen so-called 'synchronizing' or 'cleanup' programs. None of them really do the job properly, nor robustly. BUT - the perl script still goes down to the shell tool level for safety (sacrificing some speed). Perl handles all the bookkeeping, which it does beautifully.
The small tool approach I outlined will work from the command line of even the most basic of installs - i.e. a stripped-down system. Most of those tools are de rigueur in even a base-level install.
That said - I like your style, and appreciate the work you have done.