xcopy functionality with Linux?
I need to do a selective and recursive move. The function I am after is done like this in Windows:

Code:
xcopy c:\music\*.mp3 d:\music\ /s
del *.mp3 /s

I can probably make a script using find and xargs, but I thought I would ask first. I have looked at mmv and unison (and rsync); none of those are what I am looking for. rsync is doable, but it would copy the files and then delete the source, as opposed to simply moving them (and retaining the original inode). |
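A minimal sketch of the find-based approach the question hints at (the directory names src and dst are invented for the demo). Each matching file is moved individually, with its subdirectory recreated under the destination first, so mv keeps the original inode when both trees sit on the same filesystem:

```shell
# Hypothetical demo tree
mkdir -p src/sub dst
touch src/top.mp3 src/sub/deep.mp3 src/sub/keep.txt

# Selectively and recursively move *.mp3, preserving hierarchy.
find src -name '*.mp3' | while read -r f; do
    rel=${f#src/}                       # path relative to the source root
    mkdir -p "dst/$(dirname "$rel")"    # recreate the subdirectory
    mv "$f" "dst/$rel"                  # a true move, not copy-then-delete
done
```

This simple loop breaks on filenames containing newlines; `find -print0` with a null-delimited read is the robust variant.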
The command to move is mv (move). The command to copy is cp (copy). The flag for recursive copying is -R (or -r). The command to delete is rm (remove), with -r for recursive deletion of subdirectories and -f to force (no prompting).
Code:
cp -R /usr/local/mp3 /home/spackle
mv /usr/local/mp3 /home/spackle
rm -rf /home/spackle/mp3
|
Quote:
Thanks, but I already knew that much... you missed the bit about "selective". Say, for example, I have 10,000 files and only 1,000 end in .txt; how do I *selectively* move just those files that end in .txt? |
mv /usr/local/mp3/*.txt /home/spackle/mp3
|
rofl... OK, now you missed the recursive part....
If I have hundreds of subdirectories, then how do I do it? Of course many of those directories will contain the files I want to *selectively* *recursively* move...

{EDIT} http://www.shellscripts.org/project/mvpartial

I found an answer:

Code:
#!/bin/bash

if [ $# -ne 3 ] ; then
    echo "Usage:"
    echo "  $0 <sourcedir> <targetdir> <condition>"
    echo "  Make sure to put the condition into apostrophes to avoid it"
    echo "  being interpreted by your shell"
    echo
    echo "Example:"
    echo "  $0 /usr/src/foo /usr/src/bar 'baz*'"
    echo
    exit 1
fi

source=${1%/}
target=${2%/}
regex=$3

if [ ! -d "$source" ] ; then
    echo "ERROR: $source is not a directory!"
    exit 1
fi

if [ ! -d "$target" ] ; then
    echo "ERROR: $target is not a directory!"
    exit 1
fi

while read file ; do
    dirname=$( dirname "$file" )
    dirname=${dirname#${source}/}
    mkdir -p "${target}/${dirname}"
    mv "${file}" "${target}/${dirname}"
done < <( find "$source" -name "$regex" ) |
Quote:
Code:
mv /src/path/**/*.txt /dest/path |
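For readers without zsh: bash 4+ offers a similar recursive glob behind the globstar option. A small sketch with invented directory names:

```shell
shopt -s globstar        # bash 4+; in zsh, ** works out of the box
mkdir -p src/deep
touch src/a.txt src/deep/b.txt
matches=(src/**/*.txt)   # recursive expansion, like the zsh example
```

Note the caveat raised in the next post still applies: a flat `mv src/**/*.txt dest/` collapses the directory hierarchy.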
Oh, very nice, thank you... we are almost there =)
The zsh recursive globbing works, but it does not retain the original directory hierarchy. Example:

Code:
[ew@client01]~/raid% find source
source
source/test.txt
source/2ndlevel
source/2ndlevel/test.txt
[ew@client01]~/raid% mv source/**/*.txt dest/
mv: will not overwrite just-created `dest/test.txt' with `source/test.txt'
|
Quote:
Code:
zmv -p mvmkdir -w '/src/path/**/*.txt' /dest/path/'$1$2'

Code:
mvmkdir() { [[ ! -d $3 ]] && mkdir $3 && mv $@; }

Bugger, ignore the above mvmkdir function -- brain cramp. This should be correct:

Code:
mvmkdir() |
Perl's File::Xcopy is a DOS xcopy on steroids.
http://search.cpan.org/~geotiger/Fil...-0.12/Xcopy.pm |
re: xcopy & del ->
Well, close but no cigar so far. I had to test xcopy to see that its 'globbing' really did what the original post suggested (incorrect from a logic standpoint, but there it is). The del command in Windows does the same logically fallacious thing.
To selectively and recursively copy in Unix, there is NOT a single hybrid command that accomplishes such. BUT, the beauty of Unix is that the toolkit is robust. Break the job into smaller parts. First, identify the files that need copying. Then select those files (that is what 'globbing' short-cuts - combining these two issues). Then MOVE them, rather than copying, so that no subsequent 'del' is required.

HOWEVER: The most robust manner to copy files from location A to location B in Unix is the old standby 'tar'. The most robust manner of identifying files, recursively, is find. The easiest manner of simulating globbing (i.e. wildcard matching) is to use pattern matching - e.g. patterns in grep/egrep/fgrep. (Pattern matching is far more powerful than globbing.)

Avoid globbing with 'thousands of files' and 'hundreds of directories'. Your shell can go belly up. Csh used to have a limit of around 500,000 bytes being matched by globbing; bsh was well over 1,000,000 bytes. Always a difficult limitation to identify, unless one is familiar with it. Not to mention that globbing that much is extremely slow. Base-level Unix tools are much more efficient, and tool pattern matching is much more efficient and flexible than shell globbing.

Method 1:

Code:
cd c:/mp3
find . > /tmp/t0
egrep '\.mp3$' /tmp/t0 > /tmp/t1
tar -cf - `cat /tmp/t1` | ( cd d:/newdirectory ; tar -xf - )
rm -i /tmp/t0 /tmp/t1   # clean up one's cruft
# Finished - note - the above is 'overkill' - and ONLY covers the 'xcopy' equiv.
# The detail was provided so that the following equivs might make sense, as to method.

On one line - the 'xcopy' equiv:

Code:
musicfiend> cd c:/mp3 ; tar -cf - `find . | grep '\.[Mm][Pp]3$'` | ( cd d:/newdirectory ; tar -xf - )

On one line - the 'del' equiv (complete with verification of every single file to be removed *s*):

Code:
musicfiend> rm -i `find c:/mp3 | grep '\.[Mm][Pp]3$'`

AVOID LIKE THE PLAGUE EVER, EVER, EVER using the 'recursive' flag on rm. It will eventually nail you, big time.
(The -i flag is overkill - use Method 1 with the file-list cat for sure-fire safe removal of the files. It gives you a file to be examined first.)

Safer, faster, more precise control in Unix. Yes, SEEMINGLY more complex, but not really. The 'combining principles' are the same for ALL shells and ALL tools - allowing the same basic 'custom' design to be used for a multitude of solutions and combinations via different tools.

Attitude difference: tiny tools to build complex, extremely flexible solutions (tar excluded from tiny and uncomplex), or complex tools with zillions of options and very specific, inflexible usage. Just my opinion from the soap box. |
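The tar-pipe idea above can be tried end-to-end on a throwaway tree (music and newdir are invented demo names; note the archive must go to stdout with -f -):

```shell
# Hypothetical demo tree
mkdir -p music/albums newdir
touch music/one.mp3 music/albums/two.mp3 music/readme.txt

# Copy only the .mp3 files, hierarchy intact, via a tar pipe.
( cd music && tar -cf - `find . | grep '\.[Mm][Pp]3$'` ) | ( cd newdir && tar -xf - )
```

The sources remain in place afterwards (tar copies), which is exactly why the method pairs the copy with a separate, verifiable removal step.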
Re: re: xcopy & del ->
I just put a short Perl wrapper around File::Xcopy in 'xcopy' in /usr/local/bin, and now have xcopy for bash.
Quote:
*will stop perl advocacy now* :-D |
Re: perl advocacy
LOL -
Yes, one can do it that way. Another Unix axiom - from Larry, I believe: there is no single right way to do anything in Unix.

I prefer the 'basic' toolkit approach, and really never, ever wish to 'duplicate' DOS tools. *w* Although a Perl twinkie myself, some tasks are better done with simplicity.

I use Perl (started as a bsh script) for an extensive, exhaustive, robust clean-up between two semi-duplicated directories. I have never found anything that really comes close to it in DOS - and I have looked at several dozen so-called 'synchronizing' or 'cleanup' programs. None of them really do the job properly, nor robustly. BUT - the Perl script still goes down to the shell-tool level for safety (sacrificing some speed). Perl handles all the bookkeeping, which it does beautifully.

The small-tool approach I outlined will work from the command line of even the most basic of installs - i.e. a stripped-down system. Most of those tools are de rigueur in even a base-level install.

That said - I like your style, and appreciate the work you have done. |
To copy selected files, use cpio:
Code:
find /blah -name "*.txt" -print | cpio -pvdum /target/dir |