xargs and cp --parents
I have a list of files produced by a find/grep, that are scattered about a directory tree. I want to copy them with parent directories intact to a directory, say /mytemp.
For one file this works:

Code:
cp --parents ./use/the/force/luke.txt /mytemp

It creates /mytemp/use/the/force/luke.txt. So with xargs I tried:

Code:
cat mylist.txt | xargs cp --parents .

But I get the complaint:

Code:
cp: copying multiple files, but last argument `[path deleted]' is not a directory

Also tried, per some googling:

Code:
cat mylist.txt | xargs cp --parents {} .
cat mylist.txt | xargs cp --parents "{}" .

But those don't help either. Is there a solution for this? Other than opening my list in vi, prepending every line with cp --parents, and adding . to the end. That would be boring, even with regexp ;) Thanks, Andy

p.s. I just noticed you can type xargs with only your left hand. Just like database! So it MUST be cool. |
Quote:
Second rule --- only write scripts whose operation you understand. Don't type and run something you don't understand just to see what happens. There are a number of problems with your plan that you haven't even begun to think about. One obvious problem is that the destination directory might already have a file with the same name, which if not guarded against would cause the original to be overwritten. Another is that there might be spaces in the paths or file names. Do it this way: Code:
spath="/source/path" |
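The quoted script is cut off above, so here is only a rough, hedged sketch of the kind of guarded loop being described: all paths are invented for the example, and the GNU cp options -n (never overwrite) and --parents address the two dangers mentioned (name collisions and spaces in names).

```shell
#!/bin/bash
# Sketch only: example paths, not the poster's actual script.
spath="/tmp/cpdemo_src"        # example source tree
dest="/tmp/cpdemo_dest"        # example destination
rm -rf "$spath" "$dest"
mkdir -p "$spath/use/the force" "$dest"
echo hello > "$spath/use/the force/luke.txt"

# NUL-delimited names survive embedded spaces; cp -n refuses to
# overwrite an existing destination file, and --parents recreates
# the source directory tree under $dest.
find "$spath" -type f -print0 |
while IFS= read -r -d '' f; do
    cp -n --parents "$f" "$dest"
done
```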
ah, leave poor xargs alone
Thanks for your reply. I've used that type of approach for more complex tasks but if I can do this in one line (well, after preparing mylist.txt which is really just the output of a find), then why not.
What do you have against xargs, if it does the job? Do you have similar reservations about find -exec? Obviously there's something about xargs I don't get, or I wouldn't have posted. But I don't see a good reason to run for the hills.

The destination directory is empty, so there wasn't much to think about there. Spaces in file names are possible, I'll grant you. Using the -0 (null delimiter) option helps with that, e.g.:

Code:
$ touch "one two"
$ touch three
$ ls | xargs rm          (or: $ find . -type f | xargs rm)
rm: cannot remove `one': No such file or directory
rm: cannot remove `two': No such file or directory

But:

Code:
$ touch "one two"
$ touch three
$ find . -type f -print0 | xargs -0 rm

Both files are removed. But ya, these are just a couple of issues. If I had the types of constraints you have in mind and more, your solution would be superior. |
To use {} with xargs you need to specify -i as a parameter (modern GNU xargs deprecates -i in favour of -I '{}'), and to keep the directory tree you want --parents rather than -p (which only preserves file attributes):

Code:
.. | xargs -i cp --parents "{}" .

And there's indeed nothing wrong with xargs. I don't know what ltusp's problem with it is, either. ;} Cheers, Tink |
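A self-contained sketch of this per-file substitution approach (directory names are invented for the example; -I is the spelling modern GNU xargs prefers over -i):

```shell
#!/bin/bash
# Sketch with made-up paths; shows xargs -I substituting {} once per file.
src="/tmp/xargsi_src"; dest="/tmp/xargsi_dest"
rm -rf "$src" "$dest"
mkdir -p "$src/a/b" "$dest"
echo data > "$src/a/b/file.txt"

cd "$src"
# Each input line replaces {}, so cp runs once per file, and
# --parents rebuilds the relative path under $dest.
find . -type f | xargs -I '{}' cp --parents '{}' "$dest"
```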
In the case when the filenames may contain whitespace, you need to replace newlines with null characters, and use the corresponding -0 or --null argument of xargs.
Code:
find directory/ -type f -name "aname*" -print0 | xargs -0 cp -t destination_dir

From a list like yours, use the tr command to replace the newlines with nulls:

Code:
cat filelist.txt | tr '\n' '\0' | xargs -0 cp -t destination_dir |
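A minimal runnable sketch of the tr '\n' '\0' pipeline (file names invented for the example). Note that plain cp -t flattens everything into one directory, so --parents is added here to keep the tree the original poster asked for:

```shell
#!/bin/bash
# Sketch with example paths; the list file is newline-delimited,
# like the poster's mylist.txt.
src="/tmp/trdemo_src"; dest="/tmp/trdemo_dest"; list="/tmp/trdemo_list.txt"
rm -rf "$src" "$dest"
mkdir -p "$src/deep/dir" "$dest"
echo x > "$src/deep/dir/a file.txt"     # name contains a space on purpose

find "$src" -type f > "$list"           # newline-delimited list
# Convert newlines to NULs so xargs -0 treats each line as one argument,
# spaces and all; --parents recreates the source tree under $dest.
tr '\n' '\0' < "$list" | xargs -0 cp --parents -t "$dest"
```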
Quote:
Your original effort didn't guard against one file overwriting another, and it wouldn't work for paths with spaces in them. Apart from the fact that your command wouldn't give you what you said you wanted and posed a number of obvious dangers, it was fine. |
When it comes down to it, I'm just asking a syntax question. Wasn't presenting my actual solution (or even the entire problem).
Thanks for the script you posted; it's in my desktop folder of linux examples I keep around for reference. jschiwal's tr tip I didn't know about; I did run into that problem and re-did my find with -print0 instead. Thanks to all for your replies.

Squares more or less with what I found later on wikipedia's xargs entry, of all places:

Quote:
The above command uses -I to tell xargs to replace {} with the argument list. Note that not all versions of xargs support the {} syntax. In those cases you may specify a string after -I that will be replaced, e.g. find . -name "*.foo" -print0 | xargs -0 -I xxx mv xxx /tmp/trash

Also learned that commands that fail with "argument list too long" can be handled by xargs, which breaks the argument list into chunks, running the command as few times as possible (so it's theoretically faster than find -exec, at least when \; is used ... apparently on some systems you can terminate -exec with + and it then works more like xargs). That's a hasty explanation; better ones are out there for the googling. |
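The per-file versus batched behaviour of -exec can be seen directly; a small sketch (echo stands in for the real command, and the paths are examples):

```shell
#!/bin/bash
# Sketch: count how many times the command runs under \; versus +.
d="/tmp/execdemo"
rm -rf "$d"; mkdir -p "$d"
touch "$d/one" "$d/two" "$d/three"

# -exec ... \; runs echo once per file: three output lines.
per_file=$(find "$d" -type f -exec echo {} \; | wc -l)

# -exec ... + batches all names into one echo invocation: one line.
batched=$(find "$d" -type f -exec echo {} + | wc -l)

echo "per-file lines: $per_file, batched lines: $batched"
```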
Quote:
Xargs can concentrate a lot of arguments into a single command, making sure that the max command length is not exceeded. The other alternative is to use a cp command for each file, which is possible, but far from optimal. So, unless you give a *real* reason why xargs is the devil, I will have to conclude that this is just based on your personal preference, and not on facts. jschiwal's and Tinkster's replies gave most of the info you need to overcome the main problems that people unfamiliar with scripting sometimes have. Quote:
In general, however, I agree that the approach the OP took is maybe not the best. Quote:
There's nothing wrong with having a bias against something; we all have ours. I feel a natural aversion to a big part of the find syntax, for one. But I can't deny the usefulness of the command :) Quote:
But as I said above, I don't think that these approaches are remotely the best. Ever heard of rsync? |
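For completeness, a hedged sketch of how rsync could do the whole job (paths are invented; --files-from reads names relative to the source argument and implies --relative, so the tree is recreated under the destination; the sketch bails out if rsync isn't installed):

```shell
#!/bin/bash
# Sketch with made-up paths; requires rsync.
command -v rsync >/dev/null || exit 0   # bail out quietly if no rsync

src="/tmp/rsdemo_src"; dest="/tmp/rsdemo_dest"; list="/tmp/rsdemo_list.txt"
rm -rf "$src" "$dest"
mkdir -p "$src/a/b" "$dest"
echo hi > "$src/a/b/f.txt"

# List of paths relative to $src, which is what --files-from expects.
( cd "$src" && find . -type f ) > "$list"
rsync -a --files-from="$list" "$src" "$dest"
```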