Linux - Newbie
This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-tos, this is the place!
I have a list of files, produced by a find/grep, that are scattered about a directory tree. I want to copy them, with parent directories intact, to a directory, say /mytemp.
For one file this works:
cp --parents ./use/the/force/luke.txt /mytemp
It creates /mytemp/use/the/force/luke.txt.
So with xargs I tried
cat mylist.txt | xargs cp --parents .
But I get the complaint
cp: copying multiple files, but last argument `[path deleted]' is not a directory
Also tried, per some googling,
cat mylist.txt | xargs cp --parents {} .
cat mylist.txt | xargs cp --parents "{}" .
But those don't help either.
Is there a solution for this, other than opening my list in vi, prepending cp --parents to every line, and adding . to the end? That would be boring, even with regexps.
Thanks,
Andy
p.s. I just noticed you can type xargs with only your left hand. Just like database! So it MUST be cool.
First rule -- never use "xargs". Ever. Under any circumstances. It is a quick path to the dark side.
Second rule -- only write scripts whose operation you understand. Don't type and run something you don't understand just to see what happens.
There are a number of problems with your plan that you haven't even begun to think about. One obvious problem is that the destination directory might already have a file with the same name, which if not guarded against would cause the original to be overwritten. Another is that there might be spaces in the paths or file names.
Do it this way:
Code:
spath="/source/path"
dpath="/dest/path"
suffix_arg="\.html$"
find "$spath" -type f | grep -P "$suffix_arg" | while IFS= read -r path
do
    # extract the file name from the path
    fn="${path##*/}"
    echo "Operation: $path -> $dpath/$fn"
    # if the destination file already exists
    if [ -e "$dpath/$fn" ]
    then
        echo "Error: File $fn already present on $dpath"
    else
        echo "copying $fn to $dpath ..."
        # cp -p "$path" "$dpath/$fn"
    fi
done
If you don't need to filter by file suffix, just remove the "grep" part of the pipe. Change the suffix filter and the source and destination paths to suit your situation. Once you are satisfied that the script works, uncomment the line that performs the copy.
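For reference, a null-delimited variant of that kind of loop (a sketch, not from the thread; it is bash-specific because of read -d '', and the paths here are stand-ins created with mktemp purely for illustration) survives spaces and even newlines in file names:

```shell
#!/bin/bash
# Demo stand-ins for the source and destination paths (hypothetical).
spath=$(mktemp -d)
dpath=$(mktemp -d)
mkdir -p "$spath/sub"
echo hello > "$spath/sub/a file.html"

# -print0 emits NUL-terminated paths; read -r -d '' consumes them,
# so whitespace in a name can no longer split or mangle the path.
find "$spath" -type f -name '*.html' -print0 |
while IFS= read -r -d '' path
do
    fn="${path##*/}"
    if [ -e "$dpath/$fn" ]
    then
        echo "Error: File $fn already present on $dpath"
    else
        cp -p "$path" "$dpath/$fn"
    fi
done
```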
Thanks for your reply. I've used that type of approach for more complex tasks, but if I can do this in one line (well, after preparing mylist.txt, which is really just the output of a find), then why not?
What do you have against xargs, if it does the job? Do you have similar reservations about find -exec? Obviously there's something about xargs I don't get, or I wouldn't have posted. But I don't see a good reason to run for the hills.
Destination directory is empty, so there wasn't much to think about there.
Spaces in file names are possible, I'll grant you. Using the -0 (null delimiter) option helps with that, e.g.
$ touch "one two"
$ touch three
$ ls | xargs rm        (or: $ find . -type f | xargs rm)
rm: cannot remove `one': No such file or directory
rm: cannot remove `two': No such file or directory
But
$ touch "one two"
$ touch three
$ find . -type f -print0 | xargs -0 rm
Both files are removed.
But ya, these are just a couple issues. If I had the types of constraints you have in mind and more, your solution would be superior.
In the case when the filenames may contain whitespace, you need to replace the newlines with null characters (e.g. with tr '\n' '\0') and use the corresponding -0 or --null argument of xargs.
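That tip can be sketched as follows (a sketch with made-up paths built in a scratch directory; cp's -t option is GNU-specific and puts the destination first, which is what lets xargs append the file names at the end of the command line):

```shell
# Scratch directory with a hypothetical layout for illustration.
work=$(mktemp -d); cd "$work"
mkdir -p use/the/force dest
echo hi > "use/the/force/luke skywalker.txt"
printf '%s\n' "./use/the/force/luke skywalker.txt" > mylist.txt

# tr rewrites each newline as a NUL; xargs -0 then splits only on
# NUL, so the space inside the file name stays part of one argument.
tr '\n' '\0' < mylist.txt | xargs -0 cp --parents -t dest
```

With GNU xargs, -d '\n' is another way to make newlines the only separator without the tr step.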
Thanks for your reply. I've used that type of approach for more complex tasks but if I can do this in one line (well, after preparing mylist.txt which is really just the output of a find), then why not.
Why not? Because it leads to incomprehensible scripts. It leads to scripts that may as well be thrown away after they have filled a single purpose badly.
Your original effort didn't guard against one file overwriting another, and it wouldn't work for paths with spaces in them. Apart from the fact that your script wouldn't give you what you said you wanted and possessed a number of obvious dangers, it was fine.
When it comes down to it, I'm just asking a syntax question. I wasn't presenting my actual solution (or even the entire problem).
Thanks for the script you posted, it's in my desktop folder of linux examples I keep around for reference.
jschiwal's tr tip I didn't know about; I did run into that problem and re-did my find with -print0 instead.
Thanks to all for your replies. This squares more or less with what I found later in Wikipedia's xargs entry, of all places:
"The above command uses -I to tell xargs to replace {} with the argument list. Note that not all versions of xargs support the {} syntax. In those cases you may specify a string after -I that will be replaced, e.g. ..."
I also learned that commands which fail with "argument list too long" can be handled by xargs, which breaks the argument list into chunks and runs the command as few times as possible. That makes it theoretically faster than find -exec, at least when -exec is terminated with \; and so runs the command once per file. Apparently on some systems you can terminate -exec with + instead, and it then behaves more like xargs. That's a hasty explanation; better ones are out there for the googling.
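Both points can be sketched side by side in a scratch directory (hypothetical names, made up for illustration; cp's -t option is GNU-specific):

```shell
# Demo tree for illustration.
work=$(mktemp -d); cd "$work"
mkdir -p src/a src/b dest1 dest2
touch src/a/one.txt src/b/two.txt

# 1) -I runs cp once per file, substituting {} for each name; this
#    sidesteps the "last argument is not a directory" complaint at
#    the cost of one cp process per file.
find . -path './src/*' -type f | xargs -I {} cp --parents {} dest1

# 2) find's '+' terminator batches arguments the way xargs does,
#    invoking cp as few times as possible; -t lets the target
#    directory come before the batched file names.
find . -path './src/*' -type f -exec cp --parents -t dest2 {} +
```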
First rule -- never use "xargs". Ever. Under any circumstances. It is a quick path to the dark side.
That rule seems to be based on the fact that you don't know how to use it properly. Xargs is perfectly fine, and it is not only usable but also advisable in these circumstances, because it saves a lot of processing power.
Xargs can concentrate a lot of arguments into a single command, making sure that the maximum command length is not exceeded. The alternative is to run a separate cp command for each file, which is possible but far from optimal.
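The batching is easy to observe with xargs -n, which caps how many arguments go into each invocation (a sketch using echo as a stand-in for the real command):

```shell
# Without -n, xargs would pass all eight numbers to a single echo.
# -n 3 caps each batch at three arguments, so echo runs three times.
seq 8 | xargs -n 3 echo
# 1 2 3
# 4 5 6
# 7 8
```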
So, unless you give a *real* reason why xargs is the devil, I will have to conclude that this is just based on your personal preference and not on facts. jschiwal and Tinkster gave most of the info needed to overcome the main problems that people unfamiliar with scripting sometimes have.
Quote:
Second rule --- only write scripts whose operation you understand. Don't type and run something you don't understand just to see what happens.
I can only agree. But if he must seriously do that, I would at least advise doing it in a confined environment and taking due precautions.
In general, however, I agree that the approach the OP took is maybe not the best.
Quote:
Originally Posted by lutusp
Why not? Because it leads to incomprehensible scripts. It leads to scripts that may as well be thrown away after they have filled a single purpose badly.
The meaning of xargs is very well defined. You might be biased against it, but it's certainly no more obscure than a cat file | while read line loop, once you know the syntax, of course. And, as said, it's light-years ahead as regards efficiency.
There's nothing wrong with having a bias against something; we all have ours. I feel a natural aversion to a big part of the find syntax, for one. But I can't deny the usefulness of the command.
Quote:
Your original effort didn't guard against one file overwriting another, and it wouldn't work for paths with spaces in them. Apart from the fact that your script wouldn't give you what you said you wanted and possessed a number of obvious dangers, it was fine.
Erm... and that has exactly nothing to do with xargs. It has to do with proper quoting, and with using -i with cp or mv. I have no idea why you would blame xargs rather than cat|while read or find -exec for that.
But as I said above, I don't think that these approaches are remotely the best. Ever heard of rsync?