[SOLVED] find: Include "maxima"; exclude two folders
Linux - General: This Linux forum is for general Linux questions and discussion.
-type d matches all directories that are not under /proc and /var.
Since you use -o, the -type f and -iname tests are not constrained by the -path tests at all.
So, try again after removing -o and -type d.
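As a sketch of the usual -prune pattern (demonstrated in a throwaway directory rather than against the real / filesystem; the file names are illustrative):

```shell
#!/bin/sh
# Build a small tree to stand in for the real filesystem.
tmp=$(mktemp -d)
mkdir -p "$tmp/proc" "$tmp/var" "$tmp/etc"
touch "$tmp/proc/skip.conf" "$tmp/var/skip.conf" "$tmp/etc/keep.conf"

# The parenthesized -path group ends in -prune, so those subtrees are
# never descended into; -o hands everything else on to the file tests.
find "$tmp" \( -path "$tmp/proc" -o -path "$tmp/var" \) -prune \
     -o -type f -iname '*.conf' -print

rm -rf "$tmp"
```

Only .../etc/keep.conf is printed; the pruned subtrees are skipped entirely.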
Quote:
I keep having problems with find/grep; they seem much harder to learn than other commands.
To me as well, find sometimes looks like an incoherent hodge-podge of arbitrary options. grep, on the other hand, is based on regular expressions, which are not necessarily easy to read but are well-defined and logical. Many other utilities use regular expressions, so it is worth your while to study this topic.
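A minimal illustration of why regular expressions are well-defined (the input lines here are made up):

```shell
# ^ anchors the match to the start of the line, so only lines
# beginning with "find " (including the trailing space) are printed.
printf 'find me\nfinder\n' | grep '^find '
```

This prints only "find me"; "finder" fails the pattern because it lacks the space after "find".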
Last edited by berndbausch; 01-07-2016 at 07:21 PM.
Interesting syntax! Even though I only recently learned the prune option for rsync, it just didn't occur to me that find would also have it. Thanks a lot for sharing this.
Trying this syntax out, I decided to add a third argument (excluding themes) to the prune section:
Why, on "trial 02", did I get a slew of "Permission denied" error messages, while on "trial 01", with literally hundreds of results, the only error message I got was the expected:
Code:
`/lost+found': Permission denied
Can "find / \( 1 2 3 \) -prune" only handle 3 arguments at the most?
Last edited by andrew.comly; 01-09-2016 at 07:28 AM.
Reason: missing text
The only limit is the 128 KiB system limit on the total length of args + environment. Your shell would let you know if you were exceeding that.
None of those "Permission denied" messages appear to be from anything you have excluded with "-prune". Are you sure you weren't getting those messages before and just having them scroll off the screen because of the volume of output? Try running one of the earlier find commands again with ">/dev/null" appended so that all you see is the stderr output.
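A sketch of that check (the "Permission denied" line is simulated here, since the real messages depend on your filesystem permissions):

```shell
#!/bin/sh
tmp=$(mktemp -d)
touch "$tmp/one.desktop"

# stdout and stderr are independent streams, so >/dev/null discards
# the matches but leaves any error messages visible on the terminal.
{ find "$tmp" -iname '*.desktop'; echo 'demo: Permission denied' >&2; } >/dev/null

rm -rf "$tmp"
```

Only the simulated error line reaches the terminal, because it was written to stderr and the redirection only affects stdout.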
I must be doing something wrong with the >/dev/null at the end, because the command
That second case, with the redirection to /dev/null, is behaving as expected. I don't know why you are not seeing those same error messages otherwise. Are you perhaps running these commands in a script with stdout redirected to a file or perhaps being stored in a variable? Those messages are being sent to stderr and won't show up in the stdout stream.
Notice above how I mentioned "/media/a/Sea..." twice (scroll the codebox above to the right). This is because I have two hard drives whose labels start with "Sea_". When I try to abbreviate this with anything like "/media/a/Sea*", it produces the 'help' error message:
To new users of find, it seems that globbing doesn't work when you have only one letter followed by a wildcard. However, if you use weak (double) quotes, it can! In the following example I prune away:
1) hidden directories whose names start with "m", "c", or "s"; and
2) ffmpeg:
Finally I can also search for just the *.desktop files in all the subdirectories of /media/a/Sea_ext4/recent/AC/bckup/Install/3_UbuntuLXDE14.04/home/a/.config/cairo-dock/ that start with the two letters "th":
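A self-contained sketch of both ideas (the directory names here are made up stand-ins; the real paths in the post are much longer):

```shell
#!/bin/sh
tmp=$(mktemp -d)
mkdir -p "$tmp/.mozilla" "$tmp/.config" "$tmp/themes" "$tmp/third" "$tmp/other"
touch "$tmp/themes/a.desktop" "$tmp/third/b.desktop" "$tmp/other/c.desktop"

# Weak (double) quotes stop the shell from expanding the * itself,
# so find receives the pattern and matches it against each pathname.
find "$tmp" \( -path "$tmp/.m*" -o -path "$tmp/.c*" \) -prune \
     -o -path "$tmp/th*/*.desktop" -print

rm -rf "$tmp"
```

This prints the .desktop files under themes/ and third/ but not other/, and never descends into the pruned hidden directories.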
When you are having problems with globbing, it's always helpful to set the shell's "-x" option so that you can see what arguments are actually being passed to the command. When you run
That "/media/a/Sea_ntfs" isn't associated with any "-path" operator, so the find command takes it as a search path that should have come first, before the expression.
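For instance (run in a throwaway directory; the Sea_ names mimic the drive labels from the post):

```shell
#!/bin/sh
tmp=$(mktemp -d)
mkdir "$tmp/Sea_ext4" "$tmp/Sea_ntfs"

# With -x the shell prints each command after expansion (to stderr),
# so you can see exactly which arguments the command receives.
set -x
ls -d "$tmp"/Sea*
set +x

rm -rf "$tmp"
```

The trace line shows that the unquoted Sea* expanded to both directory names before ls ever ran; the same thing happens to an unquoted pattern handed to find, which is why find sees a stray search path instead of a pattern.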
Why does the bash shell insist on putting strong quotes around variables?
Quote:
Originally Posted by rknichols
When you are having problems with globbing, it's always helpful to set the shell's "-x" option so that you can see what arguments are actually being passed to the command. When you run
The relevant paragraph from the bash manpage is under EXPANSION:
The order of expansions is: brace expansion, tilde expansion, parameter, variable and arithmetic expansion and command substitution (done in a left-to-right fashion), word splitting, and pathname expansion.
Brace expansion is performed before variable expansion, so "{${ALPHA}}" isn't seen as a legitimate brace expansion (there's no comma inside those braces) and is left as-is. (The output from "-x" puts quotes around it since it contains characters that are special to the shell.)
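This is easy to verify interactively (ALPHA is just a demo variable; the behavior shown is bash-specific):

```shell
#!/bin/bash
ALPHA="a,b"
# At brace-expansion time the braces contain no literal comma, so
# {${ALPHA}} is left alone and only later becomes the text {a,b}:
echo {${ALPHA}}     # prints: {a,b}
# A literal comma-separated list, by contrast, is expanded:
echo {a,b}          # prints: a b
```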
You can force the expression to be re-parsed after variable expansion by using eval, but that is extremely dangerous unless you are absolutely confident that the string passed to eval cannot contain anything hostile, such as content from an outside source; an embedded command substitution would run arbitrary commands. If you are sure that the content of ALPHA is always safe, then
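The exact command from the thread wasn't captured here, but a sketch of the eval approach (with a stand-in ALPHA value; bash-specific) is:

```shell
#!/bin/bash
ALPHA="a,b"
# eval re-parses the line after ${ALPHA} has been substituted, so
# the braces now contain a comma and brace expansion takes place.
eval echo "{${ALPHA}}"    # prints: a b
```

Again, this is only safe when ALPHA is fully under your control; eval of untrusted input can execute arbitrary commands.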