Bash: recursion = bad?
Hello all,
new Bash programmer here. I've dealt with other languages in the past, but honestly I don't think that's to my advantage right now.

GOAL: Starting in a directory, search it and all subdirectories for any files containing a certain text string, then copy those files to another directory. There's about 1 GB worth of files (all email messages) I want to search through.

PROBLEM: Since I don't know the depth of the folders, my instinct was to create a recursive script. Unfortunately, I have to run this on a server... and I'm guessing it'll completely kill everything. I'm hoping there's a cleaner solution?

Code:
#!/bin/bash
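For reference, the recursive approach described in the post might look something like the sketch below. All names here (`needle`, `dest`, the mailbox path) are placeholders, not from the thread:

```shell
#!/bin/bash
# Sketch of the recursive idea: walk directories by hand and copy
# any regular file whose contents match a search string.
# "needle", "dest", and the path below are placeholder names.

search_dir() {
    local dir=$1
    local entry
    for entry in "$dir"/*; do
        if [ -d "$entry" ]; then
            search_dir "$entry"                       # recurse into subdirectories
        elif [ -f "$entry" ] && grep -q "$needle" "$entry"; then
            cp "$entry" "$dest"/                      # copy matching files
        fi
    done
}

needle="some string"
dest=/tmp/matches
mkdir -p "$dest"
search_dir /path/to/mailbox
```

As the replies below note, a plain `find`/`grep` pipeline does the same job with less machinery.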
Code:
man find
Code:
find /mnt/home/user/.mailbox/ -iname "string" -type f -exec mv '{}' /mnt/home/user/thefolder/ \;
Doesn't find's -iname just search the file name? I need to check the text inside the file itself, or perhaps I'm reading the man page wrong..?
The check is rather complicated, too. If any part of the "To" field matches three or more of a set of 10 users, it's considered a hit. I can use regular expressions to accomplish this, but again, I don't think find can do this inside a file. Or am I missing something?
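One way to sketch that "three or more of the 10 users" check: pull out the To: header and count how many of the listed users appear in it. The user names and the helper name `is_hit` below are made up for illustration:

```shell
#!/bin/bash
# Sketch: count how many known users appear in a message's To: header
# and treat three or more as a hit. The user list is illustrative.

users=(alice bob carol dave erin frank grace heidi ivan judy)

is_hit() {
    local file=$1 count=0 u to_line
    # First To: line only; folded multi-line headers are ignored here.
    to_line=$(grep -i -m1 '^To:' "$file")
    for u in "${users[@]}"; do
        if printf '%s\n' "$to_line" | grep -qi "$u"; then
            count=$((count + 1))
        fi
    done
    [ "$count" -ge 3 ]
}
```

Usage would then be something like `if is_hit message.eml; then cp message.eml "$dest"/; fi`.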
You could run the script via the nice command to reduce its impact on the server, and if execution time isn't important, you could add a sleep command to the script too.
You could use the file command to restrict the search to appropriate file types. You could also use find to run the script in each directory, but that gains little over recursion.
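The nice/sleep idea from that post, as a concrete sketch (the script name and loop body are placeholders):

```shell
# Run the search at the lowest CPU priority so other work on the
# server takes precedence; "echo" stands in for the real script here.
nice -n 19 echo "searching..."

# On Linux, ionice can similarly lower disk I/O priority:
#   ionice -c3 ./search.sh
# And a short sleep between files spreads the load over time:
for f in msg1 msg2 msg3; do
    : "process $f here"   # placeholder for the per-file work
    sleep 0.1
done
```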
Have a look at grep's -R option.
This is a recursive grep. If you combine it with the -l option, you get a list of all files containing the search string.
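For example, feeding that list into a copy loop might look like this (the paths are the ones from the earlier post, used as placeholders, and "some string" stands in for the real search text):

```shell
# src and dest are placeholders (paths borrowed from the earlier post).
src=/mnt/home/user/.mailbox
dest=/mnt/home/user/thefolder

# -R: recurse into subdirectories, -l: print only matching file names.
grep -Rl "some string" "$src" 2>/dev/null |
while IFS= read -r f; do
    cp "$f" "$dest"/
done
```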
My mistake, I misread your post and thought that you only needed to search the filenames.
Code:
find . -type f | xargs grep -s -l "test" | xargs -I {} cp {} ~/
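One caveat worth noting: that pipeline breaks on filenames containing spaces. A NUL-delimited variant of the same idea (GNU find/grep/xargs options assumed; `$dest` is a placeholder for the `~/` destination) avoids that:

```shell
# Same pipeline, NUL-delimited so filenames with spaces or newlines
# survive. -print0/-0 pass names NUL-terminated; grep -Z emits them
# the same way; xargs -r skips the command when there is no input.
dest=$HOME   # placeholder destination
find . -type f -print0 |
xargs -0 -r grep -s -l -Z "test" |
xargs -0 -r -I {} cp {} "$dest"/
```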
+1 to A. Thyssen - and if you use the -f option with the names of the 10 users in a file, you can cover that part too :)
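The -f idea sketched out as a self-contained demo (all file names here are made up; the real pattern file would list the 10 users, one per line):

```shell
# grep -f reads its patterns from a file, one per line, and a line
# matches if ANY pattern matches. Demo files are created here so the
# sketch is runnable; names are hypothetical.
d=$(mktemp -d)
printf 'alice\nbob\ncarol\n' > "$d/users.txt"
printf 'To: alice@example.com\nhello\n' > "$d/msg1"
printf 'To: zed@example.com\nhello\n'   > "$d/msg2"

# -r recurse, -l names only, -i ignore case, -f patterns from file.
grep -rli -f "$d/users.txt" "$d" --exclude=users.txt
# prints only the path of msg1
```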
Awesome, thanks guys!
I was able to rewrite it using grep -rl; that definitely cleaned things up a lot!
@grail: I wasn't able to find any documentation on the -f option; maybe I'm not looking in the right place (man grep)?
@Super: I'll come back to this; it's taking me a while to decipher. Thanks!
Try this :)
Yes, it is in 'man grep':
Code:
-f FILE, --file=FILE
    Obtain patterns from FILE, one per line.
Righto - I've restructured my program after learning about the tools you all mentioned here; thanks!
Can you post it?
It's still broken, but with other problems :p
In essence, what I did was:

1) run a find to get a list of all files in the applicable folders (current problem, see other post)
Code:
allFiles=$(find $src -print)
2) grep through the files to see if they had the string I wanted, and save the file paths of the matches to a variable
Code:
files=$(grep -il -E "^To:.*($bod|$admin)" $allFiles)
3) iterate through the matches to modify the files
Code:
for f in $files; do ...
4) then copy them over
Code:
cp "$fSrc" "$dest$fName"

Thanks again! I've learned a ton so far because of this little project!
Code:
allFiles=$(find $src -print)
Code:
files=$(grep -il -E "^To:.*($bod|$admin)" $allFiles)
Code:
for f in $files; do
You can also use this sort of loop:
Code:
while IFS="" read -r -d "" file ; do
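For completeness, that read loop is usually paired with find's -print0, which separates file names with NUL bytes so whitespace in paths can't break the loop. A minimal sketch (`$src` and the loop body are placeholders; process substitution requires bash):

```shell
#!/bin/bash
# NUL-delimited file loop: find emits names separated by \0 and
# read -d '' consumes them, so spaces and newlines in paths are safe.
src=${src:-.}                # placeholder source directory
while IFS= read -r -d '' file ; do
    : "process $file here"   # placeholder per-file work
done < <(find "$src" -type f -print0)
```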