Find hundreds of files in many directories
Hi,
I want to search for hundreds of files across many directories. Is there any way I can do this with a single command? Thanks |
What is the goal? Have you tried ls -lR, or the find command?
|
The "find" command is the usual candidate for that.
Please tell us some more about your search criteria and the scope of the search if you need a more customized answer. |
Hi all,
thanks for the reply. The goal is simple: there's a bunch (hundreds) of files that I need to search for under 9 paths. I know I can use find with -name file1 -o -name file2, which is manageable for fewer than 5 files, but hundreds? Maybe some combination of find, awk, xargs or something can help me, but I don't know how :) Thanks |
I still do not understand: do you have a list of filenames and you want to locate them, or ????
|
Quote:
Maybe I can explain more. Let's say I have a file list_of_files.txt whose contents are all the files that I need to search for.
Thanks |
Code:
for i in $(cat list_of_files.txt); do locate "$i"; done |
Quote:
Thank you for the reply. How do I define the path to be searched? |
locate has a database and searches inside it. Therefore, if this database is not up to date, it will not find those files. (You can scan your file system and refresh the database with updatedb.) If you only want matches under specific directories, you need to grep the result.
|
locate will be efficient because it will search only its own database, not the file system.
But if you need other or more limited directories to be searched, you could do it directly. If you are searching 1000s of subdirectories you may want to think about a depth-first versus breadth-first approach; by depth-first I mean searching for one of the possible file names in all directories, then the next: Code:
for fn in $(cat list_of_files.txt); do find /dir1 /dir2 /dir3 -name "$fn"; done
Or, the other way round, directory by directory: Code:
find /dir1 /dir2 /dir3 -type d -exec bash -c 'for fn in $(cat list_of_files.txt); do find "{}" -maxdepth 1 -name "$fn"; done' \;
That said, if you're only checking the name, it should be fast enough anyway, as find won't have to stat anything. (In the above, /dir1 /dir2 /dir3 is the list of directories you want to search in, and I assume, like Didier, that you've got a list_of_files.txt. If some of the possible file names contain spaces, they'll need to be quoted.) |
Note:
Code:
never do this: for file in $(cat filelist); do ... done
1. 'filelist' might be large
2. filenames might contain spaces |
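A safer pattern, as a sketch (the sample list below is made up for illustration), is to read the list line by line, which copes with spaces in names and streams the file instead of expanding it all at once:

```shell
# Hypothetical sample list -- one filename per line, one containing a space
printf 'report 2013.txt\ndata.csv\n' > filelist

# Read line by line: no word splitting, no glob expansion,
# and the list is streamed rather than loaded onto one command line
while IFS= read -r file; do
    printf 'searching for: %s\n' "$file"
done < filelist
```

The printf stands in for whatever search command you actually want (locate, find, etc.).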
Well, honestly, I would use find and then grep.
Code:
bash-4.2$ find /usr -type f > usr.txt |
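The grep half of that approach might look like this; a sketch with sample stand-ins for usr.txt and list_of_files.txt (anchoring each name after a '/' avoids substring hits, though names containing regex metacharacters would need escaping first):

```shell
# Sample stand-ins for the thread's usr.txt (find output) and
# list_of_files.txt -- replace with your real files
printf '/usr/bin/awk\n/usr/bin/sed\n/usr/share/doc/README\n' > usr.txt
printf 'awk\nREADME\n' > list_of_files.txt

# Turn each bare name into an anchored pattern: /name$
# (names containing regex metacharacters would need escaping)
sed 's|.*|/&$|' list_of_files.txt > patterns.txt

# Keep only the paths whose last component is in the list
grep -f patterns.txt usr.txt
```

With the sample data this prints /usr/bin/awk and /usr/share/doc/README, but not /usr/bin/sed.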
Do an updatedb once, and then run locate once for each file in the list.
EDIT: In other words, what Didier Spaier said. You don't need to "define the path to be searched" (and this should be apparent if you know how updatedb and locate work). |
Quote:
So it's apparent, if you know how updatedb and locate work, that you might need to define the set of paths to be searched. |
You can also try this:
Code:
while read line; do echo "Finding file $line .."; find / -name "$line" 2>/dev/null; done < list_of_files.txt |
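Rather than running find once per name, the whole list can also be folded into a single find invocation with -o between the -name tests, as the original poster suggested. A sketch with sample data (the demo directory and file names here are made up):

```shell
# Sample data standing in for the thread's list and directories
printf 'report.txt\nnotes.md\n' > list_of_files.txt
mkdir -p demo/sub
touch demo/report.txt demo/sub/notes.md

# Build "-name f1 -o -name f2 ..." in the positional parameters,
# so names with spaces stay intact as single arguments
set --
while IFS= read -r fn; do
    if [ $# -gt 0 ]; then set -- "$@" -o; fi
    set -- "$@" -name "$fn"
done < list_of_files.txt

# One pass over the tree, matching all names at once
find demo \( "$@" \)
```

This walks the directory tree only once, no matter how many names are in the list.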