Using -exec with find will be slower than using xargs, because a separate grep process is spawned for every file found. With xargs, a single grep process handles a large batch of files, which can be much more efficient, especially when you have a lot of small files.
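To make the difference concrete, here's a small sketch. The directory and file contents are made up for illustration; the point is that the -exec form runs grep once per file, while the xargs form batches file names into one grep invocation:

```shell
# Build a throwaway test tree (paths are hypothetical):
dir=$(mktemp -d)
printf '69.69.69.69\n' > "$dir/a.txt"
printf 'nothing here\n' > "$dir/b.txt"

# -exec: one grep process per file found
find "$dir" -type f -exec grep -l '69\.69\.69\.69' {} \;

# xargs: file names are batched into a single grep invocation
find "$dir" -type f | xargs grep -l '69\.69\.69\.69'

rm -rf "$dir"
```

Both print the path of a.txt; on a tree with thousands of files the xargs version avoids thousands of fork/exec round trips.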
However, the xargs command above will have problems when file names contain whitespace characters. You can ask find to delimit its output with the ASCII NUL character using the -print0 option, and tell xargs to expect this format with the -0 option.
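A quick demonstration of the whitespace problem, using a made-up file name with a space in it. Plain xargs splits the name into two bogus paths, while the -print0/-0 pair passes it through intact:

```shell
dir=$(mktemp -d)
printf '69.69.69.69\n' > "$dir/my file.txt"   # hypothetical name containing a space

# Plain xargs splits on the space, so grep looks for "my" and "file.txt" and fails:
find "$dir" -type f | xargs grep -l '69\.69\.69\.69' 2>/dev/null

# NUL-delimited pipeline handles the name correctly:
find "$dir" -type f -print0 | xargs -0 grep -l '69\.69\.69\.69'

rm -rf "$dir"
```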
In a regular expression, "." matches any single character, but here you want a literal ".". For fixed strings like this, fgrep (equivalent to grep -F) is slightly faster, so it should be preferred when you don't need regular expression patterns.
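Here's a small sketch of the difference, with made-up sample data. Because "." is a regex wildcard, plain grep also matches strings that merely have *some* character in the dot positions, while fgrep matches the literal string only:

```shell
f=$(mktemp)
printf '69.69.69.69\n69x69y69z69\n' > "$f"   # second line is a deliberate false positive

grep  '69.69.69.69' "$f"   # matches BOTH lines: "." matches x, y, z too
fgrep '69.69.69.69' "$f"   # fixed-string match: only the real IP line

rm -f "$f"
```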
One more thing: specifying -name '*.*' will only match files with a "." somewhere in their names. If you want all files, simply omit the -name option.
So here's the modified command:
Code:
find / -type f -print0 | xargs -0 fgrep -H '69.69.69.69'
The -H option tells fgrep to print the filename with every matched line, even when there is only one input file. The filename is printed by default when there is more than one input file, which is usually going to be the case with find + xargs, but I kept the option because there is a small chance that the last invocation of fgrep by xargs is passed just one file.
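To see what -H changes, here's a minimal sketch with a single made-up input file. Without -H the filename is suppressed for a lone file; with -H it is always printed:

```shell
f=$(mktemp)
printf '69.69.69.69\n' > "$f"

fgrep    '69.69.69.69' "$f"   # one input file: prints only the matching line
fgrep -H '69.69.69.69' "$f"   # prints "filename:line" even for a single file

rm -f "$f"
```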