Is that 6000 files just for March?
You could run a loop and process each day individually.
Code:
for day in {01..31}; do
cp -f "/Src/FULL_^RRRR^_IP_2013-03-$day"* /Dest/
done
Notice also how you can quote-protect the fixed parts of the string while leaving the globbing characters open for expansion. It's not really needed here, however, since there are no reserved characters in the string. But it never hurts.
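Here's a quick sketch of that quoting pattern in action, using hypothetical files in a temp directory (the caret characters aren't special to the shell, so the quotes are belt-and-suspenders here, but the glob still expands because the asterisk is left outside them):

```shell
# Demo: quote the fixed part of the pattern, leave the * unquoted.
tmp=$(mktemp -d)
touch "$tmp/FULL_^RRRR^_IP_2013-03-01.log" \
      "$tmp/FULL_^RRRR^_IP_2013-03-02.log"

# The quoted portion is taken literally; the trailing * still globs.
files=( "$tmp/FULL_^RRRR^_IP_2013-03-"* )
echo "${#files[@]}"   # number of matching files

rm -r "$tmp"
```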
You may have to get a bit fancier to avoid the unnecessary days on short months, however. Perhaps we should use an intermediate array instead.
Code:
shopt -s nullglob
for day in {01..31}; do
array=( "/Src/FULL_^RRRR^_IP_2013-03-$day"* )
[[ -n $array ]] && cp -f "${array[@]}" /Dest/
done
But that's getting to be a lot of work, and there's still no guarantee that you won't hit ARG_MAX, so I suppose find might still be the best choice.
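If you're curious what that limit actually is on your system, getconf will tell you (the value varies by platform and kernel configuration):

```shell
# ARG_MAX is the maximum number of bytes available for the argument
# list plus environment of a single exec() call.
argmax=$(getconf ARG_MAX)
echo "$argmax"
```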
I recommend modifying it to this, however:
Code:
srcdir='/Src'
fileprefix='FULL_^RRRR^_IP_'
filedate='2013-03'
destination='/destdir'
find "$srcdir" -maxdepth 1 -name "$fileprefix$filedate*" -exec cp -t "$destination" '{}' +
Note that -name matches against the filename only, never the path, so the source directory has to go in find's starting-point argument rather than in the pattern.
The '+' ending for the -exec option enables xargs-style batch processing mode. It will run only as many instances of the command as necessary to avoid the ARG_MAX limit. The main limitation is that the '{}' brackets must then come at the end of the command (it expands to the full list of filenames), so for cp or mv you need to use the -t target-directory option to move the destination to the front.
It's also recommended to avoid hard-coding data strings into your commands. Set variables at the top of your script (or pass them to it from outside), and use those in the actual commands.
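One way to do that, sketched here with a hypothetical copy_batch function (the argument names and defaults are my own, not from the original), is to take the values as positional parameters with fallback defaults:

```shell
# Hypothetical wrapper: source dir, date, and destination come from
# the command line, falling back to defaults when omitted.
copy_batch() {
    local srcdir=${1:-/Src}
    local filedate=${2:-2013-03}
    local destination=${3:-/destdir}
    find "$srcdir" -maxdepth 1 -name "FULL_^RRRR^_IP_$filedate*" \
        -exec cp -f -t "$destination" '{}' +
}
```

Called as copy_batch /some/src 2013-04 /some/dest, the same script then serves any month or directory without editing the commands themselves.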