A "rm *_#[[:digit:]][[:digit:]]" command run in the directory that holds the files would do it; each [[:digit:]] matches a single digit, so the glob matches any name ending in _# followed by two digits.
Whitespace in the names is not a problem for the glob itself, because the shell keeps each matching filename as a single argument. The spaces in a name like "Sam Spade Features email lookup_#12" only cause trouble when the names are passed between commands, for example through a plain "find | xargs rm" pipeline or a command substitution, where the name gets split into separate words: Sam, Spade, Features, email and lookup_#12.
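If you want to preview which names the glob matches before deleting anything, you can substitute echo for rm (a quick sanity check, assuming you run it in the directory that contains the files):

echo *_#[[:digit:]][[:digit:]]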
Another potential problem is if you have something like 30,000 filenames that match the pattern. The expanded command line can then exceed the kernel's argument-length limit, and rm will fail with an "Argument list too long" error.
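On most Linux systems you can check that limit (the maximum combined size, in bytes, of the arguments and environment passed to a command) with:

getconf ARG_MAX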
You can solve both problems using find and xargs. The -print0 option of find causes it to separate each result with a null character instead of a newline. The -0 option of xargs tells it to read arguments that are separated with the null character, so spaces and newlines inside filenames pass through safely.
find ./ -name "*_#[[:digit:]][[:digit:]]" -print0 | xargs -0 rm
If the current directory contains subdirectories that you don't want to traverse, then you need to add the '-maxdepth 1' argument.
find ./ -maxdepth 1 -name "*_#[[:digit:]][[:digit:]]" -print0 | xargs -0 rm
Look in the man page of xargs: options such as -n (maximum number of arguments per command) and -s (maximum command-line length) let you control how many files get passed to each invocation of rm.
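For example, to delete the files in batches of at most 100 at a time (the batch size of 100 is arbitrary, just an illustration):

find ./ -maxdepth 1 -name "*_#[[:digit:]][[:digit:]]" -print0 | xargs -0 -n 100 rm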
The find examples above assume you want to start in the current directory. You could use something like "$HOME/Documents/" instead of "./".
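For instance, to limit the cleanup to that one folder (drop -maxdepth 1 if you do want to descend into its subfolders):

find "$HOME/Documents/" -maxdepth 1 -name "*_#[[:digit:]][[:digit:]]" -print0 | xargs -0 rm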