-   Linux - Newbie
-   -   how to rm backup files recursively

wazoo 12-28-2004 03:23 PM

how to rm backup files recursively
I think as a result of my rsync attempts, I have tons of backup files -- files with names like "foobar~" or "foobar~~" etc. They are all in my home directory and subdirectories.

I tried typing "rm -Rf *~" but it only seems to work one directory down, then go no further.

Is there some command -- and I recognize that I really need to be careful about adding that tilde or I can wipe out ALL my files -- that will delete all those unnecessary backups?
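The behavior described is the shell's doing, not rm's: a plain glob like *~ is expanded by the shell and matches only names in the current directory, so rm never even sees files in subdirectories. A minimal sketch of the effect (the temporary paths are just for the demo):

```shell
# Set up a throwaway directory with a backup file at two depths.
tmp=$(mktemp -d)
mkdir -p "$tmp/sub"
touch "$tmp/top~" "$tmp/sub/deep~"
cd "$tmp"

rm -f *~        # the glob expands to top~ only
ls sub          # sub/deep~ is untouched
```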

mearley 12-28-2004 04:00 PM

I think the problem you are running into is that the shell expands the *~ glob before rm ever runs, so it only matches files in the current directory whose names end in '~'; it never descends into subdirectories.

I'm not sure how to do it with one command. In the past I wrote a short Perl script to clean unwanted files out of subdirectories.

The following script works if you run it from the top-level directory where you want to start removing files:

#!/usr/bin/perl
# List every backup file under the current directory, then remove each one.
$execline = "find . -name '*~'";

open (INPUT, "$execline |") or die "can't run find: $!";
while (<INPUT>) {
    chomp $_;
    `rm $_`;    # $_ is unquoted here, so filenames containing spaces will break
}
close (INPUT);

wazoo 12-28-2004 05:23 PM

Well, that was interesting. It missed a good many files -- but that's because I've followed the bad habit of accepting files from others with spaces in their names. For a file like "do dah day.~", your script reports several errors: it tells me it can't delete "do", "dah", or "day.~" -- because of course those files don't exist. But it got rid of a lot of them.
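The failure on "do dah day.~" can be reproduced directly in the shell: an unquoted variable is split on whitespace into separate arguments, so rm looks for three files instead of one. A small sketch (filename borrowed from the post above):

```shell
tmp=$(mktemp -d)
cd "$tmp"
touch 'do dah day.~'
f='do dah day.~'

rm $f 2>/dev/null || true   # unquoted: rm gets 'do', 'dah', 'day.~' -- all fail
ls                          # the file is still there
rm "$f"                     # quoted: one argument, file removed
```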

Thanks for the time, and the programming expertise.

Ebel 12-28-2004 06:55 PM

Perhaps you could use find:
find . -iname '*~' -exec rm -f {} \;

That might work. I haven't tested it, so you should test it on some test files first.

mearley 12-31-2004 05:18 AM

Minor adjustment to script
The following will do the trick if the previous command line does not work.

#!/usr/bin/perl
# Same approach, but $_ is now double-quoted in the rm command,
# so filenames containing spaces are handled correctly.
$execline = "find . -name '*~'";

open (INPUT, "$execline |") or die "can't run find: $!";
while (<INPUT>) {
    chomp $_;
    `rm "$_"`;
}
close (INPUT);

jschiwal 12-31-2004 05:57 AM

Try escaping the '~' character with a backslash. The shell uses ~ to represent your home directory, so that could cause problems.

This oneliner has been tested:
find ./ -name "*\~" -exec rm -v {} \;

I take it back. The '~' character doesn't expand to your home directory inside of single or double quotes, but the '\~' didn't change the result.
Single or double quotes are both ok around the *\~ filename wildcard expression. If the expression involved a variable such as
find ./ -name "$1\~" -exec rm -v {} \;
then you would need to use double quotes.
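For what it's worth, the -exec form also sidesteps the spaces problem entirely: find hands each matched name to rm as a single argument, with no shell word splitting in between. GNU find additionally offers a -delete action that removes matches itself (a sketch; -delete assumes GNU find):

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir sub
touch 'plain~' 'sub/do dah day~'

# Space-safe: each name is passed to rm intact.
find . -name '*~' -exec rm -v {} \;

# With GNU find, the same can be written without spawning rm at all:
#   find . -name '*~' -delete
```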

wazoo 12-31-2004 10:03 AM

jschiwal, that worked like a charm. One of the other suggestions I got from somebody did indeed wipe out a lot of home "." files, so I did test this one first, and it worked great.

I'm saving your command line in a file I call "clean." Then I "chmod 777 clean" to make it a routine executable. Hmm. Guess I could move it over to cron, too.
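A sketch of what that "clean" script might look like (the ~/bin location is an assumption; note that chmod 755 is enough to make it executable -- 777 lets anyone on the system edit it):

```shell
#!/bin/sh
# ~/bin/clean -- remove editor/rsync backup files under the home directory
find "$HOME" -name '*~' -exec rm -v {} \;
```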

Thanks, everybody -- that's a genuinely useful discovery that will help me keep my computer and backups free of clutter.
