@ Mike: I see now that your original solution didn't have the problem I thought it did.
Code:
sub findOldFile {
    return unless -f $_ && -M $_ > $oldestDate;
    $oldestDate = -M $_;
    $oldestName = $File::Find::name;
}
#
# Keep deleting files until partition is less than or equal to $maxSize
#
until ($partSize <= $maxSize) {
    $oldestDate = 0;    # reset so the next pass can find a new oldest file
    find(\&findOldFile, "$DiskDir");
    unlink($oldestName);
    $partSize1 = `( /bin/df '$DiskPart' | /bin/egrep -v -i filesystem )`;
    @partSizeArray = split(' ', $partSize1);
    $partSize = `( /bin/echo '$partSizeArray[4]' | /usr/bin/tr -d % )`;
}
Since you're using File::Find only to find the oldest item, your version doesn't have the logic problem I was worried about. It does have one performance issue worth thinking about, though: on every pass through the loop, find() rescans the whole directory tree just to locate the next-oldest file. Unless your script runs for a very long time (long enough for the directory's contents to change underneath it), that's redoing a lot of work for no reason. Instead, I would gather that information once, sort it, and then cycle through it with a loop.
Here's an example of what I mean:
Code:
#!/usr/bin/env perl
use strict;
use warnings;
use File::Find;

my @files;
my $dir = '/Users/hektor/unix.varia';

sub by_age {
    return unless -f $_;
    push @files, [ $File::Find::name => -M $_ ];
}

find \&by_age, $dir;

# -M is the age in days since modification, so sorting it in
# descending order puts the oldest file first.
@files = sort { $b->[1] <=> $a->[1] } @files;

for my $item (@files) {
    my $file = $item->[0];
    my $age  = $item->[1];
    print "File: $file => age: $age\n";
}
The find subroutine builds an array of arrays in @files: each subarray holds one file's complete path and its age. Then we sort the outer array by the age stored in the subarrays, in descending order, so the list runs from oldest to newest. This way all the age checks are done exactly once, and you can get on with the rest of the program: loop over the list, delete items, check the size of the partition, and break out when the overall size is small enough.
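To make that last part concrete, here's a hedged sketch of what the deletion loop could look like. The names $DiskPart and $maxSize are placeholders for your own variables, @files is assumed to be the oldest-first list built above, and I've factored the df parsing into a small sub so the egrep/tr subshells aren't needed:
Code:
```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Pull the "Use%" column (field 5) out of a df(1) data line and strip
# the trailing '%', replacing the egrep | tr pipeline in the original.
sub usage_percent {
    my ($df_line) = @_;
    my @fields = split ' ', $df_line;
    (my $pct = $fields[4]) =~ s/%//;
    return $pct;
}

# Hypothetical deletion loop -- assumes @files is sorted oldest-first
# and $DiskPart/$maxSize come from your configuration:
#
# while (@files) {
#     my ($line) = grep { !/^filesystem/i } `/bin/df '$DiskPart'`;
#     last if usage_percent($line) <= $maxSize;
#     my $oldest = shift @files;
#     unlink $oldest->[0] or warn "unlink $oldest->[0]: $!";
# }

# Demo on a canned df line:
print usage_percent("/dev/disk1 976000 849000 127000 87% /data"), "\n";  # prints 87
```
Checking with `warn` (rather than dying) on a failed unlink matters here: if a file vanished between the scan and the delete, you just move on to the next one.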
All of that said, Ghostdog's suggestion is good too. The Filesys::Df module looks pretty good. (No cap on the 's' of 'sys', by the way.)
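For what it's worth, here's a minimal sketch of that approach. Filesys::Df is a CPAN module rather than core Perl, and I'm writing this from its documented interface (df() returns a hashref whose 'per' key is the percent-used figure), so treat it as a starting point:
Code:
```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Filesys::Df is a CPAN module; guard the require so the script still
# runs (with a notice) on systems where it isn't installed.
my $have_df = eval { require Filesys::Df; 1 };

my $msg;
if ($have_df) {
    my $info = Filesys::Df::df('/');    # hashref of stats for that mount point
    $msg = "Root partition is $info->{per}% full";
} else {
    $msg = "Filesys::Df is not installed";
}
print "$msg\n";
```
That would replace both backtick calls in your loop with a single in-process lookup, which is also cheaper than forking df and tr on every iteration.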