I have a Perl CGI program to upload zipped files to a server directory.
A temporary file is created in the /var/tmp/ directory. From my Googling I have learned that this file is supposed to be deleted automatically.
In my case that is not happening, and the leftover files are steadily eating hard disk space. My client had the system up for over a month when they suddenly ran out of disk space entirely; that is when I got the call.
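For what it is worth, the files piling up do look like CGI.pm's upload buffers (CGI.pm names them CGItemp followed by digits, as far as I can tell from its source). This is how I have been checking for them; the /var/tmp path is just where they land on my setup:

```shell
# Look for leftover CGI.pm upload buffers (named CGItemp<digits>).
# The fallback echo keeps the command quiet and successful when none exist.
ls -l /var/tmp/CGItemp* 2>/dev/null || echo "no CGItemp files right now"
```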
The uploading itself is working fine.
The environment is Apache running on Linux.
My questions:
1) What could possibly be causing this? Is there a setting somewhere, whether in CGI.pm, in Apache, or in the OS, that controls this behavior?
2) If the cause cannot be determined, how do I delete the temporary file each time an upload finishes?
Question 1 is the more important of the two. I would rather find the root cause and tackle that than apply a band-aid.
Here is my code.
Form
Code:
<form action="do_upload.cgi" enctype="multipart/form-data" method="POST">
<input type="hidden" name="MAX_FILE_SIZE" value="10000000">
<input type="hidden" name="action" value="DoUpload">
Select file <input type="file" name="uploadFile" size="32"><br>
<input type="submit" value="Upload!">
</form>
Uploader code
Code:
#!/usr/bin/perl
use strict;
use warnings;
use CGI ':standard';

my $dirName    = "ZIPData";
my $uploadFile = param( 'uploadFile' );

# Some browsers send a full Windows path; keep only the last component
my @pathArr  = split( /\\/, $uploadFile );
my $fileName = $pathArr[-1];

# various validation here

open( my $upfh, '>', "$dirName/$fileName" ) or die "Cannot open($fileName): $!";
binmode( $upfh );

# CGI.pm's param value for a file field doubles as a filehandle,
# so we can read the upload from it in 1K chunks
my ( $data, $chunk );
my $fileSize = 0;
while ( $chunk = read( $uploadFile, $data, 1024 ) ) {
    print $upfh $data;
    $fileSize += $chunk;
}
close( $upfh ) or die "Cannot close($fileName): $!";
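Regarding question 2, the band-aid I have been considering: CGI.pm documents a tmpFileName() method that returns the path of the buffer file it created for an upload, so in principle I could unlink it myself right after the copy instead of relying on automatic cleanup. A sketch of what I mean (OO interface instead of the functional one above; the REQUEST_METHOD line is only there so the sketch can be run from a shell without CGI.pm dropping into its offline prompt):

Code:
```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# A real web request sets this; defaulting it lets the sketch run standalone
$ENV{REQUEST_METHOD} ||= 'GET';

my $q          = CGI->new;
my $uploadFile = $q->param('uploadFile');

# ... copy the upload into ZIPData exactly as in the script above ...

# Ask CGI.pm where it buffered the upload, then remove the buffer
# ourselves instead of waiting for the CGI object to be destroyed
my $tmpPath = defined $uploadFile ? $q->tmpFileName($uploadFile) : undef;
if ( defined $tmpPath && length $tmpPath && -e $tmpPath ) {
    unlink $tmpPath or warn "Could not unlink $tmpPath: $!";
}
```

I have not tried this in production yet, which is why I would still rather understand why the automatic deletion is not happening in the first place.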
Thanks!