I need to process about 10 GB of zipped files with a very small Perl script I'm writing. The problem is, I can't seem to get it to pipe properly. The command I'm using is this:
$ unzip -p /mnt/cdrom/MyBigFile.zip | ProxyParser.pl
Right now, it's just printing everything to stdout:
Code:
#!/usr/bin/perl -w
foreach(<STDIN>)
{
    print "$_";
}
The problem is, it seems to be unzipping the entire file into memory rather than streaming it to my Perl script. Because of the size, I'm running out of memory:
Code:
iago@Slayer:~$ dmesg | grep ProxyParser.pl
Out of Memory: Killed process 14423 (ProxyParser.pl).
Out of Memory: Killed process 14432 (ProxyParser.pl).
iago@Slayer:~$
This problem also happens on Windows, where I'm using a similar unzip program. My goal is for this to run on Windows, since that's what our server runs, but getting it working on Linux would be a good first step.
Any help is appreciated!
Thanks,
-iago
<edit> I noticed that I can pipe it into "more" just fine, but not into my Perl script, so apparently I can't use foreach(<STDIN>) to parse it in Perl. Any idea how to do it properly?
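From what I've read, foreach(<STDIN>) evaluates <STDIN> in list context, so Perl reads the whole stream into a list before the loop even starts, which would explain the memory blowup. I'm guessing I need something like this instead, reading one line at a time (haven't been able to test it against the full 10 GB file yet):
Code:
#!/usr/bin/perl -w
# Read STDIN one line at a time instead of slurping it all into a list.
# while (<STDIN>) assigns each line to $_ as it arrives, so memory use
# stays roughly constant no matter how big the unzipped stream is.
while (<STDIN>)
{
    print "$_";
}
Does that look like the right approach?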