Hey guys, thanks for the replies.
Quote:
Looking at SourceForge, the last e3comp update was almost 1 year ago.
That was definitely a trend I noticed while searching for a solution. I think one of those tools only mentioned the ext2 filesystem, and I'm running ext3.
Quote:
That's not much (my system has somewhere between 70,000 and 130,000 files, and it isn't on a 1-terabyte drive). I'd start to worry if there were a few dozen million of them.
Well, as of 7:59 AM PST, the drive is storing 394,860 files. That slightly exceeds your 130,000 :P
We're currently adding between 1,000 and 4,000 new files per day, and that number is expected to go up. At the high end, that's roughly 4,000 x 365 = 1.46 million new files per year, so whatever approach we pick has to scale.
The other catch is that we make these files downloadable via a web interface. Another drawback to gzipping each file individually is that we'd then have to gunzip every file before passing it to the user for download (end users seem to have enough trouble handling plain text files; asking them to unzip a file first would be unreasonable at this scale).
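Just to illustrate the gunzip-on-download problem, here's a rough Python sketch of what every single download would have to do if the files were stored gzipped. This is not our actual code; the handler and the path are made up for illustration:
Code:
    import gzip
    import shutil
    import sys

    def serve_plain_text(gz_path):
        # Emit a minimal header, then stream the decompressed bytes to the client.
        sys.stdout.write("Content-Type: text/plain\r\n\r\n")
        sys.stdout.flush()
        # gzip.open decompresses on the fly, so this burns CPU on every request.
        with gzip.open(gz_path, "rb") as src:
            shutil.copyfileobj(src, sys.stdout.buffer)

    serve_plain_text("/data/files/example.txt.gz")  # hypothetical path
Multiply that per-request decompression by every download and the CPU cost adds up fast.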
On top of that, this application, the scripts that retrieve the data, the web site, and the database all run on a single machine, so server load is definitely a concern.
I've set up a few CentOS 4.7 test VMs on my VM server at home, and I think the next logical step is to try a few configurations in that test environment. The one thing I can't mirror at home is that the production drive is actually a partition on a SAN, not a local hard disk array.
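To make the tests realistic, I'll need to populate each candidate filesystem with a production-like number of small files first, something along these lines (again a rough Python sketch; the count, sizes, and mount point are placeholders, not our real data):
Code:
    import os
    import random

    def populate(root, count=400000, max_bytes=64 * 1024):
        """Create `count` small files of random size under `root`."""
        for i in range(count):
            # Fan out into 1,000 subdirectories so no single directory gets huge.
            subdir = os.path.join(root, "%03d" % (i % 1000))
            os.makedirs(subdir, exist_ok=True)
            with open(os.path.join(subdir, "file_%06d.txt" % i), "w") as f:
                f.write("x" * random.randint(1, max_bytes))

    populate("/mnt/testfs")  # hypothetical mount point on the test VM
That should let me compare compression schemes and filesystem options under a realistic file count before touching production.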
I'll post an update later this week after I've done a few tests.
Thanks!