[SOLVED] Read large text files (~10GB), parse for columns, output
Programming: This forum is for all programming questions.
The question does not have to be directly related to Linux and any language is fair game.
This coming from an MCSE who gets down on his knees and prays to his gods: Ballmer and Gates. Use Linux and you don't have to spend your weekends re-installing your relatives' computers. Antivirus 2009 LOLOLOLOL.
Oh I know let's re-use the same horrible kernel over and over and just put a different UI over it. Leave the real computer science to the computer scientists and enjoy your sheltered existence at the help desk.
If it works why create a new kernel? At least I have a job. Most companies don't use linux and if they do they use it because they have no real budget. So how well does McDonalds pay?
McDonalds is a multinational corporation with more locations than whatever lame .NET fail company you work for. Which business will survive the recession?
Microsoft has billions in the bank and sells their products for a good profit. How much profit do you get for a Linux download/big mac? More Linux companies have gone under than Microsoft has sold copies of Windows.
.NET is setting the standard out there. If the original poster was smart he would use that over C or PERL. It's so much better!
If the OP paid for a Microsoft OS/compiler/rip-off, would they solve his query for free? Or would they try to nickel and dime even more money out of him? More like Windows Genuine FAILAGE, imo.
Stick to the subject, please... "Cheap beer and forums do not mix."
No, it probably won't be "better than awk."
"awk" is a very well-written program that is specialized for doing what you are doing.
All of the delays associated with this task will be mechanical ones: disk I/O time and network time. Because "awk" reads the file strictly sequentially, the operating system can recognize the access pattern and use readahead to line up lots of file buffers in advance, streamlining the operation as much as the hardware will allow.
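To make the point concrete, here is a minimal sketch of the kind of column extraction being discussed (the file contents and column numbers are made up for illustration). awk processes one line at a time, so its memory use stays constant no matter whether the input is 10 kilobytes or 10 gigabytes:

```shell
# Build a tiny sample file standing in for the real 10 GB input.
printf 'a 1 x 9 foo\nb 2 y 8 bar\n' > sample.txt

# Pull out columns 2 and 5 of each whitespace-delimited line.
awk '{ print $2, $5 }' sample.txt
# prints:
# 1 foo
# 2 bar
```

The same one-liner scales to the full-size file unchanged; only the wall-clock time grows, and that time is dominated by disk throughput, not by awk.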
If the time required to do this task is problematic to the business, then there are various things that you can do:
Invest in fast storage hardware, e.g. SATA or FireWire.
Instead of using the disk controllers built into the motherboard, buy a controller card. An inexpensive unit can make a dramatic difference.
Put the input file and the output file on different disk volumes.
Do not follow the siren that says, "put it all in memory..." Abandon all hope, ye who enter there!
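The "different disk volumes" advice above can be sketched in one line (the mount points here are hypothetical; the assumption is that they sit on separate physical drives). Reading from one spindle while writing to another keeps the input and output streams from fighting each other for seeks:

```shell
# /mnt/disk1 and /mnt/disk2 are assumed to be separate physical
# disks. The read stream and the write stream then never contend
# for the same drive head.
awk '{ print $2, $5 }' /mnt/disk1/input.txt > /mnt/disk2/output.txt
```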
Face it: when you're dealing with 10 gigabytes of data, "some things take time." If you're doing the task in "awk," and doing it well, then you are using a robust tool that was specifically designed for the task. You have not erred in the approach that you are using right now. "Diddling with it" will not improve it.
Last edited by sundialsvcs; 04-07-2009 at 09:45 AM.
For the record, it would be unfortunate to lock the whole thread. The question (How do I deal with a mega-sized file and the associated I/O problems?) is a serious one and deserves some discussion.
Last edited by Telemachos; 04-07-2009 at 09:46 AM.
Stick to the subject, please... "Cheap beer and forums do not mix."
Sorry, I just get frustrated when people reply with stupid responses that are irrelevant to the original issue ("use an interpreted language", "perl can do regex", "windows > linux", etc). The last one strikes a nerve as you can imagine ;]