help! i need advice and suggestions for fast processing
I need advice on what to do.
Here is the problem:
I want to process huge amounts of log files, and I want to know if there's a better way to do this.
I tried using Perl, and it takes a VERY LONG time to finish a 2 GB log file.
Are these plain text log files? And what are your system specs? More RAM and a faster processor would help. Define "long" — two gigs is a lot of stuff to process. Perl is the best log-processing language I have ever used. Another recommendation would be to see if there is any way to optimize your code; the little things add up. Splitting the log file into smaller chunks that could be processed concurrently would also help.
Yep, just plain log files. System specs: Pentium III 1.14 GHz, 512 MB RAM.
I can try setting up an openMosix cluster, but would that help? I'm processing many log files of about that size, and I want to find a way to process them efficiently. Can I process, say, two files at once in Perl (using clustering)?
Oh... it's probably going to take a while to process a two-gig log no matter what you do, especially because it has to be extracted from the tar archive (which, unless it was compressed, would of necessity be spread across several CDs), copied to a hard disk (I assume), and then processed. Sorry about the lack of help; I wasn't expecting that particular format for the logs (I assumed they were on the computer that generated them, which is a whole different story). Best I can tell, the part that is probably taking the longest is decompressing the logs. You could reduce the overhead on the system by processing them in non-GUI mode with only the bare minimum of processes running — but that's like telling you to drink plenty of water and get some rest.