I'm trying to make some changes to an extremely long file (hundreds of MB). Is there some kind of editor that will efficiently deal with a file of that size and allow me to make the changes, and then save them without it taking hours (as it seems to be doing with vi)? I tried to save it with nano, I think it was, last night and it crashed the computer! Not something I expected from Linux....
Strange, at most it should have crashed nano (nano is not the most stable program, that's old news).
I really wonder what you have inside a text file of many hundreds of MB. Plain text is not the most optimal format for storing big chunks of information. But that's another topic.
Kate is said to handle these well, but frankly, if you need to load and write a file of many hundreds of MB, no editor is going to be blazing fast. I've never dealt with a similar situation with plain text files.
FYI, the file is a dump of a Postgres DB, the output of pg_dumpall (so it's the schema and contents of 2 moderately big DBs). I would not normally mess with a text file of this size, but it's the way I got it, and short of redoing the dump using a more effective method (which I'd have to tell the guy how to do) and downloading it all over again, I was hoping to find an editor that would just load a modest amount into a buffer and leave the rest out on the disk....
FYI 2: I need to comment out (or delete) some lines from the file--this kind of dump is the DB equivalent of a bare-metal restore, and since I am not restoring it onto a fresh Postgres install, I need to handle some of the stuff externally rather than in the script.
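One way to find the offending lines without opening an editor at all is to stream the file through grep. This is only a sketch: the CREATE ROLE pattern is a hypothetical example of the kind of statement you might want to strip, and the printf just creates a tiny stand-in for your real dump:

```shell
# Create a tiny stand-in for the real dump file (for demonstration only):
printf 'CREATE ROLE admin;\nCREATE DATABASE app;\nSELECT 1;\n' > dump.sql

# List matching lines with their line numbers; grep streams the file,
# so memory use stays small no matter how big the dump is:
grep -n 'CREATE ROLE' dump.sql
```

On a real multi-hundred-MB dump you might pipe this through head as well, to get just the first few hits.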
I also think that scripting might be a better way to do this. You can select lines using awk, sed, grep or whatever and redirect the ones you want to a file.
Manually editing a text file of many hundreds of MB is going to take you a very, very long time. On the other hand, if you are only searching for patterns and deleting/substituting them, you can always automate that via scripting.
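As a minimal sketch of that approach: sed can comment out matching lines in a single streaming pass, writing a new file instead of loading the whole thing into memory. The CREATE ROLE / ALTER ROLE patterns here are hypothetical stand-ins for whatever statements you actually need to disable:

```shell
# Create a small stand-in for the dump file (for demonstration only):
printf 'CREATE ROLE admin;\nCREATE TABLE t (id int);\nALTER ROLE admin SET x;\n' > dump.sql

# Prefix matching lines with the SQL comment marker "--", leaving all
# other lines untouched; output goes to a new file rather than in-place:
sed -E 's/^(CREATE ROLE|ALTER ROLE)/-- \1/' dump.sql > dump.edited.sql

cat dump.edited.sql
```

Since sed processes the file line by line, this should take minutes rather than hours even on a dump of hundreds of MB, and deleting lines instead of commenting them is just a matter of using `sed -E '/^(CREATE ROLE|ALTER ROLE)/d'` instead.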