Old 08-04-2017, 08:50 AM   #1
Xeratul
Senior Member
 
Registered: Jun 2006
Location: UNIX
Distribution: FreeBSD
Posts: 2,357

Rep: Reputation: 213
Terminal ultra fast file viewer for several GBs plain text file?


Hello,

I am looking for an ultra-fast file viewer for a plain text file of several GB. My file is about 5.5 GB, and it is nothing but text.
I would like to view it, scroll through it, and use a GOTO to jump to a given line.

== VIM
I tried VIM first, but it loads the whole document into memory.

== LESS
less is quite good but also slow. It is still usable, since it does not load the (large) file into memory.

What else do we have for the terminal?

Thank you
 
Old 08-04-2017, 09:20 AM   #2
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,820
Blog Entries: 15

Rep: Reputation: 1664
The fastest scroll can be done with the "cat" command. In tests just now on a 4.4 MB file:
more (holding down space bar to end) 2m54.745s
less (holding down space bar to end) 2m57.617s
less (hitting ctrl-g to go to end=no scroll) 0m3.208s
cat (scrolls automatically from start to end) 0m1.133s

All of that is way too fast for a human to read. I'm assuming you don't want to just scroll but rather want to find text. For indeterminate text, less is the way to go, because it lets you scroll up and down from the text you find so you can see nearby lines, whereas more only allows scrolling down.
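
Since the original question also asks about jumping to a given line: less can do that straight from the command line. A small sketch (the line number and filename here are just placeholders):
Code:
# open the file and jump straight to line 20421
less +20421g bigfile.txt
# or jump straight to the end of the file
less +G bigfile.txt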

You're correct that for files that large you should NOT use vim.

If you know the text you're looking for, you could use grep to pull it out of the file. If you need multiple lines, grep has flags that print additional lines after the match. We used to have a process that scanned web/Java logs for errors and emailed any matching line followed by the next 30 lines (which gave more detail on the error). This worked well because the log in question had a limited size and would quickly overwrite itself as errors continued, so if we hadn't extracted and emailed the lines we would never have known what the original error and its detail were.
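
As a rough sketch of that kind of extraction (the log path and the "ERROR" pattern below are only placeholders, not the actual process we ran):
Code:
# grab each error line plus the 30 lines that follow it
grep -A 30 "ERROR" /var/log/app/application.log > /tmp/error_context.txt
# the old process then emailed the contents of /tmp/error_context.txt to the team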

Last edited by MensaWater; 08-04-2017 at 01:53 PM.
 
Old 08-04-2017, 01:38 PM   #3
aragorn2101
Member
 
Registered: Dec 2012
Location: Mauritius
Distribution: Slackware
Posts: 552

Rep: Reputation: 271
5.5 GB? Oh, I don't think you'll do better than less, even though it is slow. Most other applications will try to load the file into memory.

Just two things:

- If you know what you are searching for, you can use grep to look for the pattern and extract a certain number of lines around each matching line, and you can redirect the output to another file, which will be smaller and much more manageable.
- You could split the file into several smaller parts. Think about it: you're not going to read a 5.5 GB file in one go, so you don't need to open all of it at once. You can use the sed command to extract specific lines, or a range of lines, from the file and again redirect the output to build a smaller, more manageable file.

Some examples:
Code:
grep -n -B 10 -A 5 "PATTERN" FILENAME > OUTPUTFILE
This will output every matching line along with its line number, plus the 10 lines before and the 5 lines after each match.

Code:
sed -n '20421,25000p' FILENAME > OUTPUTFILE
This will output the range of lines from line 20421 through line 25000 to OUTPUTFILE.
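
One refinement worth noting: by default sed keeps reading to the end of the file even after line 25000, which matters on a 5.5 GB file. Adding a quit command makes it stop early:
Code:
# print lines 20421-25000, then quit so sed doesn't scan the remaining gigabytes
sed -n '20421,25000p;25000q' FILENAME > OUTPUTFILE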
Please read the man pages for grep and sed for much more information.
 
Old 08-04-2017, 01:44 PM   #4
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,456

Rep: Reputation: 2302
Since I'm not taking this off the zero-reply list, may I ask whether there is a text editor which will just open and allow editing of files over, for example, 4 GB? Or, indeed, any hex editors.
I am not sure of the reasons behind the original question, but I have to admit that not being able to open files larger than about 15% of my RAM size frustrates me.
 
Old 08-04-2017, 01:52 PM   #5
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,820
Blog Entries: 15

Rep: Reputation: 1664
To edit a file that size you're probably better off using sed to edit the specific line(s) than trying to pull the entire thing into an editor all at once.

http://www.unixcl.com/2010/01/sed-sa...same-file.html
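
For example, to change a single known line without opening an editor (the line number and text below are placeholders; note that sed -i still rewrites the whole file on disk, it just doesn't hold it all in memory):
Code:
# substitute "oldtext" with "newtext" on line 20421 only, editing the file in place
sed -i '20421s/oldtext/newtext/' bigfile.txt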
 
Old 08-04-2017, 02:02 PM   #6
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,456

Rep: Reputation: 2302
Quote:
Originally Posted by MensaWater View Post
To edit a file that size you're probably better off using sed to edit the specific line(s) than trying to pull the entire thing into an editor all at once.

http://www.unixcl.com/2010/01/sed-sa...same-file.html
Indeed, but to continue my off-topic (?) tangent: for example, I wanted to try to retrieve some videos for a colleague and it appeared their headers were damaged. I would have loved to be able to open the files and compare them by eye -- i.e. looking not only for the same text but for patterns.
With 32 GB of RAM it seems a bit silly that the biggest file I can load into memory and edit is around 4 GB (or thereabouts).
Again, apologies if this isn't the issue the original post is about, but it seems relevant to me.
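
For what it's worth, one way to eyeball just the headers without pulling whole files into RAM is to hex-dump the first few hundred bytes of each and compare those (the filenames below are only placeholders):
Code:
# dump the first 256 bytes of each file as hex, then compare side by side
xxd -l 256 clip_damaged.mp4 > damaged.hex
xxd -l 256 clip_working.mp4 > working.hex
diff -y damaged.hex working.hex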
 
Old 08-04-2017, 03:44 PM   #7
IsaacKuo
Senior Member
 
Registered: Apr 2004
Location: Baton Rouge, Louisiana, USA
Distribution: Debian 9 Stretch
Posts: 2,351
Blog Entries: 8

Rep: Reputation: 384
Quote:
Originally Posted by aragorn2101 View Post
- you could split the file into several smaller parts. Let's think about it, you're not going to be able to read a 5.5 GB file in one go, so you don't exactly need to open the whole of it. You can use the sed command to extract specific lines or a range of lines from the file and again redirect it in order to constitute a smaller manageable file.
There's a utility that's good at splitting up text files, called "split". Here are two examples, each of which will split a text file into chunks of 10,000 lines (the first uses the long, human-readable flags):

Code:
split --numeric-suffixes --lines=10000 --suffix-length=6 MyBigFile.txt
split -d -l 10000 -a 6 MyBigFile.txt
This will create files x000000, x000001, x000002, x000003, ...

Then, you can use the following command to view the split up files:
Code:
less x??????
You use :n and :p to go to the next (or previous) file.

Don't forget to clean up those files later on!
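
For example, assuming the default "x" prefix and nothing else in the directory matching that pattern:
Code:
# remove the split-off chunks once you're finished with them
rm x??????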
 
1 member found this post helpful.
Old 08-06-2017, 05:28 AM   #8
AwesomeMachine
LQ Guru
 
Registered: Jan 2005
Location: USA and Italy
Distribution: Debian testing/sid; OpenSuSE; Fedora; Mint
Posts: 5,511

Rep: Reputation: 1006
There doesn't seem to be a Linux editor capable of handling gigabyte-sized files with any speed. SlickEdit is a closed-source program that runs on Linux and can edit files up to 2 TB, but it costs USD 300 for a single user.

X2 might also be a possibility. http://www.tangbu.com/x2main.shtml
 
Old 08-06-2017, 06:24 AM   #9
ondoho
LQ Addict
 
Registered: Dec 2013
Posts: 13,171
Blog Entries: 9

Rep: Reputation: 3605
The requirement is viewing, not editing.
With that in mind, maybe a shell is not necessary, and something like xmessage might actually be faster?
Code:
xmessage -file /really/large/file
(there was another x utility for displaying text, without interaction, but the name eludes me right now)
 
  

