I have downloaded a large PDF file, about 85 MB, which takes a while to load initially; that's fine, but whenever I go to a new page the CPU shoots up to 100% and the page takes a few seconds to render. This makes viewing the file a chore, since I need to skip around a lot.
I have tried a few different viewers, but they all behave the same way. Why do they take so long to load each page, and is there anything I can do to speed it up?
If a PDF is that large, it either contains a truly humongous amount of text, or it's composed of scanned images of the pages rather than text. I'm going to guess the latter, since each page image has to be decompressed and rendered before it can be displayed.
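You can check which case it is without guessing. A quick sketch, assuming you have the poppler-utils package installed (it provides pdffonts and pdfimages; "book.pdf" below is just a placeholder for your file):

    # If pdffonts lists no embedded fonts, the pages contain no real text
    # and are almost certainly scanned images.
    pdffonts book.pdf

    # List the embedded images page by page; one large full-page image
    # per page confirms the document is a scan.
    pdfimages -list book.pdf

If pdfimages shows big high-resolution images on every page, that's where your render time is going.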
The viewer also has to extract and decompress each image from the PDF container before it can render it, which likely adds to the overhead. I've never found PDF to be a particularly efficient storage format for scanned material anyway.
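If the images turn out to be the bottleneck, one thing that often helps is rewriting the file with downsampled images so each page renders faster. A minimal sketch, assuming Ghostscript is installed (the filenames are placeholders, and the quality preset is something you'd tune to taste):

    # Rewrite the PDF with downsampled, recompressed images.
    # /ebook targets roughly 150 dpi; /screen (~72 dpi) gives an even
    # smaller, faster file at the cost of image quality.
    gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
       -dPDFSETTINGS=/ebook \
       -dNOPAUSE -dBATCH -dQUIET \
       -sOutputFile=book-small.pdf book.pdf

You lose some image detail, but for skipping around a scanned book the smaller file is usually much snappier.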
PDFs do vary enormously in quality, depending on how the text was scanned and which creation tool was used. I've had a document of fewer than a dozen pages that was incredibly slow. If you look at the scanned books on archive.org, you'll also see cases where two versions of the same text exist, one 50% larger than the other.