LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   Very poor pdf reading performance (https://www.linuxquestions.org/questions/linux-software-2/very-poor-pdf-reading-performance-726715/)

synss 05-18-2009 03:39 AM

Very poor pdf reading performance
 
I make presentations using LaTeX Beamer. Thanks to xrandr, I should be able to show them on the overhead projector using my Linux laptop. However, pages with lots of small objects take a very long time to load. The figures are exported to EPS from CorelDraw on Windows (I have no choice regarding CorelDraw), then converted to PDF and included in the presentation within tikzpicture or includegraphics. I have tried different PDF readers, and with all of them the CPU jumps to 100% and the program takes too long to display the page, if it displays it at all. I tried the same presentation on OS X and Windows XP and there is no problem there. I even tried Acrobat Reader on Linux, but it does not perform better than, e.g., xpdf and epdfview.

Is there a reason/solution? Apart from re-exporting to png with a transparent background?
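
For reference, since pdflatex cannot read EPS directly, the EPS-to-PDF step could look something like this (epstopdf ships with TeX Live; the file name is illustrative):

Code:

# Convert a CorelDraw EPS export to PDF for pdflatex/includegraphics
epstopdf figure.eps        # writes figure.pdf
# or call Ghostscript directly, cropping to the EPS bounding box
gs -q -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dEPSCrop \
   -sOutputFile=figure.pdf figure.eps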

i92guboj 05-18-2009 05:05 AM

I recommend trying Okular.

I don't use KDE, but it's well worth installing kdelibs just for this application. Performance-wise, it's far, far ahead of epdfview, in my experience anyway.

H_TeXMeX_H 05-18-2009 05:31 AM

One solution might be to convert it to DjVu. I usually have to do that for large PDFs with lots of images, because otherwise they are just too slow. I believe I posted here before on how to do that using djvudigital; that's the only way I know of that works in most cases.
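
A minimal sketch of that conversion (djvudigital is part of DjVuLibre and needs the GSDjVu Ghostscript driver; the file names and resolution are only examples):

Code:

# Convert a slow PDF to DjVu
djvudigital --dpi=300 slides.pdf slides.djvu
# view it with the DjVu viewer
djview slides.djvu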

jschiwal 05-18-2009 05:45 AM

Could you explain this process more:
"then converted to pdf and included into the presentation within tikzpicture or includegraphics"
Includegraphics I understand, but I haven't heard of tikzpicture. If you are starting with eps graphics, why convert them to pdfs?

Could the problem be that the objects are large and scaling them down (in includegraphics) is taking time? You might try scaling them with "convert" and then including the smaller versions, as in the sketch below.
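
For example, something along these lines with ImageMagick (the density and target size are guesses to tune per figure):

Code:

# Pre-scale a converted figure so LaTeX can include it at 1:1
convert -density 150 figure.pdf -resize 800x600 figure-small.png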

H_TeXMeX_H 05-18-2009 10:18 AM

I've also noticed that performance depends a lot on the software used to generate the PDF or EPS. If you can export to DVI, that would be better; then use 'dvipdf' to convert to PDF. Or just use different software to export it.
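
That is, roughly (assuming a plain latex run; file names are illustrative):

Code:

# Compile to DVI, then convert with Ghostscript's dvipdf wrapper
latex presentation.tex        # produces presentation.dvi
dvipdf presentation.dvi presentation.pdf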

synss 05-19-2009 01:07 AM

Thank you all, I will try Okular and report back. I already run the images through "convert", and they are at 1:1 scale in the presentation. The Beamer class only generates PDF as far as I know, and pdflatex does not know about EPS. TikZ is from the same author, and I use it to create small animations or as a replacement for overpic. I am a physical chemist. I generate the presentations with Xe/pdfLaTeX.

DjVu might be a solution; I will look for your post and see whether it suits me.

synss 05-19-2009 02:05 AM

Okular does not cut it either. I was desperate enough to try SumatraPDF in Wine (at least it is open source), and it kind of works better than anything native :'(
But when it needs a bit of time to render a page, it switches to a white background asking to "please wait". I guess I will keep trying the Windows side of things. How unfortunate!

Okular is faster than the other Linux viewers, though. But not quite fast enough for a presentation.

EDIT: And Foxit Reader, although not free, is the best native performer on Linux, BUT there are rendering glitches.

i92guboj 05-19-2009 02:22 AM

How big is the pdf file?

Maybe the bottleneck is not in the rendering (at least not entirely). Do you see much disk activity while it loads?

synss 05-19-2009 02:33 PM

Quote:

Originally Posted by i92guboj (Post 3545477)
How big is the pdf file?

It is 8.2 MB. I did not try putting the file in tmpfs, but I do not think this would help; it is not that big. My laptop has a very slow HDD but over 1 GB of RAM, most of which is unused.

I tried acroread again, and it performs decently. It needs some time to load, but rendering is usably fast and without glitches.

i92guboj 05-19-2009 03:19 PM

A PDF of that size should render OK, but it all depends on the contents, I guess.

H_TeXMeX_H 05-19-2009 03:40 PM

For example, I have a PDF that I got from a teacher. I optimized it with pdfopt, and it opens in 22.684 s of real time; the original opens in 22.724 s. Converted to DjVu, it opens in 0.740 s with the same quality. Not to mention that the original PDF took about 22 seconds to display EACH PAGE! Insane! This is probably the worst PDF ever made, and it's only 9207 kB, or about 9 MB. The DjVu version scrolls through pages instantly (basically less than 1 s to display any one page), it has a nice fullscreen mode for presentations and such, and it is only 6 MB.
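
If anyone wants to reproduce that kind of comparison, the commands are roughly these (pdfopt ships with Ghostscript, djvudigital with DjVuLibre; file names are examples):

Code:

# Linearize ("optimize") the PDF
pdfopt original.pdf optimized.pdf
# Convert the same file to DjVu
djvudigital original.pdf converted.djvu
# compare the file sizes
ls -lh original.pdf optimized.pdf converted.djvu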

osor 05-19-2009 04:01 PM

PostScript is a wonderful language, but like any language, it can be abused (e.g., you can write infinite loops in PostScript). Improperly or inefficiently written EPS files may make an interpreter that follows all instructions to the letter slow, while one that has workarounds for various known quirks stays fast.

I am not aware whether DjVu has the hyperlinking capability that Beamer presentations use. You might be better off rasterizing whatever files you have before including them in your presentation (you move the "time hog" to creating the images rather than rendering them during the presentation).
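
One way to do that rasterization up front with ImageMagick (the density is a guess; whether the transparent background survives depends on the input):

Code:

# Rasterize a vector figure once, at presentation resolution
convert -density 200 -background none figure.pdf figure.png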

synss 05-20-2009 01:10 AM

Quote:

Originally Posted by osor (Post 3546221)
PostScript is a wonderful language, but like any language, it can be abused (e.g., you can write infinite loops in PostScript). Improperly or inefficiently written EPS files may make an interpreter that follows all instructions to the letter slow, while one that has workarounds for various known quirks stays fast.

Yes, I think CorelDraw does a fairly poor job of exporting to EPS, and exporting to PDF does not work too well either. I was thinking of exporting to PNG with a transparent background; I guess it would render faster. Thank you.

synss 05-26-2009 02:28 PM

Thank you all, I ended up rasterizing the whole presentation, which may actually be a good idea. I'll post my script here; it uses convert from the ImageMagick suite. The result is another PDF file containing a rasterized version of every slide in the original file. I use PNG as the intermediate format. Change the first convert call (the %02d pattern) if there are more than 99 slides.

Code:

#!/bin/sh
# Rasterize every page of a PDF into PNGs and reassemble them as a new PDF.

[ $# -ne 1 ] && exit 1        # exactly one argument: the input PDF
[ ! -e "$1" ] && exit 2       # which must exist

INFILE="$1"
OUTFILE="${INFILE%%.*}-raster.${INFILE#*.}"
OLDFILE="${INFILE%%.*}-raster-old.${INFILE#*.}"


echo "Rasterize $INFILE, results in $OUTFILE."

CONVERT=$(which convert)

# Start with an empty Slides/ directory for the intermediate PNGs
if [ -d Slides ]; then
        rm -f Slides/*
else
        mkdir Slides
fi

# Keep a backup of any previous output
if [ -e "$OUTFILE" ]; then
        mv "$OUTFILE" "$OLDFILE"
        echo "Previous version backed up ($OLDFILE)"
fi

# Render each page at 200 dpi, trim the margins, and force an 800x600
# size (the '!' must follow the geometry and be quoted from the shell).
# %02d allows up to 99 slides; widen it for longer presentations.
$CONVERT -density 200x200 "$INFILE" -trim +repage \
        -resize '800x600!' -quality 100 Slides/raster%02d.png
# Join the page images back into a single PDF
$CONVERT -adjoin Slides/raster*.png "$OUTFILE"

# Slides is a directory, so plain 'rm -f' would not remove it
rm -rf Slides

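Usage would be along these lines, assuming the script is saved as rasterize.sh:

Code:

chmod +x rasterize.sh
./rasterize.sh talk.pdf        # writes talk-raster.pdf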

i92guboj 05-26-2009 03:02 PM

Thanks for the feedback. I am sure that this will come in handy at some point. :)

*bookmarking*
