This is a rather unimportant question, but I've been wondering if anyone has been able to get the Twang screensaver to run smoothly at a decent framerate. The preview in xscreensaver runs like a dream, since it's about an eighth the size of the fullscreen version. The fullscreen version runs at about 1-5 fps on my machine, even with every setting at its lowest.
Glxgears gives me 3377.2 fps, which isn't the best out there, but not too bad. I know my machine's capable of running some nice games (including UT2K4, AAO, and others) with good framerates.
So, what kind of hardware would a guy need to run this screensaver? I think it's just a matter of how the screensaver is coded. Although I haven't looked at the source (I can't find it), it seems like all it does is capture the screen, split it into a grid, and 'pluck' each grid cell. That shouldn't require massive amounts of video RAM or other resources, afaik. The main calculations are probably trig functions, and the memory needed to store the screenshot shouldn't be enormous.
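To illustrate why the math itself should be cheap: here's a rough sketch (my own guess at the effect, not Twang's actual source) of a damped-sine "pluck" for one grid cell. The function name and constants are made up for illustration; the point is that it's just a couple of trig/exp calls per cell per frame.

```python
import math

def pluck_offset(t, amplitude=20.0, frequency=4.0, damping=2.0):
    """Hypothetical displacement (in pixels) of one plucked grid cell
    at time t seconds: a sine wave whose amplitude decays over time."""
    return amplitude * math.exp(-damping * t) * math.sin(2 * math.pi * frequency * t)

# Sample the first half second -- the oscillation dies away quickly.
offsets = [round(pluck_offset(n / 10), 2) for n in range(6)]
```

Even with hundreds of cells, that's a trivial amount of arithmetic per frame, which supports the idea that the bottleneck is elsewhere.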
I appreciate any input on this. I'm mainly just curious.
Twang does not use GLX. It's entirely software-rendered. I have a 1.8 GHz AMD, and Quake 2 runs at about 20 fps at 1152x864 using software rendering. Twang also uses 32-bit color, isn't built for speed, etc. It would be trivial to reimplement it in GL.
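A quick back-of-the-envelope calculation (using the 1152x864 resolution mentioned above) shows why pure software rendering hurts: the CPU has to touch every pixel every frame.

```python
# Rough cost of pushing one full 32-bit frame entirely in software.
width, height = 1152, 864
bytes_per_pixel = 4                      # 32-bit color
frame_bytes = width * height * bytes_per_pixel
mb_per_frame = frame_bytes / (1024 * 1024)

# Even a modest 30 fps target means moving this much data per second,
# before any of the warping math is done.
mb_per_second = mb_per_frame * 30
```

That's roughly 4 MB per frame, over 100 MB/s at 30 fps, all through the CPU and system memory. A GL version would upload the screenshot once as a texture and let the GPU do the per-pixel work.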
Yeah, that's what I figured. I know nothing about GL right now, but I plan to learn a little, so perhaps when I am proficient enough, I'll reimplement it. Thanks for the input. I just had to confirm my suspicions about it.