Does the open source Radeon driver have higher-precision image quality than the closed source fglrx?
Couldn't help but notice a glaring difference in image quality between fglrx and the open source Radeon driver in the article here: http://www.phoronix.com/scan.php?pag...tem&px=MTA1NzI.
It looks like the open source driver renders at higher precision, with better image quality as a result. The fglrx output looks pale and washed out in comparison, similar to what you get with too high a gamma setting.
You can tell in the picture with the purple guy: the blue along the top of the wall in the background looks whitish and pale with the fglrx driver, while it's noticeably bluer and fuller with the open source driver (not to mention the purple guy doesn't look overly contrasted). The X1800 XL images look the best overall, in my opinion.
These pale-looking differences are apparent in all of the images, and I'm even more surprised to see them on my laptop's LCD, since I wasn't using one of my CRTs (CRTs have fuller color reproduction).
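To make the gamma comparison concrete (a rough sketch of my own, not anything measured from the article; the function name and sample color are made up): a driver or display applying too high a gamma correction raises each channel to the power 1/gamma, which lifts mid-tones toward white and compresses color differences, producing exactly that pale look.

# Rough illustration (hypothetical values): why "too high gamma" reads
# as pale and washed out. Channels are normalized to [0, 1]; gamma > 1
# lifts mid-tones toward white.
def apply_gamma(rgb, gamma):
    """Apply a power-law gamma correction to a normalized RGB triple."""
    return tuple(round(c ** (1.0 / gamma), 3) for c in rgb)

wall_blue = (0.25, 0.35, 0.80)        # a saturated blue (hypothetical)
print(apply_gamma(wall_blue, 1.0))    # (0.25, 0.35, 0.8)   - unchanged
print(apply_gamma(wall_blue, 2.2))    # (0.533, 0.621, 0.904) - pale

Notice how all three channels get pushed toward 1.0, so the blue loses saturation rather than just getting brighter.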
So has the open source driver always been slower because it uses higher precision and takes fewer rendering shortcuts? I'm even more curious about older cards that already match fglrx performance (like R300 and maybe slightly newer generations): do they use higher precision while being just as fast as fglrx?
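For what "higher precision" can mean concretely (again just a hand-wavy sketch, not taken from either driver's actual pipeline): storing or blending colors with fewer bits per channel snaps every shade onto a coarser grid, and that kind of rounding shortcut is one classic way to trade image quality for speed.

# Hypothetical illustration: quantizing a color channel to fewer bits.
def quantize(value, bits):
    """Round a normalized channel value onto a bits-wide integer grid."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

c = 0.37                  # hypothetical channel value
print(quantize(c, 8))     # 8 bits/channel: 0.3686...  (close to 0.37)
print(quantize(c, 5))     # 5 bits (e.g. R5G6B5): 0.3548... (visible banding)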
Open source Radeon support goes through this cycle:
Newest cards - not supported properly yet.
Newish but older cards - run well.
Old & crap cards - YMMV, but it's all you get.