It seems to be working well for me. Stock kernel, Nvidia GeForce 7150M.
One thing I do notice is that the output of glxinfo has changed; it no longer lists much of the previous information, such as direct rendering. Not sure what's up with that. Code:
bash-3.1$ glxinfo
Code:
5271 frames in 5.000 seconds = 1054.200 FPS |
Quote:
Also tried these test xorg files but downgraded pixman. I thought that helped at first, but a little later the xine menus became unresponsive and I had to ctrl-alt-backspace to get out. I think for the HD3200 radeon people, the files in -current work better (at least for 64-bit). I have not tried installing fglrx (which doesn't compile/install without patching on the 2.6.29 and .30 kernels anyway) over these test files, nor have I tried these on 32-bit. Maybe others can comment. I wonder if these changes are more for Intel graphics chips? |
Ok, test results:
Version: Slackware-current
Box: MSI Wind U100 w/ Intel GMA 950
Kernel: 2.6.29.6-smp
Using intel-2.7.1

Default packages, tested with glxgears:
w/o UXA: 67 FPS, compositing working.
w/ UXA: glxgears not working at all.

With testing packages, intel-2.7.1:
w/o UXA: 67 FPS, compositing working.
w/ UXA: 105-109 FPS, compositing working.

With testing packages, intel-2.8.0: I was able to get about 10 more FPS with UXA; it floated around 115-120. |
Here on my MSI netbook with a 945GME, X worked best in -current around January/February. It got noticeably worse with the major X upgrade sometime in spring. Before those upgrades, glxgears showed around 600 FPS and even Google Earth worked well.
These latest upgrades from Robby now give me the best performance since then. The testing packages were already better than stock -current even without upgrading the intel driver, but it seems best now with intel driver 2.8. Nevertheless, X is still too slow for Google Earth. glxgears is around 110 FPS. |
Slack64-C
AMD 7750 / GeForce 7300 / x86_64-185.18.14
Stock kernel
GoogleEarth works perfectly here. I can't find anything that does not work.
glxgears:
Code:
17203 frames in 5.010 seconds = 3433.732 FPS
17738 frames in 5.000 seconds = 3547.600 FPS
18778 frames in 5.000 seconds = 3755.600 FPS
18865 frames in 5.000 seconds = 3773.000 FPS
17889 frames in 5.014 seconds = 3567.810 FPS
17931 frames in 5.000 seconds = 3586.200 FPS |
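Not part of the original post, but if you're eyeballing a batch of glxgears runs like these, a small awk one-liner can average them. This is just a sketch; it assumes you've saved the output to a hypothetical file named glxgears.log:

```shell
# Average the FPS column from saved glxgears output, e.g. after
# running: glxgears > glxgears.log
# Lines look like: "17203 frames in 5.010 seconds = 3433.732 FPS"
awk '/FPS/ { sum += $(NF-1); n++ } END { if (n) printf "%.1f avg FPS\n", sum / n }' glxgears.log
```

The FPS value is the second-to-last field on each matching line, hence `$(NF-1)`.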
FWIW, glxgears output is worthless as a benchmark number. That's always been the case, but it's even more so now.
|
Oh, and glxinfo output is very different now. The mesa demos (of which glxinfo is one) were reworked completely -- some were dropped, some were added, and some were rewritten -- all of them now use libGLEW, which is why glew had to be added. What used to be "glxinfo" is no longer present -- instead, there's something called "glinfo" which may or may not be intended as the replacement, but at the time I was building all of this, that wasn't important, so I just installed glinfo as glxinfo in the package.
|
So, should glinfo still show direct rendering? It doesn't for me.
|
Sorry about that. Please disregard.
|
I have a refurbished HP desktop with integrated graphics:
Code:
# lspci
Regards,
Bill |
My specs:
Intel 855GM chipset
kernel 2.6.29.6-smp i686
xf86-video-intel-2.5.1-i486-1_rlw

NOTE: I'm using the intel-2.5.1 driver because I ran into problems with intel-2.7.1 and the other intel drivers. I haven't tried intel 2.8.0 yet. Anyway, I installed glew and upgraded everything else except for the intel driver. So far so good; I haven't encountered any problems yet. The Xorg log still has the same warnings and errors. Code:
(WW) The directory "/usr/share/fonts/local" does not exist. |
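That FontPath warning is harmless; X just skips a FontPath entry that points at a directory that isn't there. If you want to silence it, one approach (a sketch, assuming a stock xorg.conf with a "Files" section; paths on your system may differ) is to either create the directory or comment out the offending entry:

```
Section "Files"
    # Either create the directory:
    #   mkdir -p /usr/share/fonts/local
    # or comment out the FontPath entry that triggers the warning:
    # FontPath "/usr/share/fonts/local"
    FontPath "/usr/share/fonts/TTF"
    FontPath "/usr/share/fonts/misc"
EndSection
```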
Intel 2.7.1
Results: I tried this driver and I still get those screen icon artifacts. The artifacts also appear while watching online Flash videos. This driver uses UXA by default, and DRI + DRI2 are loaded by default too. The warnings from 2.7.1 are mostly the same, except that the following was removed: Code:
(WW) intel(0): ESR is 0x00000010, page table error

These two use UXA acceleration by default. DRI + DRI2 are loaded by default.
Results: Same as intel 2.7.1.
NOTE: The xorg warnings/errors for this one are the same as for 2.7.1 but do not contain the warning about "[DRI2] Version 1 API (broken front buffer rendering)."

Intel 2.6.3
DRI + DRI2 loaded by default.
Results:
UXA (enabled) - full screen artifact scramble
UXA (disabled) - full screen artifact scramble
NOTE: The xorg log is the same except that it says, "DRI2 requires UXA"

Intel 2.8.0
Results: Same as 2.7.1.
NOTE: The xorg warnings/errors for 2.8.0 are the same as for 2.7.1 but without the "[DRI2] Version 1 API (broken front buffer rendering)."

So far Intel 2.5.1 has been running stably with no visual artifacts or performance issues on XFCE. I'll stick with this one for now and wait for further updates. |
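For anyone who wants to toggle UXA on and off while testing like this, the intel driver takes an AccelMethod option in the Device section of xorg.conf. A sketch (your Identifier and BusID will differ; this is not taken from the poster's config):

```
Section "Device"
    Identifier "Intel GMA"
    Driver     "intel"
    # "UXA" or "EXA" -- note the 2.6.3 log above says DRI2 requires UXA
    Option     "AccelMethod" "UXA"
EndSection
```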
Quote:
Code:
#http://wiki.archlinux.org/index.php/Xorg |
Tried at work with an ATI 3450HD (radeon) and at home with an nvidia 8600 GT SLI (nvidia 190.18): all OK :cool:
|