Geode chipset detects monitor incorrectly
Hi all!
I have a problem with X detecting monitors incorrectly. On one thin client it works as it should; on another thin client it doesn't. The two thin clients should have exactly the same hardware.

Here is the Xorg log for the failing client: http://pastebin.com/f7bc5bb8
And this is for the working client: http://pastebin.com/f692f14d8

Somewhere around line 500 of the failing log you can see (multiple times): "Driver mode". In the working log you instead see (multiple times): "Default mode". I have no idea what this difference means.

The clients boot from the network and use X 1.4.2 with geode driver 2.10.0. I also tried 2.10.1, but with the same result. The problem is that we are using a lot of different monitor types.

Any help is VERY appreciated.
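In case it helps others diagnose this: one workaround I am considering is pinning the mode explicitly in xorg.conf so the server does not depend on probing the monitor at all. This is only a sketch; the sync ranges and resolution below are placeholders that would need to come from the actual monitor's datasheet, and the identifiers are arbitrary:

```
Section "Monitor"
    Identifier  "Monitor0"
    # Placeholder ranges -- replace with the values from the
    # monitor's specification sheet
    HorizSync   30.0 - 83.0
    VertRefresh 56.0 - 76.0
    Option      "PreferredMode" "1280x1024"
EndSection

Section "Device"
    Identifier  "Device0"
    Driver      "geode"
EndSection

Section "Screen"
    Identifier  "Screen0"
    Device      "Device0"
    Monitor     "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1280x1024"
    EndSubSection
EndSection
```

Of course, since we use many different monitor types, hard-coding one mode is not a real fix, so I'd still like to understand why detection differs between the two clients.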