Linux - Hardware: This forum is for Hardware issues.
Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?
I've been going at it for about four hours now to no avail. I have 2 LCD monitors I want to hook up and span my desktop across. One is a 19" Princeton at 1600x1200, and the other is a 15" Dell that I want to stay at 1024x768.
I'm using a single GeForce 6800 with 2 x DVI, on the 8178 driver.
I've read up on many different ways of doing this, but the problem is that people's xorg.conf examples are all too tailored to their own systems to be of any real explanatory value as to what exactly I need to do.
And here is an xorg.conf that works on some dude's laptop (found on a website), just so you have an idea of what 'working' should look like in at least one example out there:
My question is: what do I need to do to get mine working?
With everything I've tried (including the above), both monitors lose sync completely until I kill X; then they're back to the console, mirroring each other. ALL xorg.confs have worked when the #2 monitor is unplugged from the video card. But when it's plugged in (even when the #2 monitor has no power to it at all), EVERY xorg.conf results in both monitors losing sync while I hear X loading up normally behind the blank screen.
I would like it to just span the desktop across to the smaller monitor so I can put GAIM and/or my terminals over there, out of the way of my main screen.
I would plug in my other video card and avoid some of the complication; the problem is, the motherboard only has PCI-E.
Once this is working, then try to enable the fancy effects; the options go where I have them in the Section "Device".
Edit: Glad the other guy posted. I left out that this file is for two separate screens you move between. To use desktop spanning you need identical resolutions, and you must remove the "#" from # Option "Xinerama" "on", then put a "#" in front of Load "xtrap" in the Modules section. He may have a point with TwinView; I have never used it, but I am still thinking you are going to need the resolutions the same for both screens.
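To make those two edits concrete, here is a minimal sketch; the surrounding section contents are placeholders, not taken from the actual posted file:

```
Section "ServerFlags"
    Option "Xinerama" "on"   # the "#" removed, as suggested above
EndSection

Section "Module"
#   Load "xtrap"             # the "#" added, as suggested above
EndSection
```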
i've got a few ideas (also using AMD64, nVidia dual-head, Xorg)...
first of all, can you post /var/log/Xorg.log (it might be called Xorg.0.log) - all of it if you're not sure what it means, or you can cut out all the resource ranges and other irrelevant stuff.
next, can you get EITHER monitor to work on its own? if you've got the 19" working alone, and the 15" working alone, you're a bit closer to getting them to work together.
i found the readme section in the nvidia driver source surprisingly helpful - full of examples and explanations. i found it here: /usr/share/doc/nvidia-glx-1.0.6629-r6/README.gz
now about your xorg.conf -
- your monitor sections look right to me, but
- you only need one device (because you only have one video card)
- and you only need one screen in your own ServerLayout (because nvidia's "TwinView" option can handle the dual-head thing on its own, so it doesn't need Xinerama).
so those bits of my xorg.conf look like this:
Code:
Section "Device"
    Identifier "gigabyte GV-N52128DE rev1.1 nVidia GeForce FX 5200"
    Driver     "nvidia"
#   Driver     "nv"
    Option "NvAGP"                 "2"
    Option "HWCursor"              "true"
    Option "RenderAccel"           "true"
    Option "TwinView"              "true"
    Option "TwinViewOrientation"   "CRT-0 LeftOf CRT-1"
    Option "HorizSync"             "CRT-0: 30-85; CRT-1: 30-85"
    Option "VertRefresh"           "CRT-0: 56-160; CRT-1: 60-120"
    Option "MetaModes"             "1280x1024, 1280x1024"
    Option "AllowGLXWithComposite" "true"
EndSection

Section "ServerLayout"
    Identifier "Simple Layout"
    ## nVidia just wants one screen specified, with the second one described in the device section
    Screen      "Screen 2"
    InputDevice "Mouse1"    "CorePointer"
    InputDevice "Keyboard1" "CoreKeyboard"
EndSection
So for the nvidia driver, a lot of the stuff which used to be in other sections of xorg.conf is now tucked into the "Device" section, which means the card handles the screen resolutions and the dual-head configuration, while the X server just thinks you've got one very wide screen. Get your head around that, and everything else will make more sense.
each monitor on its own, yes, i can get working easily.
i noticed also that i used the wrong BusID. I'm going to try tweaking that from the original, then go through each of these suggestions and report back ;-p
Edit: I typically don't need all the extra fluff in xorg.conf, but I added it all in to be complete and thorough when trying out other examples I've found online. For some reason, stuff like transparency just works for me without any tweaking, whereas some of my friends need to specify it...
(II) Initializing built-in extension MIT-SHM
(II) Initializing built-in extension XInputExtension
(II) Initializing built-in extension XTEST
(II) Initializing built-in extension XKEYBOARD
(II) Initializing built-in extension LBX
(II) Initializing built-in extension XC-APPGROUP
(II) Initializing built-in extension SECURITY
(II) Initializing built-in extension XINERAMA
(II) Initializing built-in extension XFIXES
(II) Initializing built-in extension XFree86-Bigfont
(II) Initializing built-in extension RENDER
(II) Initializing built-in extension RANDR
(II) Initializing built-in extension COMPOSITE
(II) Initializing built-in extension DAMAGE
(II) Initializing built-in extension XEVIE
(II) Initializing extension GLX
(**) Option "Protocol" "IMPS/2"
(**) Mouse0: Device: "/dev/input/mice"
(**) Mouse0: Protocol: "IMPS/2"
(**) Option "CorePointer"
(**) Mouse0: Core Pointer
(**) Option "Device" "/dev/input/mice"
(**) Option "Emulate3Buttons" "yes"
(**) Mouse0: Emulate3Buttons, Emulate3Timeout: 50
(**) Option "ZAxisMapping" "4 5"
(**) Mouse0: ZAxisMapping: buttons 4 and 5
(**) Mouse0: Buttons: 5
(**) Option "CoreKeyboard"
(**) Keyboard0: Core Keyboard
(**) Option "Protocol" "standard"
(**) Keyboard0: Protocol: standard
(**) Option "AutoRepeat" "500 30"
(**) Option "XkbRules" "xorg"
(**) Keyboard0: XkbRules: "xorg"
(**) Option "XkbModel" "pc105"
(**) Keyboard0: XkbModel: "pc105"
(**) Option "XkbLayout" "us"
(**) Keyboard0: XkbLayout: "us"
(**) Option "CustomKeycodes" "off"
(**) Keyboard0: CustomKeycodes disabled
(II) XINPUT: Adding extended input device "Keyboard0" (type: KEYBOARD)
(II) XINPUT: Adding extended input device "Mouse0" (type: MOUSE)
(II) XINPUT: Adding extended input device "NVIDIA Event Handler" (type: Other)
(II) Mouse0: ps2EnableDataReporting: succeeded
(II) NVIDIA(0): Setting mode "1280x1024"
AUDIT: Tue Feb 28 02:46:20 2006: 14188 X: client 19 rejected from local host
Quote:
HappyTux
I tried that config, with and without the changes you specified afterward; still the same result.
I also tried turning on Option "ConnectedMonitor" "DFP-0".
Still nothing. I'm going to mix and match some TwinView options...
I figured it should work, but it didn't.
I'm thinking I'm getting no results because monitor #2 is hooked up with an analog cable with a DVI adapter at the video-card end. I'm going to look around here for another DVI cable and probably prove this to be a simple mistake; then again, maybe not... we'll see.
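As an aside, the nvidia driver's README documents "ConnectedMonitor" as taking a comma-separated list with one entry per display device, so forcing the driver to see both panels might look like the sketch below; the Identifier and MetaModes values here are assumptions, not taken from this thread's config:

```
Section "Device"
    Identifier "Videocard0"               # assumed name
    Driver     "nvidia"
    # Tell the driver both DVI panels are present, even if
    # detection of the second one fails:
    Option "ConnectedMonitor" "DFP-0, DFP-1"
    Option "TwinView"         "true"
    Option "MetaModes"        "1600x1200, 1024x768"
EndSection
```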
yep, HappyTux and i have offered the two possible paths.
those xorg logs suggest you don't have twinview turned on. where you have this:
Code:
(II) Setting vga for screen 0.
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(--) NVIDIA(0): Linear framebuffer at 0xD0000000
(--) NVIDIA(0): MMIO registers at 0xFA000000
(II) NVIDIA(0): NVIDIA GPU detected as: GeForce 6800 XT
(--) NVIDIA(0): VideoBIOS: 05.41.02.48.05
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): VideoRAM: 262144 kBytes
(II) NVIDIA(0): Connected display device(s): DFP-1
(--) NVIDIA(0): DFP-1: maximum pixel clock: 155 MHz
(WW) NVIDIA(0): Failure reading FlatPanel flags for DFP-1.
(II) NVIDIA(0): Frequency information for DFP-1:
(II) NVIDIA(0): HorizSync : 31.000-83.000 kHz
(II) NVIDIA(0): VertRefresh : 56.000-75.000 Hz
(II) NVIDIA(0): (HorizSync from EDID)
(II) NVIDIA(0): (VertRefresh from EDID)
(II) NVIDIA(0): Adding EDID-provided mode "1280x1024" for DFP-1.
(II) NVIDIA(0): Monitor0: Using hsync range of 31.00-83.00 kHz
(II) NVIDIA(0): Monitor0: Using vrefresh range of 56.00-75.00 Hz
(II) NVIDIA(0): Clock range: 12.00 to 155.00 MHz
i have this
Code:
(II) Setting vga for screen 0.
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Option "HWcursor" "true"
(**) NVIDIA(0): Option "NvAGP" "2"
(**) NVIDIA(0): Option "RenderAccel" "true"
(**) NVIDIA(0): Option "TwinView" "true"
(**) NVIDIA(0): Option "TwinViewOrientation" "CRT-0 LeftOf CRT-1"
(**) NVIDIA(0): Option "MetaModes" "1280x1024, 1280x1024"
(**) NVIDIA(0): Option "HorizSync" "CRT-0: 30-85; CRT-1: 30-85"
(**) NVIDIA(0): Option "VertRefresh" "CRT-0: 56-160; CRT-1: 60-120"
(**) NVIDIA(0): Option "AllowGLXWithComposite" "true"
(**) NVIDIA(0): Enabling experimental RENDER acceleration
(**) NVIDIA(0): Use of AGPGART requested
(**) NVIDIA(0): TwinView enabled
(--) NVIDIA(0): Linear framebuffer at 0xE8000000
(--) NVIDIA(0): MMIO registers at 0xF2000000
(II) NVIDIA(0): NVIDIA GPU detected as: GeForce FX 5200
(--) NVIDIA(0): VideoBIOS: 04.34.20.56.00
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): VideoRAM: 131072 kBytes
(II) NVIDIA(0): Connected display device(s): CRT-0, CRT-1
(--) NVIDIA(0): Display device CRT-0: maximum pixel clock at 8 bpp: 400 MHz
(--) NVIDIA(0): Display device CRT-0: maximum pixel clock at 16 bpp: 400 MHz
(--) NVIDIA(0): Display device CRT-0: maximum pixel clock at 32 bpp: 400 MHz
(--) NVIDIA(0): Display device CRT-1: maximum pixel clock at 8 bpp: 400 MHz
(--) NVIDIA(0): Display device CRT-1: maximum pixel clock at 16 bpp: 400 MHz
(--) NVIDIA(0): Display device CRT-1: maximum pixel clock at 32 bpp: 400 MHz
(II) Loading sub module "ddc"
(II) LoadModule: "ddc"
(II) Loading /usr/lib64/modules/libddc.a
(II) Module ddc: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 1.0.0
ABI class: X.Org Video Driver, version 0.7
(WW) NVIDIA(0): Failure reading EDID parameters for display device CRT-0
(WW) NVIDIA(0): Failure reading EDID parameters for display device CRT-1
(II) NVIDIA(0): Viewsonic VA702b: Using hsync range of 30.00-85.00 kHz
(II) NVIDIA(0): Viewsonic VA702b: Using vrefresh range of 56.00-160.00 Hz
(II) NVIDIA(0): Clock range: 12.00 to 400.00 MHz
mine works perfectly, even though one of the "CRT"s is really a DFP, and despite those EDID warnings.
the main difference, though, is that i've got NVIDIA(0) with CRT-0 and CRT-1, while your NVIDIA(0) only sees DFP-1. the second monitor isn't discovered anywhere in that log.
if you do it the old way (happytux's way) you'll have NVIDIA(0) and NVIDIA(1) with one monitor each, and xinerama enabled.
my way, you will have NVIDIA(0) with two monitors attached, twinview on and xinerama off.
the nvidia docs i pointed you to last post have a lot of stuff about your options for running both screens at different resolutions. when i had different-sized screens i ran them both at the same resolution, which meant i had really small fonts and stuff on the second monitor and my mouse jumped when it crossed between them, but i didn't have the dead area below the second monitor. that's for later though...
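For reference, mixed-resolution TwinView is normally expressed through MetaModes; a minimal sketch for a 1600x1200 / 1024x768 pair like the one in this thread (the orientation value is an assumption):

```
# Each comma-separated pair is one MetaMode; the driver sizes the
# virtual screen to the bounding box of the active pair.
Option "TwinView"            "true"
Option "TwinViewOrientation" "RightOf"
Option "MetaModes"           "1600x1200, 1024x768; 1600x1200, NULL"
```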
any more luck?
next time, just post the bit of Xorg.0.log before the PCI scan (about 50 lines), and the bit starting at "Setting vga for screen 0" up to just before all the lines like "(II) NVIDIA(0): Not using default mode "1280x960" (hsync out of range)" (about 50 lines again). most of the answers are there...
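If it helps, that trimming can be scripted with a sed address range; a sketch, using a stand-in file so the example is self-contained (the real log is usually /var/log/Xorg.0.log):

```shell
# Create a tiny stand-in for /var/log/Xorg.0.log:
printf '%s\n' \
  '(II) earlier probe noise' \
  '(II) Setting vga for screen 0.' \
  '(II) NVIDIA(0): Connected display device(s): DFP-0, DFP-1' \
  '(II) NVIDIA(0): Not using default mode "640x350" (vrefresh out of range)' \
  '(II) later noise' > /tmp/xorg.sample.log

# Print everything from "Setting vga" through the first
# "Not using default mode" line -- the range asked for above.
sed -n '/Setting vga for screen 0/,/Not using default mode/p' /tmp/xorg.sample.log
```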
[EDIT]
i've got an analog cable with a DVI adapter as well (i was going to ask you where you got that 15" DVI monitor), it works for me.
and i spent a while mucking around with the "TwinViewOrientation" thing, and i ended up deleting "SecondMonitorHorizSync" and "SecondMonitorVertSync" in favour of Option "HorizSync" "CRT-0: 30-85; CRT-1: 30-85" and Option "VertRefresh" "CRT-0: 56-160; CRT-1: 60-120"
Last edited by andrewlorien; 03-01-2006 at 02:02 AM.
i'm now using 2 x DVI cables, and my last posted xorg.conf actually starts X with one display working, with both plugged in. The smaller secondary display is the one showing at the moment... but the main one is the one I configured.. lol
Code:
(==) Log file: "/var/log/Xorg.0.log", Time: Tue Feb 28 23:56:12 2006
(==) Using config file: "/root/xorg.conf"
(==) ServerLayout "single head configuration"
(**) |-->Screen "Screen0" (0)
(**) | |-->Monitor "Monitor0"
(**) | |-->Device "Videocard0"
(**) |-->Input Device "Mouse0"
(**) |-->Input Device "Keyboard0"
(**) FontPath set to "unix/:7100"
(**) RgbPath set to "/usr/X11R6/lib/X11/rgb"
(==) ModulePath set to "/usr/X11R6/lib/modules"
(WW) Open APM failed (/dev/apm_bios) (No such file or directory)
(II) Module ABI versions:
X.Org ANSI C Emulation: 0.2
X.Org Video Driver: 0.7
X.Org XInput driver : 0.4
X.Org Server Extension : 0.2
X.Org Font Renderer : 0.4
(II) Loader running on linux
(II) LoadModule: "bitmap"
(II) Loading /usr/X11R6/lib/modules/fonts/libbitmap.a
(II) Module bitmap: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 1.0.0
Module class: X.Org Font Renderer
(II) Loading font Bitmap
(II) LoadModule: "pcidata"
(II) Loading /usr/X11R6/lib/modules/libpcidata.a
(II) Module pcidata: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 1.0.0
ABI class: X.Org Video Driver, version 0.7
(--) using VT number 7
...
(II) Setting vga for screen 0.
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(--) NVIDIA(0): Linear framebuffer at 0xD0000000
(--) NVIDIA(0): MMIO registers at 0xFA000000
(II) NVIDIA(0): NVIDIA GPU detected as: GeForce 6800 XT
(--) NVIDIA(0): VideoBIOS: 05.41.02.48.05
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): VideoRAM: 262144 kBytes
(II) NVIDIA(0): Connected display device(s): DFP-0, DFP-1
(WW) NVIDIA(0): Multiple displays connected, but only one display allowed;
(WW) NVIDIA(0): using first display
(--) NVIDIA(0): DFP-0: maximum pixel clock: 155 MHz
(--) NVIDIA(0): DFP-0: Internal Single Link TMDS
(II) NVIDIA(0): Frequency information for DFP-0:
(II) NVIDIA(0): HorizSync : 30.000-61.000 kHz
(II) NVIDIA(0): VertRefresh : 56.000-76.000 Hz
(II) NVIDIA(0): (HorizSync from EDID)
(II) NVIDIA(0): (VertRefresh from EDID)
(II) NVIDIA(0): Adding EDID-provided mode "1024x768" for DFP-0.
(II) NVIDIA(0): Monitor0: Using hsync range of 30.00-61.00 kHz
(II) NVIDIA(0): Monitor0: Using vrefresh range of 56.00-76.00 Hz
(II) NVIDIA(0): Clock range: 12.00 to 155.00 MHz
....
(II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
(--) NVIDIA(0): DPI set to (86, 84); computed from "UseEdidDpi" X config option
(II) Loading sub module "fb"
(II) LoadModule: "fb"
(II) Loading /usr/X11R6/lib/modules/libfb.a
(II) Module fb: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 1.0.0
ABI class: X.Org ANSI C Emulation, version 0.2
(II) Loading sub module "ramdac"
(II) LoadModule: "ramdac"
(II) Loading /usr/X11R6/lib/modules/libramdac.a
(II) Module ramdac: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 0.1.0
still testing to get main screen working.
>>> andrewlorien
what do you suggest I do to this file here? With this, the secondary monitor comes on with the correct resolution; the main monitor flickers and then goes out of sync.
... I'm throwing things out hoping someone will jump in with a possible solution, because I'm mainly taking shots in the dark since there are so many possible variations. Making progress though.
(II) NVIDIA X Driver 1.0-8178 Wed Dec 14 16:25:22 PST 2005
(II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
(II) Primary Device is: PCI 05:00:0
(--) Assigning device section with no busID to primary device
(--) Chipset NVIDIA GPU found
(II) resource ranges after xf86ClaimFixedResources() call:
..........
(II) Setting vga for screen 0.
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(--) NVIDIA(0): Linear framebuffer at 0xD0000000
(--) NVIDIA(0): MMIO registers at 0xFA000000
(II) NVIDIA(0): NVIDIA GPU detected as: GeForce 6800 XT
(--) NVIDIA(0): VideoBIOS: 05.41.02.48.05
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): VideoRAM: 262144 kBytes
(II) NVIDIA(0): Connected display device(s): DFP-0, DFP-1
(WW) NVIDIA(0): Multiple displays connected, but only one display allowed;
(WW) NVIDIA(0): using first display
(--) NVIDIA(0): DFP-0: maximum pixel clock: 155 MHz
(--) NVIDIA(0): DFP-0: Internal Single Link TMDS
(II) NVIDIA(0): Frequency information for DFP-0:
(II) NVIDIA(0): HorizSync : 30.000-61.000 kHz
(II) NVIDIA(0): VertRefresh : 56.000-76.000 Hz
(II) NVIDIA(0): (HorizSync from EDID)
(II) NVIDIA(0): (VertRefresh from EDID)
(II) NVIDIA(0): Adding EDID-provided mode "1024x768" for DFP-0.
(II) NVIDIA(0): Monitor0: Using hsync range of 30.00-61.00 kHz
(II) NVIDIA(0): Monitor0: Using vrefresh range of 56.00-76.00 Hz
(II) NVIDIA(0): Clock range: 12.00 to 155.00 MHz
(II) NVIDIA(0): Not using default mode "640x350" (vrefresh out of range)
...........
(II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
(--) NVIDIA(0): DPI set to (86, 84); computed from "UseEdidDpi" X config option
(II) Loading sub module "fb"
(II) LoadModule: "fb"
(II) Loading /usr/X11R6/lib/modules/libfb.a
(II) Module fb: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 1.0.0
ABI class: X.Org ANSI C Emulation, version 0.2
(II) Loading sub module "ramdac"
(II) LoadModule: "ramdac"
(II) Loading /usr/X11R6/lib/modules/libramdac.a
(II) Module ramdac: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 0.1.0
ABI class: X.Org Video Driver, version 0.7
(--) Depth 24 pixmap format is 32 bpp
(II) do I need RAC? No, I don't.
(II) resource ranges after preInit:
........
(II) NVIDIA(0): Setting mode "1024x768"
(II) Loading extension NV-GLX
(II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
(II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
(==) NVIDIA(0): Backing store disabled
(==) NVIDIA(0): Silken mouse enabled
(**) Option "dpms"
(**) NVIDIA(0): DPMS enabled
(II) Loading extension NV-CONTROL
(==) RandR enabled
(II) Initializing built-in extension MIT-SHM
(II) Initializing built-in extension XInputExtension
(II) Initializing built-in extension XTEST
(II) Initializing built-in extension XKEYBOARD
(II) Initializing built-in extension LBX
(II) Initializing built-in extension XC-APPGROUP
(II) Initializing built-in extension SECURITY
(II) Initializing built-in extension XINERAMA
(II) Initializing built-in extension XFIXES
(II) Initializing built-in extension XFree86-Bigfont
(II) Initializing built-in extension RENDER
(II) Initializing built-in extension RANDR
(II) Initializing built-in extension COMPOSITE
(II) Initializing built-in extension DAMAGE
(II) Initializing built-in extension XEVIE
(II) Initializing extension GLX
AUDIT: Wed Mar 1 00:47:30 2006: 2865 X: client 20 rejected from local host
I was reading up a bit, and someone else with the same problem was told to "Try changing:
Virtual 1600 1200
to:
Virtual 3200 1200"
this makes sense, since i assume X will think it's one long display, but how would i implement this?
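For what it's worth, the Virtual line goes in the "Display" SubSection of the "Screen" section; a sketch with assumed identifiers (2624 is 1600 + 1024, wide enough for the two monitors in this thread side by side):

```
Section "Screen"
    Identifier   "Screen0"       # assumed name
    Device       "Videocard0"
    Monitor      "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth   24
        Virtual 2624 1200        # 1600x1200 plus 1024x768 side by side
        Modes   "1600x1200" "1024x768"
    EndSubSection
EndSection
```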
hmm.
i've never come across "only one display allowed" before.
i want to suggest changing the "SecondMonitor" options to the other numbers (or just switching the cables around), but i'm sure you've tried it and i'm sure it won't work.
i found that post you obviously did...
here's an idea:
it seems to me that modern video cards are pretty smart. when you read the xorg log, the card seems to know all on its own how many monitors you have plugged in and what their refresh rates are, and it tries all the modes whether you specify them or not. and the nvidia guy in that forum suggested
(II) NVIDIA X Driver 1.0-8178 Wed Dec 14 16:25:22 PST 2005
(II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
(II) Primary Device is: PCI 05:00:0
(--) Assigning device section with no busID to primary device
(--) Chipset NVIDIA GPU found
(II) resource ranges after xf86ClaimFixedResources() call:
(II) Setting vga for screen 0.
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(--) NVIDIA(0): Linear framebuffer at 0xD0000000
(--) NVIDIA(0): MMIO registers at 0xFA000000
(II) NVIDIA(0): NVIDIA GPU detected as: GeForce 6800 XT
(--) NVIDIA(0): VideoBIOS: 05.41.02.48.05
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): VideoRAM: 262144 kBytes
(II) NVIDIA(0): Connected display device(s): DFP-0, DFP-1
(WW) NVIDIA(0): Multiple displays connected, but only one display allowed;
(WW) NVIDIA(0): using first display
(--) NVIDIA(0): DFP-0: maximum pixel clock: 155 MHz
(--) NVIDIA(0): DFP-0: Internal Single Link TMDS
(II) NVIDIA(0): Frequency information for DFP-0:
(II) NVIDIA(0): HorizSync : 30.000-61.000 kHz
(II) NVIDIA(0): VertRefresh : 56.000-76.000 Hz
(II) NVIDIA(0): (HorizSync from EDID)
(II) NVIDIA(0): (VertRefresh from EDID)
(II) NVIDIA(0): Adding EDID-provided mode "1024x768" for DFP-0.
(II) NVIDIA(0): Monitor0: Using hsync range of 30.00-61.00 kHz
(II) NVIDIA(0): Monitor0: Using vrefresh range of 56.00-76.00 Hz
(II) NVIDIA(0): Clock range: 12.00 to 155.00 MHz
(II) NVIDIA(0): Not using default mode "640x350" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "320x175" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "640x400" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "320x200" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "720x400" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "360x200" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "640x480" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "320x240" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "800x600" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "400x300" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "1024x768" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "512x384" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "1024x768" (hsync out of range)
(II) NVIDIA(0): Not using default mode "512x384" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1152x864" (hsync out of range)
(II) NVIDIA(0): Not using default mode "576x432" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1280x960" (hsync out of range)
(II) NVIDIA(0): Not using default mode "640x480" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1280x1024" (hsync out of range)
(II) NVIDIA(0): Not using default mode "640x512" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1280x1024" (hsync out of range)
(II) NVIDIA(0): Not using default mode "640x512" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1280x1024" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "640x512" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1600x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "800x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1600x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "800x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1600x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "800x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1600x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "800x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1600x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "800x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1792x1344" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "896x672" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1792x1344" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "896x672" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1856x1392" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "928x696" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1856x1392" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "928x696" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1920x1440" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "960x720" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1920x1440" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "960x720" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1152x768" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "576x384" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "1400x1050" (hsync out of range)
(II) NVIDIA(0): Not using default mode "700x525" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1400x1050" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "700x525" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1600x1024" (hsync out of range)
(II) NVIDIA(0): Not using default mode "800x512" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1920x1440" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "960x720" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "2048x1536" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "1024x768" (hsync out of range)
(II) NVIDIA(0): Not using default mode "2048x1536" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "1024x768" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "2048x1536" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "1024x768" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "848x480" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "424x240" (vrefresh out of range)
(II) NVIDIA(0): Not using default mode "1152x864" (hsync out of range)
(II) NVIDIA(0): Not using default mode "576x432" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1152x864" (hsync out of range)
(II) NVIDIA(0): Not using default mode "576x432" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1152x864" (hsync out of range)
(II) NVIDIA(0): Not using default mode "576x432" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1400x1050" (hsync out of range)
(II) NVIDIA(0): Not using default mode "700x525" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1400x1050" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "700x525" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1280x800" (hsync out of range)
(II) NVIDIA(0): Not using default mode "640x400" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1280x800" (hsync out of range)
(II) NVIDIA(0): Not using default mode "640x400" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1680x1050" (hsync out of range)
(II) NVIDIA(0): Not using default mode "840x525" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1680x1050" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "840x525" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1680x1050" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "840x525" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1680x1050" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "840x525" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1280x720" (hsync out of range)
(II) NVIDIA(0): Not using default mode "640x360" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1280x768" (hsync out of range)
(II) NVIDIA(0): Not using default mode "640x384" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1920x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "960x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1920x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "960x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1920x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "960x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "1920x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "960x600" (hsync out of range)
(II) NVIDIA(0): Not using default mode "2560x1600" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "1280x800" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "2560x1600" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "1280x800" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "2560x1600" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "1280x800" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "2560x1600" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using default mode "1280x800" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using mode "1600x1200" (no mode of this name)
(II) NVIDIA(0): Not using mode "1400x1050" (no mode of this name)
(II) NVIDIA(0): Not using mode "1280x1024" (no mode of this name)
(WW) NVIDIA(0): Not using mode "1280x960" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1280x800" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1152x864" (width 1152 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1280x800" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1280x768" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1280x768" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1280x768" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1280x720" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1280x720" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1280x720" (width 1280 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "640x480" (height 960 is larger than
(WW) NVIDIA(0): EDID-specified maximum 768)
(WW) NVIDIA(0): Not using mode "640x400" (height 800 is larger than
(WW) NVIDIA(0): EDID-specified maximum 768)
(WW) NVIDIA(0): Not using mode "640x400" (height 800 is larger than
(WW) NVIDIA(0): EDID-specified maximum 768)
(WW) NVIDIA(0): Not using mode "576x432" (height 864 is larger than
(WW) NVIDIA(0): EDID-specified maximum 768)
(**) NVIDIA(0): Validated modes for display device DFP-0:
(**) NVIDIA(0): Default mode "1024x768": 78.8 MHz, 60.1 kHz, 75.1 Hz
(**) NVIDIA(0): Default mode "800x600": 49.5 MHz, 46.9 kHz, 75.0 Hz
(**) NVIDIA(0): Default mode "640x480": 31.5 MHz, 37.5 kHz, 75.0 Hz
(**) NVIDIA(0): Default mode "1024x768": 75.0 MHz, 56.5 kHz, 70.1 Hz
(**) NVIDIA(0): Default mode "1024x768": 65.0 MHz, 48.4 kHz, 60.0 Hz
(**) NVIDIA(0): Mode "1024x768": 65.0 MHz, 48.4 kHz, 60.0 Hz
(**) NVIDIA(0): Default mode "832x624": 57.3 MHz, 49.7 kHz, 74.6 Hz
(**) NVIDIA(0): Default mode "800x600": 50.0 MHz, 48.1 kHz, 72.2 Hz
(**) NVIDIA(0): Default mode "800x600": 40.0 MHz, 37.9 kHz, 60.3 Hz
(**) NVIDIA(0): Default mode "800x600": 36.0 MHz, 35.2 kHz, 56.2 Hz
(**) NVIDIA(0): Default mode "848x480": 41.0 MHz, 37.6 kHz, 75.0 Hz
(**) NVIDIA(0): Default mode "848x480": 37.5 MHz, 35.0 kHz, 70.0 Hz
(**) NVIDIA(0): Default mode "848x480": 31.5 MHz, 29.8 kHz, 60.0 Hz
(**) NVIDIA(0): Default mode "640x480": 31.5 MHz, 37.9 kHz, 72.8 Hz
(**) NVIDIA(0): Default mode "640x480": 25.2 MHz, 31.5 kHz, 60.0 Hz
(**) NVIDIA(0): Default mode "640x384": 51.5 MHz, 60.2 kHz, 75.0 Hz (D)
(**) NVIDIA(0): Default mode "640x384": 47.5 MHz, 56.0 kHz, 70.0 Hz (D)
(**) NVIDIA(0): Default mode "640x384": 40.1 MHz, 47.7 kHz, 60.1 Hz (D)
(**) NVIDIA(0): Default mode "640x360": 47.8 MHz, 56.4 kHz, 75.0 Hz (D)
(**) NVIDIA(0): Default mode "640x360": 44.5 MHz, 52.5 kHz, 70.0 Hz (D)
(**) NVIDIA(0): Default mode "640x360": 37.2 MHz, 44.8 kHz, 60.0 Hz (D)
(**) NVIDIA(0): Default mode "512x384": 39.4 MHz, 60.1 kHz, 75.1 Hz (D)
(**) NVIDIA(0): Default mode "512x384": 37.5 MHz, 56.5 kHz, 70.1 Hz (D)
(**) NVIDIA(0): Default mode "512x384": 32.5 MHz, 48.4 kHz, 60.0 Hz (D)
(**) NVIDIA(0): Default mode "416x312": 28.6 MHz, 49.7 kHz, 74.7 Hz (D)
(**) NVIDIA(0): Default mode "400x300": 24.8 MHz, 46.9 kHz, 75.1 Hz (D)
(**) NVIDIA(0): Default mode "400x300": 25.0 MHz, 48.1 kHz, 72.2 Hz (D)
(**) NVIDIA(0): Default mode "400x300": 20.0 MHz, 37.9 kHz, 60.3 Hz (D)
(**) NVIDIA(0): Default mode "400x300": 18.0 MHz, 35.2 kHz, 56.3 Hz (D)
(**) NVIDIA(0): Default mode "424x240": 20.5 MHz, 37.6 kHz, 75.0 Hz (D)
(**) NVIDIA(0): Default mode "424x240": 18.8 MHz, 35.0 kHz, 70.0 Hz (D)
(**) NVIDIA(0): Default mode "424x240": 15.7 MHz, 29.8 kHz, 60.1 Hz (D)
(**) NVIDIA(0): Default mode "320x240": 15.8 MHz, 37.5 kHz, 75.0 Hz (D)
(**) NVIDIA(0): Default mode "320x240": 15.8 MHz, 37.9 kHz, 72.8 Hz (D)
(**) NVIDIA(0): Default mode "320x240": 12.6 MHz, 31.5 kHz, 60.1 Hz (D)
(II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
(--) NVIDIA(0): DPI set to (86, 84); computed from "UseEdidDpi" X config option
(II) Loading sub module "fb"
(II) LoadModule: "fb"
(II) Loading /usr/X11R6/lib/modules/libfb.a
(II) Module fb: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 1.0.0
ABI class: X.Org ANSI C Emulation, version 0.2
(II) Loading sub module "ramdac"
(II) LoadModule: "ramdac"
(II) Loading /usr/X11R6/lib/modules/libramdac.a
(II) Module ramdac: vendor="X.Org Foundation"
compiled for 6.8.2, module version = 0.1.0
ABI class: X.Org Video Driver, version 0.7
(--) Depth 24 pixmap format is 32 bpp
(II) do I need RAC? No, I don't.
(II) resource ranges after preInit:
(II) NVIDIA(0): Setting mode "1024x768"
(II) Loading extension NV-GLX
(II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
(II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
(==) NVIDIA(0): Backing store disabled
(==) NVIDIA(0): Silken mouse enabled
(**) Option "dpms"
(**) NVIDIA(0): DPMS enabled
(II) Loading extension NV-CONTROL
(==) RandR enabled
(II) Initializing built-in extension MIT-SHM
(II) Initializing built-in extension XInputExtension
(II) Initializing built-in extension XTEST
(II) Initializing built-in extension XKEYBOARD
(II) Initializing built-in extension LBX
(II) Initializing built-in extension XC-APPGROUP
(II) Initializing built-in extension SECURITY
(II) Initializing built-in extension XINERAMA
(II) Initializing built-in extension XFIXES
(II) Initializing built-in extension XFree86-Bigfont
(II) Initializing built-in extension RENDER
(II) Initializing built-in extension RANDR
(II) Initializing built-in extension COMPOSITE
(II) Initializing built-in extension DAMAGE
(II) Initializing built-in extension XEVIE
(II) Initializing extension GLX
AUDIT: Wed Mar 1 10:38:38 2006: 7705 X: client 21 rejected from local host
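Reading the log above: the driver only validated modes for a single display device (DFP-0), and every mode wider than 1024 or taller than 768 was rejected against that panel's EDID maximum — so X ended up with a 1024x768 virtual screen on the 15" Dell and the 19" Princeton is not being driven at all. On the 1.0-8178 driver, spanning a desktop across both DVI heads of one card is done with TwinView options in the "Device" section of xorg.conf. Below is a sketch only, not a drop-in config: the Identifier is a placeholder, and the SecondMonitor sync ranges are guesses you must replace with the values from your Dell's manual before restarting X.

```
Section "Device"
    Identifier  "NVIDIA GeForce 6800"    # placeholder name -- match your existing Device section
    Driver      "nvidia"
    # Enable TwinView so both DVI outputs share one X screen
    Option      "TwinView" "true"
    Option      "TwinViewOrientation" "RightOf"
    # Sync ranges for the SECOND monitor (the 15" Dell).
    # These numbers are assumptions -- check the panel's documentation.
    Option      "SecondMonitorHorizSync"   "30-62"
    Option      "SecondMonitorVertRefresh" "56-76"
    # Each MetaMode pair is "primary, secondary"; the second pair is a fallback
    Option      "MetaModes" "1600x1200,1024x768; 1024x768,1024x768"
EndSection
```

If the 1600x1200 modes are still rejected after this, the usual next step on that driver generation is checking which connector X treats as the primary head (swapping the DVI cables, or adding Option "ConnectedMonitor" "DFP,DFP"), since the "EDID-specified maximum 1024" warnings above show the mode validation is currently running against the Dell's EDID rather than the Princeton's.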