
LinuxQuestions.org (/questions/)
-   Slackware (https://www.linuxquestions.org/questions/slackware-14/)
-   -   Nvidia proprietary driver (https://www.linuxquestions.org/questions/slackware-14/nvidia-proprietary-driver-4175550289/)

ReaperX7 08-10-2015 03:57 PM

Just one last question, but... Can you post your /etc/X11/xorg.conf file?

Often utilities like glxgears and such aren't coded with primusrun and optirun in mind, and because the Nvidia card is technically a detached 3D device (similar to an old 3Dfx Voodoo2 1000 or Quantum3D Obsidian2 X24, if you remember those), NV-GLX is not going to register as the main display.

Glad you got it working however.

If your applications work, then that might be a better test.

Youmu 08-10-2015 04:32 PM

Quote:

Originally Posted by ReaperX7 (Post 5404032)
Just one last question, but... Can you post your /etc/X11/xorg.conf file?

Often utilities like glxgears and such aren't coded with primusrun and optirun in mind, and because the Nvidia card is technically a detached 3D device, similar to an old 3Dfx Voodoo2 1000 or Quantum3D Obsidian2 X24, if you remember these, NV-GLX is not going to register as the main display.

Glad you got it working however.

If your applications work, then that might be a better test.

I am glad that my video cards are now running silent and cool, but some questions remain open.
As I see it, rebuilding Xorg fixed the issues with logind (maybe we should report it to Slackware's bug tracker, even though I haven't seen one).
Even though glxinfo outputs NO errors, glxgears and glxspheres won't work, even with primus or optirun. Steam says something about unsupported direct rendering.
===>>>primusrun glxgears:
Code:

Xlib:  extension "NV-GLX" missing on display ":8".
Segmentation fault

Without primusrun, glxgears just opens a black screen.
-----
Oh, and I don't have any xorg.conf file. I've heard that using one is outdated.
ls /etc/X11:
Code:

./  ../  WindowMaker/  app-defaults/  fs/  mwm/  seyon/  x3270/  xdm/  xinit/  xkb/  xorg.conf-vesa  xorg.conf.d/  xorg.conf.nvidia-xconfig-original  xsm/
Thanks again, I am really happy that you guys are helping me out.

ReaperX7 08-10-2015 07:19 PM

The xorg.conf is still needed, unfortunately, with these cards. Because you run a dual-card system, and because you have permission problems, an xorg.conf makes diagnosis easier. It's far from outdated; Nvidia actually recommends you use one. With Optimus graphics you have to have the Intel VGA set up correctly.

Nothing in UNIX is ever truly outdated. It's just aged to perfection. ;)

You should have the Intel graphics showing up as the primary VGA, and under non-primusrun/optirun glxinfo it should natively spit out the Intel driver info. If it doesn't, then the autoconfiguration probably decided to take a trip on the short bus and used the VESA/modesetting driver, and with Optimus, that's a no-no.

Slackware should still have the ncurses-based xorgsetup utility. Just run it, post the resulting /etc/X11/xorg.conf, and then we can start piecing the last parts of the puzzle together.
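If xorgsetup picks the wrong card, the BusID line in the Device section can also be filled in by hand. One gotcha: xorg.conf wants decimal PCI:bus:device:function, while lspci prints the slot in hex. A minimal sketch of the conversion, with sample lspci-style lines baked in (the device names are placeholders; on a real system you would pipe `lspci | grep -i vga` in instead):

```shell
# Sketch: convert an lspci slot (hex bb:dd.f) into the decimal
# PCI:bus:device:function form that xorg.conf's BusID expects.
# The sample lines below are placeholders, not real lspci output.
sample='00:02.0 VGA compatible controller: Intel Corporation (example)
01:00.0 3D controller: NVIDIA Corporation (example)'
slot=$(printf '%s\n' "$sample" | grep 'VGA' | cut -d' ' -f1)   # e.g. 00:02.0
IFS=':.' read -r bus dev fn <<EOF
$slot
EOF
printf 'BusID "PCI:%d:%d:%d"\n' "0x$bus" "0x$dev" "0x$fn"
# prints: BusID "PCI:0:2:0"
```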

Youmu 08-11-2015 06:03 AM

My xorg.conf:
Code:

Section "ServerLayout"
        Identifier    "X.org Configured"
        Screen      0  "Screen0" 0 0
        InputDevice    "Mouse0" "CorePointer"
        InputDevice    "Keyboard0" "CoreKeyboard"
EndSection

Section "Files"
        ModulePath  "/usr/lib64/xorg/modules"
        FontPath    "/usr/share/fonts/local"
        FontPath    "/usr/share/fonts/TTF"
        FontPath    "/usr/share/fonts/OTF"
        FontPath    "/usr/share/fonts/Type1"
        FontPath    "/usr/share/fonts/misc"
        FontPath    "/usr/share/fonts/CID"
        FontPath    "/usr/share/fonts/75dpi/:unscaled"
        FontPath    "/usr/share/fonts/100dpi/:unscaled"
        FontPath    "/usr/share/fonts/75dpi"
        FontPath    "/usr/share/fonts/100dpi"
        FontPath    "/usr/share/fonts/cyrillic"
EndSection

Section "Module"
        Load  "glx"
EndSection

Section "InputDevice"
        Identifier  "Keyboard0"
        Driver      "kbd"
EndSection

Section "InputDevice"
        Identifier  "Mouse0"
        Driver      "mouse"
        Option            "Protocol" "auto"
        Option            "Device" "/dev/input/mice"
        Option            "ZAxisMapping" "4 5 6 7"
EndSection

Section "Monitor"
        Identifier  "Monitor0"
        VendorName  "Monitor Vendor"
        ModelName    "Monitor Model"
EndSection

Section "Device"
        ### Available Driver options are:-
        ### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
        ### <string>: "String", <freq>: "<f> Hz/kHz/MHz",
        ### <percent>: "<f>%"
        ### [arg]: arg optional
        #Option    "NoAccel"                    # [<bool>]
        #Option    "AccelMethod"                # <str>
        #Option    "Backlight"                  # <str>
        #Option    "DRI"                        # <str>
        #Option    "Present"                    # [<bool>]
        #Option    "ColorKey"                  # <i>
        #Option    "VideoKey"                  # <i>
        #Option    "Tiling"                    # [<bool>]
        #Option    "LinearFramebuffer"          # [<bool>]
        #Option    "VSync"                      # [<bool>]
        #Option    "PageFlip"                  # [<bool>]
        #Option    "SwapbuffersWait"            # [<bool>]
        #Option    "TripleBuffer"              # [<bool>]
        #Option    "XvPreferOverlay"            # [<bool>]
        #Option    "HotPlug"                    # [<bool>]
        #Option    "ReprobeOutputs"            # [<bool>]
        #Option    "DeleteUnusedDP12Displays"        # [<bool>]
        #Option    "XvMC"                      # [<bool>]
        #Option    "ZaphodHeads"                # <str>
        #Option    "VirtualHeads"              # <i>
        #Option    "TearFree"                  # [<bool>]
        #Option    "PerCrtcPixmaps"            # [<bool>]
        #Option    "FallbackDebug"              # [<bool>]
        #Option    "DebugFlushBatches"          # [<bool>]
        #Option    "DebugFlushCaches"          # [<bool>]
        #Option    "DebugWait"                  # [<bool>]
        #Option    "BufferCache"                # [<bool>]
        Identifier  "Card0"
        Driver      "intel"
        BusID      "PCI:0:2:0"
EndSection

Section "Screen"
        Identifier "Screen0"
        Device    "Card0"
        Monitor    "Monitor0"
        DefaultDepth 24
        SubSection "Display"
                Viewport  0 0
                Depth    1
        EndSubSection
        SubSection "Display"
                Viewport  0 0
                Depth    4
        EndSubSection
        SubSection "Display"
                Viewport  0 0
                Depth    8
        EndSubSection
        SubSection "Display"
                Viewport  0 0
                Depth    15
        EndSubSection
        SubSection "Display"
                Viewport  0 0
                Depth    16
        EndSubSection
        SubSection "Display"
                Viewport  0 0
                Depth    24
        EndSubSection
EndSection

Maybe it is possible to add the other device under an appropriate section?

ReaperX7 08-11-2015 07:12 AM

No. This is fine as is. You don't need to configure the Nvidia part. Primus and Bumblebee will handle that for you.
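For reference, Bumblebee keeps the Nvidia side in its own X config, separate from /etc/X11/xorg.conf, typically /etc/bumblebee/xorg.conf.nvidia. A stock file looks roughly like this; treat it as a sketch, since exact paths and options vary between Bumblebee versions, and the commented BusID is only an example:

```
Section "ServerLayout"
    Identifier  "Layout0"
    Option      "AutoAddDevices" "false"
EndSection

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    # On some setups the BusID must be set explicitly, e.g.:
    #   BusID "PCI:1:0:0"
    Option      "ProbeAllGpus" "false"
    Option      "NoLogo" "true"
    Option      "UseEDID" "false"
    Option      "UseDisplayDevice" "none"
EndSection
```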

Now what it should be doing is this...

When you run optirun or primusrun with the application you want run on the Nvidia chip, it should run without issue, but remember it's a detached 3D device, not a native display adapter.

Try using optirun or primusrun with an application that uses OpenGL, and it should work now.
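One way to confirm which GPU is actually doing the rendering is the renderer string: on the real system, compare the output of `glxinfo | grep "OpenGL renderer"` with `primusrun glxinfo | grep "OpenGL renderer"`. A self-contained sketch of the check, using placeholder sample strings instead of live glxinfo output:

```shell
# Sketch: the renderer string tells you which GPU actually drew the frame.
# These sample strings stand in for real glxinfo output; the GPU names
# are placeholders, not output from this thread's machine.
plain='OpenGL renderer string: Mesa DRI Intel(R) HD Graphics'
offload='OpenGL renderer string: GeForce GTX 660M/PCIe/SSE2'

check() {
    case $1 in
        *GeForce*|*NVIDIA*) echo "NVIDIA is rendering" ;;
        *Intel*)            echo "Intel is rendering" ;;
        *)                  echo "unknown renderer" ;;
    esac
}
check "$plain"     # prints: Intel is rendering
check "$offload"   # prints: NVIDIA is rendering
```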

Youmu 08-11-2015 11:10 AM

Quote:

Try using optirun or primusrun on an OpenGL using application and it should work now.
Unfortunately, I don't get it. glxgears and glxspheres are applications that use the GL mechanism. Without primusrun I can see only a black screen as a result.
primusrun steam:
Quote:

Couldn't find dpkg, please update steamdeps for your distribution.
Running Steam on slackware 14.1 64-bit
STEAM_RUNTIME is enabled automatically
Installing breakpad exception handler for appid(steam)/version(1437790054)
Xlib: extension "NV-GLX" missing on display ":8".
assert_20150811190857_1.dmp[7161]: Uploading dump (out-of-process)
/tmp/dumps/assert_20150811190857_1.dmp
/home/sapka/.local/share/Steam/steam.sh: line 756: 7139 Segmentation fault $STEAM_DEBUGGER "$STEAMROOT/$STEAMEXEPATH" "$@"
primusrun glxspheres64:
Quote:

Polygons in scene: 62464
Visual ID of window: 0x20
Xlib: extension "NV-GLX" missing on display ":8".
Context is Indirect
Segmentation fault

ReaperX7 08-11-2015 06:39 PM

When you installed this did you follow this guide directly?

http://docs.slackware.com/howtos:har...nvidia_optimus

bgeer 12-07-2015 01:30 PM

Sort of related... I got tired of having to muck with the nvidia blob every kernel update on my "ancient" system with onboard GeForce 6150 (NV4E) graphics. Nouveau was sufficient for my needs for a long time, and mostly still is. However, sometime over the last few months I noticed that when changing from one virtual desktop to another on my 2-monitor system, there is a noticeable lag, maybe a total of 1 second or so; not terrible, but disconcerting. Question is, has anyone else noticed this kind of change? Just curious... much obliged, Bob

1337_powerslacker 12-09-2015 10:03 AM

Quote:

Originally Posted by enorbet (Post 5404025)
Example - nvidia's installer searches for conflicting libraries and installs GLX and Mesa that it knows works with that version driver and hardware. It is not known to me if the SBs behave in this manner. I've been installing nvidia's driver this way for over 10 years and excepting choosing newer drivers on old hardware that would not install (and told me why and what to get) I have never had any problems with nvidia drivers and I don't have to wait for someone to build a SlackBuild nor create one myself for nvidia anyway.

I have a handsome chunk of change coming my way in the near future, and have decided to take advantage of that to upgrade my graphics hardware. Now, I have run Radeon hardware for a good long while, but the last Nvidia hardware I remember owning was a GeForce 4 MX. The details are now lost to me, but I don't specifically remember a showstopper error in the installation and/or configuration of Nvidia cards. I have decided to switch teams for the following reasons:
  • Numerous reviews of the EVGA GTX 970 (which I plan on running in SLI configuration) have stated that the power consumption is much lower than comparable Radeon hardware
  • It has the facilities for a VGA adapter (DVI-I -> VGA). The 390 I originally wanted to upgrade to has no such facilities. I don't have the funds to upgrade both monitors and video card, and have decided to prioritize, upgrading the video cards first. Nvidia's thoughtfulness in this area has earned it points with me.
  • The newest driver release supports xorg-server 1.18, which is in -current, and which AMD's driver does not yet support.

Although I have written elsewhere about how to make the newest drivers (Catalyst & Crimson Edition) work on -current, I think it's time to give Nvidia a chance to prove itself. The reviews look good; users are mostly happy with the Nvidia cards they chose. So my question is: Are there any difficulties in the installation and/or configuration of the Nvidia driver I should be aware of? I would appreciate any insight and/or experience you have had. Thanks.

P.S. I have placed the order, and find myself excited about getting it in. For a man who doesn't game very much, this may be an ironic attitude. But now my rig is future-proofed, and besides, I've always wanted to try SLI since the days of the venerable Voodoo 2 and never have been able to afford it. Now, for the first time, I can, and the opportunity may not come again for a long time.

Regards,

Matt

enorbet 12-09-2015 02:31 PM

@mattallmill - I don't know what else I can tell you beyond that I have found nVidia cards to be extremely well-supported and trouble-free, especially on alternative operating systems. My first GUI, excepting PCShell on DOS, was not Windows but rather IBM's OS/2, and even back then nVidia had full 3D-accelerated drivers for that OpSys. I started using Linux around 1999, and within a very short time nVidia had good drivers for it. ATi was at least 10 years late to the game. I support nVidia because they support me in my choices, but also because they write really good drivers for just about any OpSys you'd care to name. They've gotten quite good at it :).

The only "difficulties" I have experienced in some 20+ years of faithful nVidia loyalty were with monitors (lousy EDID, easily fixed with nVidia Options in xorg.conf) and one individual case where I went out on a limb and bought an Elsa OEM card that was WAY off spec; Elsa soon abandoned support even for Windows. It still worked with the generic nVidia driver but just didn't get full use of the VRAM, which was double the reference design. I made a risky gamble and lost, but learned to keep my choices simpler.

With the cards you've ordered you should have an effortless, thoroughly enjoyable experience even if you don't game much. Your system will be very well balanced by virtue of the GPU(s) taking so much load off your CPU and having the power to not bog down. With so much content these days being multimedia (and, let's face it, humans are very heavily biased towards vision), it is indeed rare that one will ever regret spending cash on the graphics system. It's just worth it. :)

1337_powerslacker 12-09-2015 08:21 PM

Quote:

Originally Posted by enorbet (Post 5462047)
@mattallmill - I don't know what else I can tell you beyond that I have found nVidia cards to be extremely well-supported and trouble-free, especially on alternative operating systems. My first GUI, excepting PCShell on DOS, was not Windows but rather IBMs OS/2 and even back then nVidia had full 3D Accelerated drivers for that OpSys. I started using Linux around 1999 and within a very short time nVidia had good drivers for it. ATi was at least 10 years late to the game. I support nVidia because they support me in my choices but also because they write really good drivers for just bout any OpSys you'd care to name. They've gotten quite good at it :).

The only "difficulties" I have ever in some 20+ years of faithful nVidia loyalty experienced was with monitors (lousy EDID, easily fixed with nVidia Options in xorg.conf) and one individual case where I went out on a limb and bought an Elsa OEM card that was WAY off spec and Elsa soon abandoned support even for Windows. It still worked with nVidia generic driver but just didn't get the full use of VRAM which was double the reference design. I made a risky gamble and lost but learned to keep my choices simpler.

With the cards you've ordered you should have an effortless, thoroughly enjoyable experience even if you don't game much. Your system will be very well balanced by virtue of the GPU(s) taking so much load off your CPU and having the power to not bog down. With so much content these days being multimedia, and let's face it humans are very heavily biased towards vision, it is indeed rare that one will ever regret spending cash on the graphics system. It's just worth it. :)

Thanks for your input. I've been thinking a long time about this purchase, and a culmination of factors led to it, not least the availability of funds. I've been seeing a lot of people online complain about the quality of AMD's drivers. I have been a fan of AMD since the days of the 386DX-40, when AMD was kicking rear and taking names while Intel's best fell short (its flagship at the time was the 386DX-33; a time that will never come again, sadly), and I can vouch for the excellent price/performance of their FX-83x0 series of processors. But it's time to acknowledge that sometimes a different company can do a certain job better. I don't think AMD is deliberately shorting their fans on driver support; they simply don't have the manpower to devote to proper driver development. If my new cards perform up to expectations, according to what you've said and what others around the web have reported, I will probably be sticking with Nvidia from now on.

P.S. Thinking about your statement on how multimedia has pervaded our computing experience gives me extra assurance that I made the right choice. $350/card is the most I've *ever* spent on a graphics card, and to be honest it made me kind of nervous that I might be wasting good money. It looks, though, like it will turn out to be money well invested in the end.

1337_powerslacker 12-09-2015 08:40 PM

Pairing first-class hardware with the first-class distro of a first-class OS - what could possibly be better? ;)

Richard Cranium 12-09-2015 11:35 PM

Quote:

Originally Posted by mattallmill (Post 5461938)
But now my rig is future-proofed, and besides, I've always wanted to try SLI since the days of the venerable Voodoo 2, and never have been able to afford it.

Semi-OT in that this is an SLI story:

I ran two EVGA GeForce 9800GT cards in SLI mode for a few years with no problems at all. Then I noticed that with one particular game (under Windows, I'm afraid; it's OK for games and not much else), after about 5 minutes of intense combat the machine would slow to an absolute crawl. After I was killed in the game, the machine would speed back up in a minute or so.

It turned out that the cooling fan on one of the cards had failed. Most of the time, it didn't matter. But when I stressed the pair with very high resolution graphics during a game, one card would overheat and shut down until it cooled a bit. The remaining card tried to handle all the work but couldn't keep up. :)

I've since upgraded to a single GeForce GT 640 that's able to do what the pair of GeForce 9800GT were able to do. I still eye the SLI cards; I may pick up a pair one of these days since my motherboards are all SLI capable.

1337_powerslacker 12-10-2015 08:39 AM

Quote:

Originally Posted by Richard Cranium (Post 5462253)
Semi-OT in that this is an SLI story:

I ran with two EVGA GeForce 9800GT in SLI mode for a few years with no problems at all. I would run on particular game (under Windows, I'm afraid; it's OK for games and not much else) and after about 5 minutes of intense combat, the machine would slow to an absolute crawl. After I was killed in the game, the machine would speed back up in a minute or so.

It turned out that the cooling fan on one of the cards had failed. Most of the time, it didn't matter. But when I stressed the pair with very high resolution graphics during a game, one card would overheat and shut down until it cooled a bit. The remaining card tried to handle all the work but couldn't keep up. :)

I've since upgraded to a single GeForce GT 640 that's able to do what the pair of GeForce 9800GT were able to do. I still eye the SLI cards; I may pick up a pair one of these days since my motherboards are all SLI capable.

I don't think the moderators will mind since I did mention SLI, and you are quoting me :D

I used to run Windows full-time (in the days of XP), and much for the same reason: gaming. Now, however, the games I like to run (Quake I with the Darkplaces engine and Epsilon add-on, Unreal single-player, and UT2004) run perfectly under Linux. The first of those stresses my R9 270 at certain points, notably in the exit stages of certain levels, although the frame rates do drop to sub-20 levels (my cutoff point for enjoyable gameplay) in other areas as well. Needless to say, I haven't run Windows in a long time (and in fact eradicated it from my hard drives some time ago). I figure that with relatively powerful cards running in SLI mode, I have a bit of breathing room until the next compelling upgrade comes along (and being a poor college student, that will be a *long* time in the future). Unfortuitous circumstances have made my life a living hell these past 3 months; fortunately, some good did occur, and the financial windfall was the result. Time to enjoy the perks while opportunity still knocks, methinks. :)

heyjann 12-10-2015 09:57 AM

Sorry to be the bearer of bad tidings, but before you unbox the second card (depending on the store's return policy), please type something like nvidia sli linux broken into a search engine to find some posts by a fellow going by the nickname the_mard; he looked into it quite a bit.
In short, SLI support in Linux is quite poor at the moment, except when the Doom 3 engine is being used. I have some benchmark results of my own, too, and disabling one card indeed improves performance quite a bit.
Support might improve in the future, of course, but at the moment it does not look hopeful.

