LinuxQuestions.org


malloc 01-05-2012 03:03 PM

Nvidia Optimus, separate libglx.so for Intel card?
 
Finally I have Bumblebee working in 64-bit Slackware with an Nvidia Optimus system. However, there are still some issues; for starters, I have no GLX functionality on the Intel card, so I'm forced to run everything with optirun on the Nvidia device.

The Xorg log for the Intel card shows that it attempts to load /usr/lib64/xorg/modules/extensions/libglx.so, which identifies itself as the "NVIDIA GLX MODULE". Obviously this is wrong, since it is the Intel device that is looking for a GLX module. The load eventually fails because the module cannot find any NVIDIA card. No surprise there.

How can I configure my system so that I get GLX functionality with the Intel card as well? What libglx.so file should I look for, and how do I direct Xorg to use a specific libglx.so instead of the one under /usr/lib64/xorg/modules/extensions/libglx.so?

adamk75 01-05-2012 07:02 PM

You can specify the ModulePath in your xorg.conf file to get Xorg to load a different glx module, I believe. You might even be able to get away with specifying the full path just for the glx module. Of course, you'd have to restart X each time you wanted to switch glx modules.
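
A minimal sketch of how that could look in the Intel server's xorg.conf -- the directory holding the stock modules is an assumption here, use whatever location you restore them to:
Code:

Section "Files"
    # Searched first: assumed location of the stock Xorg modules (generic libglx.so)
    ModulePath "/usr/lib64/xorg/modules-stock"
    ModulePath "/usr/lib64/xorg/modules"
EndSection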

Adam

malloc 01-06-2012 12:25 PM

Quote:

Originally Posted by adamk75 (Post 4567437)
You can specify the ModulePath in your xorg.conf file to get Xorg to load a different glx module, I believe. You might even be able to get away with specifying the full path just for the glx module. Of course, you'd have to restart X each time you wanted to switch glx modules.

Adam

Thanks, that was useful. I'm still curious which glx module I should specify, though. Is it possible that the Nvidia installer has overwritten some generic glx module that the Intel driver would want to use?

adamk75 01-06-2012 06:57 PM

I'm not very familiar with the nvidia drivers these days, but yes, I do believe the nvidia driver completely overwrites the default Xorg glx module. You would want to restore that glx module to another directory, and copy over all the other Xorg modules. Then you can use the ModulePath option to specify which directory Xorg should load modules from.
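
As a rough sketch (the directory names and the source of the stock libglx.so are assumptions, adjust to your own layout):
Code:

# Copy the installed Xorg modules to a separate directory for the Intel server
mkdir -p /usr/lib64/xorg/modules-intel
cp -a /usr/lib64/xorg/modules/. /usr/lib64/xorg/modules-intel/
# Replace the NVIDIA libglx.so in that copy with the stock one restored from
# the original xorg-server package
cp /path/to/stock/libglx.so /usr/lib64/xorg/modules-intel/extensions/libglx.so
# Then point the Intel X server at the new directory with
#   ModulePath "/usr/lib64/xorg/modules-intel"
# in the "Files" section of its xorg.conf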

Adam

wildwizard 01-06-2012 09:20 PM

The binary nvidia driver trashes quite a fair bit of the system so I doubt you could ever use it with another device at the same time.

Have you tried the open source nouveau drivers yet?

malloc 01-07-2012 04:22 AM

Quote:

Originally Posted by wildwizard (Post 4568500)
The binary nvidia driver trashes quite a fair bit of the system so I doubt you could ever use it with another device at the same time.

Have you tried the open source nouveau drivers yet?

I am already using the system with two devices at the same time -- hence my initial remark in this thread that my system is finally working (at least to some extent) with Bumblebee. Not that there aren't plenty of problems: everything except glxgears seems to have issues, and glxgears itself runs so slowly that it might as well have been rendered entirely in software. Nevertheless, it is rendered on the Nvidia GPU while the rest of X runs on the Intel GPU.

As far as the Nouveau driver goes, I initially tried it without Bumblebee, and it did not work any better than the Nvidia driver does without Bumblebee. In any case, I cannot use the Nouveau driver even if it works with Bumblebee -- it is simply too experimental for me in its current state. The whole point of having a separate GPU is to get decent performance out of it, and as long as Nouveau cannot (yet) deliver that, it is simply useless to me -- unless you enjoy developing it, or stand on principle and just want something better than purely software-based rendering.

malloc 01-07-2012 04:37 AM

Quote:

Originally Posted by adamk75 (Post 4568425)
I'm not very familiar with the nvidia drivers these days, but yes, I do believe the nvidia driver completely overwrites the default Xorg glx module. You would want to restore that glx module to another directory, and copy over all the other Xorg modules. Then you can use the ModulePath option to specify which directory Xorg should load modules from.

Adam

Thanks again. Now I just need to know where I can get the original GLX module.

I've searched my own system for *libglx* files, but they all seem to be the same. I've also searched online, including slackbuilds, freshmeat and a few other sites. Now I'm thinking I should download the Intel graphics driver and compile that. Does anyone know if the Intel driver requires a specific libglx, or if it can use the generic one? Either way, what is the best/correct approach to get the libglx module that I need?

adamk75 01-07-2012 06:46 AM

The Intel driver uses the generic Xorg glx module. You'll also need to grab libGL.so.1 from Mesa and LD_PRELOAD that when you want to run programs on the intel GPU.
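
A sketch of what that could look like -- the location of the Mesa libGL copy is an assumption, since the NVIDIA installer replaces the one directly under /usr/lib64:
Code:

# Preload the Mesa libGL for programs that should render on the Intel GPU
LD_PRELOAD=/usr/lib64/mesa/libGL.so.1 glxgears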

corp769 01-07-2012 06:53 AM

I'm running Fedora 15 on my Alienware M14x. I installed mesa-libGL natively for my Intel card and used the contrib/fedora script to install the nvidia drivers. libGL.so is under /usr/lib64, whereas the libraries for my nvidia 555M are installed in a non-standard location. When I run Bumblebee, it uses xorg-nvidia.conf, which has the module path for the libGL files. Have a look here:
Code:

Section "ServerLayout"
    Identifier "Layout0"
    Option "AutoAddDevices" "false"
EndSection

Section "Files"
    ModulePath "/usr/lib64/nvidia/xorg,/usr/lib64/xorg/modules,/usr/lib/nvidia/xorg,/usr/lib/xorg/modules"
EndSection

Section "Device"
    Identifier "Device1"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BusID "01:00:0"
    Option "NoLogo" "true"
    Option "UseEDID" "false"
    Option "ConnectedMonitor" "DFP"
EndSection

Where are your nvidia libraries and files located?

malloc 01-07-2012 04:32 PM

Quote:

Originally Posted by corp769 (Post 4568721)
I'm running Fedora 15 on my Alienware M14x. I installed mesa-libGL natively for my Intel card and used the contrib/fedora script to install the nvidia drivers. libGL.so is under /usr/lib64, whereas the libraries for my nvidia 555M are installed in a non-standard location. When I run Bumblebee, it uses xorg-nvidia.conf, which has the module path for the libGL files. Have a look here:
Code:

Section "ServerLayout"
    Identifier "Layout0"
    Option "AutoAddDevices" "false"
EndSection

Section "Files"
    ModulePath "/usr/lib64/nvidia/xorg,/usr/lib64/xorg/modules,/usr/lib/nvidia/xorg,/usr/lib/xorg/modules"
EndSection

Section "Device"
    Identifier "Device1"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BusID "01:00:0"
    Option "NoLogo" "true"
    Option "UseEDID" "false"
    Option "ConnectedMonitor" "DFP"
EndSection

Where are your nvidia libraries and files located?

This is what I have:

For libglx:

/usr/lib64/nvidia-bumblebee/xorg/libglx.so.290.10
/usr/lib64/nvidia-bumblebee/xorg/libglx.so
/usr/lib64/xorg/modules/extensions/libglx.so.290.10
/usr/lib64/xorg/modules/extensions/libglx.so
/usr/lib64/xorg/modules/extensions/libglx.la

For libGL:

/usr/lib64/libGL.so
/usr/lib64/libGL.so.290.10
/usr/lib64/VirtualGL/libGL.so
/usr/lib64/nvidia-bumblebee/libGL.so
/usr/lib64/nvidia-bumblebee/libGL.so.290.10
/usr/lib64/nvidia-bumblebee/libGL.so.1
/usr/lib64/libGL.so.1
/usr/lib/libGL.so
/usr/lib/libGL.so.290.10
/usr/lib/libGL.so.1

For both libglx and libGL the files under nvidia-bumblebee are the same as the files immediately under /usr/lib64. /usr/lib/libGL.so is just a symlink to /usr/lib/libGL.so.290.10 and likewise for libglx.

But again, I'm curious: where can I get the stock Xorg libglx module -- what should I download? The same goes for libGL.so.

corp769 01-07-2012 04:37 PM

That may be the issue... Have a look at the contrib install files in Bumblebee for other distros. I had to use them to install the nvidia drivers in a different location so that the normal symlinks would not get destroyed.

T3slider 01-07-2012 04:50 PM

Reinstall the xorg-server and mesa packages and that will restore the original libGL stuff (which should work with the intel drivers). However, you'll then have to reinstall the nVidia drivers, making sure not to overwrite those files again. The SlackBuild for nvidia-driver at slackbuilds.org does some funky symlink stuff, but in your case, since you want persistent dual-driver support, you may wish to tweak it to install the nVidia libGL files in another directory entirely. Alternatively, you could install the driver via the SlackBuild and then manually move the new files to another folder and specify that location as described above, leaving the symlinks (created by the nvidia-driver package) pointing to the original Slackware-provided libGL stuff -- though I'm not 100% sure this will work; since the proprietary drivers are binary-based I assume they are relocatable, so maybe it will.
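
On Slackware that reinstall could look roughly like this -- assuming the 13.37 install DVD is mounted at /mnt/dvd; the paths and package names here are assumptions, adjust to your media:
Code:

# Reinstall the stock packages that provide libglx.so and libGL
upgradepkg --reinstall /mnt/dvd/slackware64/x/xorg-server-*.txz
upgradepkg --reinstall /mnt/dvd/slackware64/x/mesa-*.txz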

adamk75 01-07-2012 04:55 PM

libGL is from the mesa package and libglx is from the xorg-server package. For future reference, you can grep through the files in /var/log/packages/ to find out what files belong to what packages.
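
For example (note that the package logs list paths without the leading slash):
Code:

# Which installed package(s) provide libglx.so?
grep -l 'usr/lib64/xorg/modules/extensions/libglx.so' /var/log/packages/*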

malloc 01-09-2012 09:33 AM

Thanks for the input.

Now I have the system configured with one instance of "libglx.so" and one "libglx.so.290.10", where the former is the original module from the 13.37 DVD and the latter is the proprietary module from the NVIDIA driver package.

The Xorg log for the Intel device shows that it loads /usr/lib64/xorg/modules/extensions/libglx.so, which is what I want.

This does not work. I get the following errors upon executing glxgears:

Code:

Xlib: extension "NV-GLX" missing on display ":0.0"
Xlib: extension "NV-GLX" missing on display ":0.0"

However, strangely enough, glxgears does start and it does render, at around 60 FPS -- I'm assuming this is entirely software-rendered output? There is also a horrible associated lag: it seems to render smoothly at a high FPS for perhaps a few fractions of a second before a much longer pause. Perhaps the FPS calculation comes out at ~60 FPS because it renders very fast in between these segments of no rendering at all? It is interesting nonetheless, as I did not get any such rendering prior to reinstalling the GLX and libGL files.

Also of interest is that I did not get these errors prior to a reboot. Immediately after reinstalling the files in question, I restarted X and ran glxgears. It executed fine, with no errors; although the framerate was only a few hundred, it was much higher than what I got after the reboot (with the associated errors), and there was no such lag as described above.

glxgears is being run on the Intel card here, not through the "optirun" command. That said, optirun still works with glxgears as before, with no errors.

Edit: I find this very peculiar. Even with Bumblebee stopped and disabled, I still get the errors about the missing NV-GLX extension. NV-GLX is part of the Nvidia GLX module, isn't it? Why do I get these errors when Nvidia is not involved?

adamk75 01-09-2012 09:43 AM

Quote:

Originally Posted by malloc (Post 4570205)
Thanks for the input.

Now I have the system configured with one instance of "libglx.so" and one "libglx.so.290.10", where the former is the original module from the 13.37 DVD and the latter is the proprietary module from the NVIDIA driver package.

The Xorg log for the Intel device shows that it loads /usr/lib64/xorg/modules/extensions/libglx.so, which is what I want.

This does not work. I get the following errors upon executing glxgears:

Code:

Xlib: extension "NV-GLX" missing on display ":0.0"
Xlib: extension "NV-GLX" missing on display ":0.0"

However, strangely enough, glxgears does start and it does render, at around 60 FPS -- I'm assuming this is entirely software-rendered output? There is also a horrible associated lag: it seems to render smoothly at a high FPS for perhaps a few fractions of a second before a much longer pause. Perhaps the FPS calculation comes out at ~60 FPS because it renders very fast in between these segments of no rendering at all? It is interesting nonetheless, as I did not get any such rendering prior to reinstalling the GLX and libGL files.

Also of interest is that I did not get these errors prior to a reboot. Immediately after reinstalling the files in question, I restarted X and ran glxgears. It executed fine, with no errors; although the framerate was only a few hundred, it was much higher than what I got after the reboot (with the associated errors), and there was no such lag as described above.

glxgears is being run on the Intel card here, not through the "optirun" command. That said, optirun still works with glxgears as before, with no errors.

Edit: I find this very peculiar. Even with Bumblebee stopped and disabled, I still get the errors about the missing NV-GLX extension. NV-GLX is part of the Nvidia GLX module, isn't it? Why do I get these errors when Nvidia is not involved?

Nvidia is clearly involved. You are likely still using the nvidia libGL.so.1 file, which expects the NV-GLX extension.
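
A quick way to check which libGL a program picks up (just a sketch; a path ending in 290.10 would indicate the NVIDIA copy):
Code:

# See which libGL.so.1 the dynamic linker resolves for glxgears
ldd $(which glxgears) | grep -i libgl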

Adam

