LinuxQuestions.org (/questions/)
-   Slackware (https://www.linuxquestions.org/questions/slackware-14/)
-   -   A Guide: Enabling 3D Acceleration in X11 (https://www.linuxquestions.org/questions/slackware-14/a-guide-enabling-3d-acceleration-in-x11-402003/)

cwwilson721 01-11-2006 07:55 AM

Get your 3D acceleration working UPDATED 11/30/09
 
Website is back down. Some off-site links WON'T work.

NEED HELP W/ATi. Make posts, please! ATi section is far out of date!


UPDATED: Now includes ATI, NVidia, VIA/S3 Savage, and Intel!!


You can also vote on this thread. Let me know how it is.

I have seen A LOT of questions concerning this issue.

After starting this thread, I have also noticed a lot of people reading it. If you have ANY experience, good or bad, trying to enable DRI on your card/chip, please post it. All experiences can be learned from.

This is also a constantly changing post. Come back and read it again. I'm adding/removing/changing the post constantly, to try to reflect new information as it becomes available.

But first, a disclaimer:

READ YOUR DOCUMENTATION

I use Slackware 13 64-bit with multilib working. I do not know whether any of the following will work in any other distribution!
Also, be aware that this may not work on your hardware.

My experience with this issue is mostly with the i810 family of chipsets, and now extensively Nvidia.

But, as a general guide, it can be applied to most circumstances.

Check the various sections for your chipset/video card

But remember: sometimes, because of BIOS limitations or the card/chip itself, it just won't work.

Links to external sites for various related topics:

This website: www.freedesktop.org/wiki/FrontPage has a ton of info on cards and standards.

***********************************************************

In General:

READ YOUR DOCUMENTATION


Direct Rendering (DRI) is dependent on many things, among them kernel support, BIOS (both video and system), memory for the video (either onboard the card itself or shared system memory), your xorg.conf, and the driver for your card/chip itself.

There are three things that are generally needed for 3D Acceleration (DRI) to take place:
  1. Correct drivers for your video card/chipset. In the case of Intel, most use the i810 driver supplied with X; it works fine. Some ATI and almost ALL NVidia cards/chips need their own drivers. Go ahead and download them, but DO NOT COMPILE/INSTALL YET, unless you want to do it again later. (I prefer the compile/install-once method. You might like to do it multiple times. It's your life. Do as you wish.)
  2. Recompile the kernel. While you're at it, use the newest from www.kernel.org or the one on the second CD, whichever you want.
  3. Edit /etc/X11/xorg.conf
Those are the quickies. Now for the details.

Correct Drivers:

IF YOU WANT 3D ACCELERATION FOR NVIDIA/ATI,
YOU MUST USE THE PROPRIETARY DRIVERS.

The open source/kernel drivers DO NOT SUPPORT 3D

Download them. Borrow from a friend. Steal them from your roommate/brother/sister/that weird guy from down the street.
As I said, with the Intel card/chip, the ones installed with X11 are fine (i810). For ATI (if needed) and all NVidia, get them.

Edit /etc/X11/xorg.conf

Now come the variables. There are three general things that must be done (your hardware driver may not allow all of these; check your documentation):
  1. Load the DRI driver. Make sure this is showing:
    Code:

        # This loads the GLX module
        Load      "glx"
        # This loads the DRI module
        Load      "dri"

    *** Some higher-end cards do not want/will not work with the DRI module loaded. READ YOUR DOCUMENTATION
  2. Memory. You must have enough. In the graphics "Device" section, find your card. Here is mine as an example:
    Code:

          Section "Device"
        Identifier  "Intel 810"
        Driver      "i810"
        VideoRam    65536
        Option    "XVideo"        "On"
        Option    "MonitorLayout"        "LFP,CRT"
        Option    "DevicePresence"    "On"   
        # Insert Clocks lines here if appropriate
    EndSection

    And:
    Code:

    Section "Screen"
        Identifier  "LCD"
        Device      "Intel 810"
        Monitor    "My Monitor"
        DefaultDepth 24

        Subsection "Display"
            Depth      8
            Modes      "1024x768" "800x600"
            ViewPort    0 0
        EndSubsection
        Subsection "Display"
            Depth      16
            Modes      "1024x768" "800x600"
            ViewPort    0 0
        EndSubsection
        Subsection "Display"
            Depth      24
            Modes      "1024x768" "800x600"
            ViewPort    0 0
        EndSubsection
    EndSection

    *** Note: these are from MY xorg.conf. I do not know if they will work with your hardware.
    Also, the "LFP,CRT" is for my laptop, and the i810 driver. You may have to change it to "CRT,CRT" for yours to work. Check the man i810 pages.
    ***

  3. At the end of the file, look for this, and add it if you don't have it:
    Code:

          Section "DRI"
        Mode 0666
     EndSection

    You must have Mode 0666 if you want anybody other than root to have DRI.
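
What Mode 0666 actually does (as I understand it; the device node name may differ on your setup) is open the DRI device up to all users:
Code:

# Mode 0666 = read/write for the owner, the group, AND everyone else on the
# DRI device node (typically /dev/dri/card0), so normal users get direct
# rendering instead of just root. After restarting X you can check it with:
user@toaster:~$ ls -l /dev/dri/card0
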
There may be other tweaks needed, depending on your configuration and needs.

Checking if DRI is working:

Restart your X session (restart it, reboot, whatever). When X is up, open a terminal window. Type "glxinfo". On the second or third line it should say "direct rendering: Yes". If so, it's working. Type "glxgears" and see what the frames per second is. Now disable DRI by not loading the DRI module in xorg.conf, restart X, and see what the FPS is in glxgears; the drop should be dramatic. Edit xorg.conf again, re-enable DRI, restart X, and try glxgears again... Wow. Big difference.
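
In a terminal, the whole check boils down to something like this (your output and numbers will differ):
Code:

user@toaster:~$ glxinfo | grep "direct rendering"
direct rendering: Yes
user@toaster:~$ glxgears
# let it run for a few seconds; it prints the frames per second to the terminal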

What if DRI is not running:

Try logging in as root and trying again. Your driver may not show that it is functional if you are not root. Also, look in /var/log/Xorg.0.log and see if there are any errors (EE) in there, and what was going on just prior to the error. That ought to steer you in the correct direction.
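
A quick way to pull just the errors out of that log ((EE) is how Xorg marks errors, (WW) marks warnings):
Code:

user@toaster:~$ grep "(EE)" /var/log/Xorg.0.log
user@toaster:~$ grep "(WW)" /var/log/Xorg.0.log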

If this post helps, let us know. And, by the same token, if it doesn't, let us know.
Let me know.
As things change, I will edit this.

This is not an all-encompassing howto. It is just a general guide. Your experience may vary.

If you would like a copy of my xorg.conf, or my kernel config, they can be found here: www.cwwilson721.dyndns.org
The individual files are: xorg.conf and config

*************************************

NVidia:

READ YOUR DOCUMENTATION

Here's how to get 3D working with Nvidia:
  • Go to Nvidia.com, and download the appropriate driver for your card/chipset
  • Make sure you have kernel sources installed
  • Make sure X is NOT RUNNING
  • Log in as root
  • Run the install script. (Make sure it is executable first)
  • Answer "Yes" to everything. It will say a warning/error about "No kernel module found", but it will compile one.
  • If on a 64 bit system, LET IT INSTALL THE 32bit DRIVERS.
  • Done.
Really.

It's that easy.
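
Pulling those steps together, a typical run looks something like this (just a sketch; the filename is a placeholder, use whatever you downloaded from Nvidia.com):
Code:

root@toaster:~# telinit 3                       # make sure X is NOT running
root@toaster:~# chmod +x NVIDIA-Linux-x86-*.run # placeholder name; use your download
root@toaster:~# ./NVIDIA-Linux-x86-*.run        # answer "Yes" to everything
root@toaster:~# telinit 4                       # back to the graphical login (or just startx)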

If you don't want the Nvidia Splash screen when X starts, raska has this tidbit:
Quote:

Originally Posted by raska
from /usr/doc/NVIDIA_GLX-1.0/README.txt
Code:

Option "NoLogo" "boolean"

Disable drawing of the NVIDIA logo splash screen at X startup. Default:
the logo is drawn.


Just add the line to the /etc/X11/xorg.conf file, in the Device section. Right after the Driver "nvidia" line should work.
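
So a Device section with the splash screen turned off would look something like this (a sketch; keep whatever other options you already have in yours):
Code:

Section "Device"
    Identifier  "NVIDIA Card"
    Driver      "nvidia"
    Option      "NoLogo"  "true"
EndSection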


ATI:

READ YOUR DOCUMENTATION

I need more input here. What is the simplest, easiest way to get the ATi drivers installed? Most of the info here is old/outdated

The following is for Slackware 10.2 only. (Per Old_Fogie)

This was sent to me by Old_Fogie (Had to edit for length...Sorry dude. Read the complete post in this thread)

Quote:

Originally Posted by Old_Fogie
First off:

1. I installed slackware 10.2 with full install and the default kernel 2.4.
2. I never installed the testing kernel.
3. I installed/compiled a new kernel which is version 2.6.10

*********Note from cwwilson721: Get the complete kernel .config output here************

4. Once I got that working, I made a backup image using PowerQuest Drive Image so I always had a place to go back to if I messed up the computer. Remember, you are going to see an error at boot saying no AGP, and that is correct, as you have no video drivers installed. You are going to be running in flickery 60 Hz VESA mode. Once you get the drivers in, they run as modules and load up, and your AGP video card kicks in just before X loads and gets into the KDE desktop.

5. Then I went to ATI's website. I downloaded the XORG drivers there. The filename was fglrx_6_8_0-8.19.10-1.i386.rpm.


6. In the ATI readme/release notes, it clearly says that POSIX shared memory must be enabled. I verified that the following line was in my Mepis install (I have an image of that too, so I had to go into Windows to pull out that file and verify it), but this POSIX shared memory setting was NOT in my Slackware, so I added the following to my /etc/fstab file.

tmpfs /dev/shm tmpfs defaults 0 0

Then you have to mount this "tmpfs" filesystem with the following line in a console:

Code:

root@pooter:~# mount /dev/shm
Then you have to double-check that the mount was correct (again, from the readme file) with:

Code:

root@pooter:~# mount | grep "shm"
I had no errors which ATI said was good.

7. Now on to installing these drivers. I had saved them to my desktop.

I opened console, and switched to root.

Code:

root@pooter:/home/fogie/Desktop# rpm2tgz fglrx_6_8_0-8.19.10-1.i386.rpm
That line converts the RPM to a usable .tgz file for Slackers.

then,

****Note from cwwilson721: get the full text of the install here)**********

NOTE: I did not do any manual module loading of chipsets or AGP stuff like Shilo lists in his "this is how I do it all" post.

Then I just went through the config program and rebooted. I used the defaults that the configure program gave me. I figured I would play it the safe way.

Here are the respective items in my Xorg.conf file once modified by ATI.

*****Note from cwwilson721:(Get the xorg.conf file here)

That is what the default gives me. I'm sure there is some hard core tweaking that I have to do but I have not gotten to that yet.

OK, so I reboot. I notice the no-AGP errors as I mentioned previously, then X loads and my monitor changes frequency. I can hear it on my monitor; it makes a pop sound. I clicked the button on it to see my monitor settings and it showed me I had 85 Hz. Very good sign. X finishes loading.

Once I'm in X I open up console.

I type in "glxinfo"

and the console returns this:

fogie@pooter:~$ glxinfo
name of display: :0.0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.2
server glx extensions:

There's a bunch of other stuff listed, but those few lines are what's of concern.

****Note from cwwilson721:
This just sent in by Old_Fogie:

HEY ALL HEY ALL YOU GOTTA HEAR THIS:

I loaded the bare.i 2.4.31 kernel on my PC in a separate partition, as that's the clean load I use to make packages. Anyway, since I noticed ATI seems to be getting better, I figured for kicks and giggles I'd see if the drivers work on the default Slack kernel.

I installed the latest ATI drivers per the front page of this thread, and I got DRI working. You do have to put the 'tmpfs' line in fstab and mount it first, as usual.

And additional info from Old_Fogie:
Quote:

Originally Posted by Old_Fogie
Hi all,

Just wanted to post a note about Linux ATI drivers. cwwilson, you might want to add this in some way to my 'how-to'.

The ATI drivers presently put (4) binary executables in /usr/X11R6/bin

they are:

fgl_glxgears # similar to glxgears, but I get about 1/5 of the fps in ATI's vs. the Xorg's

fglrx_xgamma # I don't know what this does

fglrxinfo # shows an output like this:

    display: :0.0  screen: 0
    OpenGL vendor string: ATI Technologies Inc.
    OpenGL renderer string: RADEON 9600 XT Generic
    OpenGL version string: 2.0.5879 (8.26.18)

and lastly

fireglcontrolpanel # lets you configure TV-out, red/blue/green output colors, gamma, etc. It seems to have much better actual performance than the color correction bundled with KDE, and is probably really useful if someone is using a window manager like Fluxbox with no GUI tool built in to adjust that stuff.

I have not found where the symbolic links or KDE menu icons (if any) are, or their corresponding .xpm files.

Any advice on that is appreciated.

Any ATI guys found a way to adjust "contrast" ? not just gamma?

An addition by tronayne:
Quote:

Originally Posted by tronayne
The tube is supposed to be capable of 1600x1200 (Samsung says this is the max), 1280x1024, and so on down to butt-ugly. It's looking like the ATI driver will not support 1600x1200, but it defaults to 1792x1344, followed by 1280x1024 and so on.

And his fix:
Quote:

Originally Posted by tronayne

I figured it out.

The aticonfig utility (sort of equivalent to xorgconfig) makes a back up of the existing /etc/X11/xorg.conf file then adds entries to xorg.conf for the ATI drivers.

What it does not do (apparently) is query the display (as xorgconfig does) and set the ModelName, HorizSync and VertRefresh values in the ATI Monitor section of xorg.conf; I believe (could be wrong) that without those variables set the driver dynamically drives the display to its maximum capability and I needed to tone it down a little.

So, I got the specifications for my display and added them to the ATI Monitor section like this:

Section "Monitor"
Identifier "aticonfig-Monitor[0]"
Option "VendorName" "ATI Proprietary Driver"
#Option "ModelName" "Generic Autodetecting Monitor"
Option "ModelName" "SyncMaster"
HorizSync 30.0 - 85.0
VertRefresh 50.0 - 160.0
Option "DPMS" "true"
EndSection


and, wonder of wonders, I now have a vast array of Modes to select from (in KDE's Control Center) -- including the setting I really want, 1600x1200.
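
If you need to regenerate those entries, the fglrx package ships the aticonfig utility tronayne mentions. Something like this should do it, but check its --help first, since ATI isn't my strong suit:
Code:

root@pooter:~# aticonfig --initial   # backs up xorg.conf and writes the fglrx entries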

************************************
S3/Savage/Via

Kernel drivers are all you have. Run "xorgsetup", and you're done.
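
For reference, that just means running it as root from a console with X stopped (a quick sketch):
Code:

root@darkstar:~# telinit 3   # drop to a console; X must not be running
root@darkstar:~# xorgsetup   # answer the prompts, then restart X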

************************************

Sorry about the editing and putting some of the text off-site, but this post is getting LONG....lol

Thanks again to all who contributed

stormtracknole 01-11-2006 08:42 AM

Thanks for the guide. I have the very same video card. I tried your configuration, but the X server did not come up. I have gotten DRI to work on the card, but I had to comment out the VideoRam section. Are you sure that it needs to be commented out?

Edit... I didn't have AGP and the Intel card compiled as modules. I'm going to re-compile the kernel (2.6.14.4) again.

cwwilson721 01-11-2006 08:49 AM

Yep. It needs at least 24 MB (I do believe) for DRI to be functional at 24-bit color depth. Look at my website, www.cwwilson721.dyndns.org/slack; my complete kernel config file plus my entire xorg.conf file are available there. Look through them. My i830m chip is running glxgears at 740 fps, where without DRI it was running at 125. And as I said, I run it at 24-bit.

The details of the VideoRam, etc., I got from the i810 man page. It has a lot of info there.
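
Rough math on why, so the numbers make sense (my back-of-the-envelope estimate, not gospel):
Code:

# VideoRam is given in KB:  65536 KB = 64 MB
# One 1024x768 screen at 24-bit depth is stored as 32 bits (4 bytes) per pixel:
#   1024 x 768 x 4 bytes ~= 3 MB per buffer
# DRI wants front, back and depth buffers plus room for textures,
# so 64 MB leaves comfortable headroom; with much less, DRI may refuse to start.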

stormtracknole 01-11-2006 09:34 AM

Quote:

Originally Posted by cwwilson721
Yep. It needs at least 24 MB (I do believe) for DRI to be functional at 24-bit color depth. Look at my website, www.cwwilson721.dyndns.org; my complete kernel config file plus my entire xorg.conf file are available there. Look through them. My i830m chip is running glxgears at 740 fps, where without DRI it was running at 125. And as I said, I run it at 24-bit.

The details of the VideoRam, etc., I got from the i810 man page. It has a lot of info there.

Hmm... for some reason, it still didn't like the VideoRam section. Once I comment that out, the X server comes up. Also, the only way to achieve 3D on that card is at 16-bit, per the i810 manual. Maybe it's a hardware difference, since I have a PC and yours is a laptop. I've always had problems with this card for some reason. :confused:

cwwilson721 01-11-2006 09:42 AM

Quote:

Maybe it's a hardware difference, since I have a PC and yours is a laptop
Actually, the difference is probably the BIOS. What works on mine may not work on yours, even if you have a laptop.
Quote:

Also, the only way to achieve 3D on that card is at 16-bit, per the i810 manual
Obviously wrong. I'm running at 24-bit, and DRI is running fine. Here's my glxinfo:
Code:

user@toaster:~$ glxinfo
name of display: :0.0
display: :0  screen: 0
direct rendering: Yes

Color depth is 24-bit.
Running at 1024x768.
As I said, differences in BIOS.

Is your DRI running?

stormtracknole 01-11-2006 10:01 AM

Quote:

Originally Posted by cwwilson721
Actually, the difference is probably the BIOS. What works on mine may not work on yours, even if you have a laptop.

Obviously wrong. I'm running at 24-bit, and DRI is running fine. Here's my glxinfo:
Code:

user@toaster:~$ glxinfo
name of display: :0.0
display: :0  screen: 0
direct rendering: Yes

Color depth is 24-bit.
Running at 1024x768.
As I said, differences in BIOS.

Is your DRI running?

It must be the BIOS then. However, this is from man i810:

Quote:

The driver supports hardware accelerated 3D via the Direct Rendering Infrastructure (DRI), but only in depth 16 for the i810/i815 and depths 16 and 24 for later chipsets.
I have the 810, not the 830, so I can only achieve 3D in 16-bit, again per the manual. I have never been able to get DRI working at 24-bit on any distro. I'll fiddle around with my BIOS when I get home. Thanks again for your help. Oh, and thanks for showing me your website. I made a link to it. :D

cwwilson721 01-11-2006 10:05 AM

Could be the i810 vs the i830. But believe me, it's running 24-bit and DRI works. I never knew you couldn't do it, so I went ahead and did it...lol

Quote:

Oh, and thanks for showing me your website. I made a link to it.
Thanks.

stormtracknole 01-11-2006 10:11 AM

Quote:

Originally Posted by cwwilson721
Could be the i810 vs the i830. But believe me, it's running 24-bit and DRI works. I never knew you couldn't do it, so I went ahead and did it...lol

Oh, I trust you. My BIOS is pretty old. I'll play around with it some more when I get home. Thanks again for your help.

berxwedan 01-11-2006 10:53 AM

Hey guys,

cwwilson721, excellent tip! I've been looking for a good tip on how to set up XVideo. (I'm pretty new to Linux.) Your post actually made me realize that my X was using the VESA Framebuffer and not the i810 driver.

Could I just add something?

I think this section in xorg.conf needs to be changed too (the Device line is what changed):

Code:

Section "Screen"
    Identifier  "Screen 1"
    Device      "Intel 810"
    Monitor    "My Monitor"
[...]

To correspond to this device (the first Device section below is the one it should point to):

Code:

Section "Device"
  Identifier  "Intel 810"
  Driver      "i810"
  VideoRam    65536
  Option    "XVideo"        "On"
  #Option    "MonitorLayout"        "LFP,CRT"
  #Option    "DevicePresence"    "On"   
  # Insert Clocks lines here if appropriate
EndSection


Section "Device"
    Identifier  "VESA Framebuffer"
    Driver      "vesa"
    #VideoRam    4096
    # Insert Clocks lines here if appropriate
EndSection

The above are my settings. As you can see, I've commented out the "MonitorLayout" and "DevicePresence" options. With them enabled, my screen turned black (even though X made it through, giving me the introduction sound and all), and I had to do a CTRL+ALT+BACKSPACE to exit.

What do both those options mean?

VideoRam worked for me. (I've got a 24-bit color depth, so I guess I needed it, even though I don't know what it really does.)

Once again, thanks for the tip!

(I can watch DivX movies without any problems in Xine now. A xine-check doesn't give me any complaints anymore.)

cwwilson721 01-11-2006 11:06 AM

The MonitorLayout part is there because the Intel chip can drive two monitors at the same time. Change the 'LFP,CRT' to 'CRT,CRT'. The LFP means 'Local Flat Panel' (I have a laptop...lol). Changing it to CRT,CRT should work for you. If it's an LCD panel, look at man i810; it should say what to use.

The DevicePresence option is to check IF you have another monitor on the second port. You can comment it out if you want. I do occasionally hook up to a CRT or another monitor on my external port, so I leave it in.
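
In other words, for a desktop with a single CRT, the Device section would look something like this (a sketch; check man i810 for the exact tokens your chip wants):
Code:

Section "Device"
    Identifier  "Intel 810"
    Driver      "i810"
    VideoRam    65536
    Option      "XVideo"          "On"
    Option      "MonitorLayout"   "CRT,CRT"
    Option      "DevicePresence"  "On"
EndSection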

In reference to the screen section, here is mine
Code:

Section "Screen"
    Identifier  "LCD"
    Device      "Intel 810"
    Monitor    "My Monitor"
    DefaultDepth 24

Sorry I did not post that. Look at my website, www.cwwilson721.dyndns.org/slack , and you can download my entire xorg.conf and config file for my kernel build.

Here is the link for the config file: http://www.cwwilson721.dyndns.org/slack/config-2.6.13h
And for the xorg.conf: http://www.cwwilson721.dyndns.org/slack/xorg.conf

Thanks for the questions/comments

stormtracknole 01-11-2006 11:08 AM

cwwilson,

Do you have any idea why the VideoRam line prevents the X server from coming up unless I comment it out?

cwwilson721 01-11-2006 11:15 AM

Could be because you are asking for more RAM than the video BIOS, or system BIOS, will allow you to have. You can run into BIOS issues with DRI. Sometimes you just can't do it.

cwwilson721 01-11-2006 11:19 AM

stormtracknole-

Check /var/log/Xorg.0.log and see if that gives you a clue.

Synt4x_3rr0r 01-11-2006 11:38 AM

Oh, there's a guide for this now :D
I had to search like an idiot on Google just to find bits of info on many websites.
Found a script that does the job for me, but it's only for nvidia cards.
If anyone wants it, I'll be happy to host the file as soon as I'm finished configuring my computer.

cwwilson721 01-11-2006 11:48 AM

I made the guide to help others who seem to have a problem with it. I also made it as general as possible so it may be able to help as many people as possible without getting into specific cards/BIOS/hardware (A LOT of pitfalls there).

But, go ahead and give us the script. The more, the merrier

