Slackware: This Forum is for the discussion of Slackware Linux.
Website is back down. Some off-site links WON'T work.
NEED HELP W/ATi. Make posts, please! ATi section is far out of date!
UPDATED: Now includes ATI, NVidia, VIA\S3 Savage, and Intel!!
You can also vote on this thread. Let me know how it is.
I have seen a lot of questions concerning this issue.
Since starting this thread, I have also noticed a lot of people reading it. If you have ANY experience, good or bad, trying to enable DRI on your card/chip, please post it. All experiences can be learned from.
This is also a constantly changing post. Come back and read it again. I'm adding/removing/changing it constantly to reflect new information as it becomes available.
But first, a disclaimer:
READ YOUR DOCUMENTATION
I use Slackware 13 64-bit, with multilib working. I do not know if any of the following will work in any other distribution!
Also, be aware that this may not work on your hardware.
My experience with this issue is mostly with the i810 family of chipsets, and now extensively Nvidia.
But, as a general guide, it can be applied to most circumstances.
Check the various sections for your chipset/video card.
But remember: sometimes, because of BIOS limitations or the card/chip itself, it just won't work.
Direct Rendering (DRI) is dependent on many things, among them kernel support, BIOS (both video and system), memory for the video (either onboard the card itself or shared system memory), your xorg.conf, and the driver for your card/chip itself.
There are three things that are generally needed for 3D Acceleration (DRI) to take place:
Correct drivers for your video card/chipset. In the case of Intel, most use the supplied i810 driver in X. It will work fine. Some ATI and almost ALL NVidia cards/chips need their own drivers. Go ahead and download them, but DO NOT COMPILE/INSTALL YET, unless you want to do it again later. (I prefer the compile/install-once method. You might like to do it multiple times. It's your life. Do as you wish.)
Recompile the kernel. While you're at it, use the newest from www.kernel.org or the one on the second CD, whichever you prefer.
Edit /etc/X11/xorg.conf
Those are the quickies. Now for details
Correct Drivers:
IF YOU WANT 3D ACCELERATION FOR NVIDIA/ATI,
YOU MUST USE THE PROPRIETARY DRIVERS.
The open source/kernel drivers DO NOT SUPPORT 3D
Download them. Borrow from a friend. Steal them from your roommate/brother/sister/that weird guy from down the street.
As I said, with the Intel card/chip, the drivers installed with X11 are fine (i810). ATI (if needed) and all NVidia: get them.
Edit /etc/X11/xorg.conf
Now come the variables. There are three general things that must be done (your hardware driver may not allow these; check your documentation):
Load the DRI and GLX modules. Make sure this is showing in your "Module" section:
Code:
Section "Module"
	# This loads the GLX module
	Load "glx"
	# This loads the DRI module
	Load "dri"
EndSection
*** Some higher-end cards do not want, or will not work with, the DRI module loaded. READ YOUR DOCUMENTATION ***
Memory. You must have enough. In the graphics "Device" section, find your card. Here is mine as an example:
*** Note: these settings are from MY xorg.conf. I do not know if they will work with your hardware.
Also, the "LFP,CRT" value is for my laptop and the i810 driver. You may have to change it to "CRT,CRT" for yours to work. Check the man i810 pages. ***
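(The example settings referred to above did not survive in this copy of the thread. As an illustration only, assuming the i810 driver, a "Device" section of the kind being described could look like the following; the identifier, VideoRam value, and MonitorLayout are assumptions you must adapt from man i810 and your own hardware.)

```
Section "Device"
    Identifier "Intel i810"               # illustrative name
    Driver     "i810"
    VideoRam   24576                      # 24 MB; illustrative, in KB
    Option     "MonitorLayout" "LFP,CRT"  # laptop panel + external CRT
EndSection
```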
At the end of the file, look for this, and add it if you don't have it:
Code:
Section "DRI"
Mode 0666
EndSection
You must have the Mode 0666 if you want anybody other than root to have DRI.
There may be other tweaks needed, depending on your configuration and needs.
Checking if DRI is working:
Restart your X session (restart X, reboot, whatever). When X is up, open a terminal window. Type "glxinfo". On the second or third line it should say "direct rendering: Yes". If so, it's working. Type "glxgears" and see what the frames-per-second figure is. Now disable DRI by not loading the DRI module in xorg.conf, restart X, and see what the FPS is in glxgears. It should drop radically. Edit xorg.conf again, re-enable DRI, restart X, and try glxgears again... Wow. Big difference.
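The quick check above can be scripted; a minimal sketch, assuming glxinfo is installed (it ships with the X/Mesa utilities) and you run it from a terminal inside X:

```shell
# Report whether direct rendering is active.
# grep -qi: quiet, case-insensitive, since the exact capitalization
# of "direct rendering: Yes" varies between glxinfo versions.
if glxinfo 2>/dev/null | grep -qi 'direct rendering: *yes'; then
    echo "DRI is enabled"
else
    echo "DRI is NOT enabled (or glxinfo could not reach the X server)"
fi
```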
What if DRI is not running:
Try logging in as root and trying again. Your driver may not show that it is functional if you are not root. Also, look in /var/log/Xorg.0.log and see if there are any errors (EE) in there, and what was going on just prior to the error. That ought to steer you in the right direction.
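To pull just the errors and warnings out of that log, something like this works (the path shown is the standard X log location; adjust it if yours differs):

```shell
# (EE) marks errors and (WW) marks warnings in the X server log.
log=/var/log/Xorg.0.log
if [ -r "$log" ]; then
    grep -E '\((EE|WW)\)' "$log" || echo "no (EE)/(WW) lines found"
else
    echo "no readable log at $log"
fi
```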
If this post helps, let us know. And, by the same token, if it doesn't, let us know that too.
As things change, I will edit this.
This is not an all-encompassing howto. It is just a general guide. Your experience may vary.
1. I installed Slackware 10.2 with a full install and the default 2.4 kernel.
2. I never installed the testing kernel.
3. I installed/compiled a new kernel, version 2.6.10.
4. Once I got that working, I made a backup image using PowerQuest Drive Image so I always had a place to go back to if I messed up the computer. Remember, you are going to see an error at boot saying no AGP, and that is correct, as you have no video drivers installed yet. You are going to be running in flickery 60 Hz VESA mode. Once you get the drivers in, they run as modules and load up, and your AGP video card kicks in just before X loads and gets into the KDE desktop.
5. Then I went to ATI's website. I downloaded the XORG drivers there. The filename was fglrx_6_8_0-8.19.10-1.i386.rpm.
6. In the ATI readme/release notes, it clearly says that POSIX shared memory must be enabled. I verified that the following line was in my Mepis install (I have an image of that too, so I had to go into Windows to pull the file out and verify it), but this POSIX shared memory setting was NOT in my Slackware, so I added the following to my /etc/fstab file:
tmpfs /dev/shm tmpfs defaults 0 0
Then you have to mount this tmpfs filesystem with the following command in a console:
Code:
root@pooter:~# mount /dev/shm
Then you have to double-check that the mount succeeded (again, from the readme file) with:
Code:
root@pooter:~# mount | grep "shm"
I had no errors, which ATI said was good.
7. Now to install these drivers. I had saved them to my desktop.
NOTE: I did not do any manual module loading of chipsets or AGP stuff like Shilo lists in his "this is how I do it all" post.
Then I just went through the config program and rebooted. I used the defaults that the configure program gave me. I figured I would play it the safe way.
Here are the respective items in my Xorg.conf file once modified by ATI.
That is what the default gives me. I'm sure there is some hard core tweaking that I have to do but I have not gotten to that yet.
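(The xorg.conf entries referred to here were lost from the thread. As an illustration only, the key change aticonfig makes is switching the card's "Device" section to the proprietary driver; the identifier shown is an assumption.)

```
Section "Device"
    Identifier "ATI Graphics Adapter"  # name chosen by aticonfig; yours may differ
    Driver     "fglrx"                 # the proprietary ATI driver
EndSection
```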
OK, so I reboot. I notice the no-AGP errors, as I mentioned previously, then X loads and my monitor changes frequency. I can hear it on my monitor; it makes a pop sound. I clicked the button on it to see my monitor settings, and it showed me I had 85 Hz. Very good sign. X finishes loading.
Once I'm in X I open up console.
I type in "glxinfo"
and the console returns this:
fogie@pooter:~$ glxinfo
name of display: :0.0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.2
server glx extensions:
There's a bunch of other stuff listed, but those few lines are what's of concern.
****Note from cwwilson721:
This just sent in by Old_Fogie:
HEY ALL HEY ALL YOU GOTTA HEAR THIS:
I loaded the bare.i 2.4.31 kernel on my PC in a separate partition, as that's the clean load I use to make packages. Anyway, since I noticed ATI seems to be getting better, I figured for kicks and giggles I'd see if the drivers work on the default Slack kernel.
I installed the latest ATI drivers per the front page of this thread and I got DRI working. You do have to put the 'tmpfs' line in the fstab and mount it first, like usual.
And additional info from Old_Fogie:
Quote:
Originally Posted by Old_Fogie
Hi all,
just wanted to post a note about linux ATI drivers, cwilson, you might want to add this in some way to my 'how-to'
the ATI drivers presently put four binary executables in /usr/X11R6/bin
they are:
fgl_glxgears # similar to glxgears, but I get about 1/5 of the fps in ATI's vs. the Xorg's
fglrx_xgamma # I don't know what this does
fglrxinfo # shows an output like this:
    display: :0.0 screen: 0
    OpenGL vendor string: ATI Technologies Inc.
    OpenGL renderer string: RADEON 9600 XT Generic
    OpenGL version string: 2.0.5879 (8.26.18)
and lastly
fireglcontrolpanel # lets you configure TV-out, red/green/blue output colors, gamma, etc. It seems to have much better actual performance than the color correction bundled with KDE, and is probably really useful if someone is using a window manager like Fluxbox with no GUI tool built in to adjust that stuff.
I have not found where the symbolic links or KDE menu icons (if any) are, or their corresponding .xpm files.
Any advice on that is appreciated.
Have any ATI guys found a way to adjust "contrast", not just gamma?
An addition by tronayne:
Quote:
Originally Posted by tronayne
The tube is supposed to be capable of 1600x1200 (Samsung says this is the max), 1280x1024, and so on down to butt-ugly. It's looking like the ATI driver will not support 1600x1200, but it defaults to 1792x1344, followed by 1280x1024 and so on.
And his fix:
Quote:
Originally Posted by tronayne
I figured it out.
The aticonfig utility (sort of equivalent to xorgconfig) makes a backup of the existing /etc/X11/xorg.conf file, then adds entries to xorg.conf for the ATI drivers.
What it does not do (apparently) is query the display (as xorgconfig does) and set the ModelName, HorizSync and VertRefresh values in the ATI Monitor section of xorg.conf; I believe (could be wrong) that without those variables set the driver dynamically drives the display to its maximum capability and I needed to tone it down a little.
So, I got the specifications for my display and added them to the ATI Monitor section like this:
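(The Monitor entries themselves were not preserved in the thread. As an illustration only, with made-up numbers that you must replace with your display's published specifications, the section would look like this:)

```
Section "Monitor"
    Identifier  "aticonfig-Monitor[0]"   # assumed identifier
    ModelName   "Samsung SyncMaster"     # hypothetical model name
    HorizSync   30.0 - 96.0              # kHz, from the display's spec sheet
    VertRefresh 50.0 - 160.0             # Hz, from the display's spec sheet
EndSection
```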
Thanks for the guide. I have the very same video card. I tried your configuration, but the X server did not come up. I have gotten DRI to work on the card, but I had to comment out the VideoRam section. Are you sure that it needs to be in there?
Edit... I didn't have AGP and the Intel card compiled as modules. I'm going to recompile the kernel (2.6.14.4) again.
Last edited by stormtracknole; 01-11-2006 at 08:45 AM.
Yep. It needs at least 24 MB (I do believe) for DRI to be functional at 24-bit color depth. Look at my website, www.cwwilson721.dyndns.org/slack; my complete kernel config file plus my entire xorg.conf are available there. Look through them. My i830M chip runs glxgears at 740 fps with DRI, where without it it was running at 125. And as I said, I run it at 24-bit.
The details of the VideoRam, etc., I got from the man i810 page. It has a lot of info there.
Last edited by cwwilson721; 01-13-2006 at 07:44 AM.
Hmm... for some reason, it still didn't like the VideoRam section. Once I commented that out, the X server came up. Also, the only way to achieve 3D on that card is in 16-bit, per the i810 manual. Maybe it's a difference in hardware, since I have a PC and yours is a laptop. I've always had problems with this card for some reason.
Actually, the difference is probably the difference in BIOS. What works on mine may not work on yours, even if you have a laptop.
Obviously wrong. I'm running at 24-bit, and DRI is running fine. Here's my glxinfo:
Code:
user@toaster:~$ glxinfo
name of display: :0.0
display: :0 screen: 0
direct rendering: Yes
Color depth is 24bit.
Running at 1024x768
As I said, differences in bios.
Is your DRI running?
It must be the BIOS then. However, this is from man i810:
Quote:
The driver supports hardware accelerated 3D via the Direct Rendering Infrastructure (DRI), but only in depth 16 for the i810/i815 and depths 16 and 24 for later chipsets.
I have the 810, not the 830, so I can only achieve 3D in 16-bit, again, per the manual. I have never been able to get DRI working at 24-bit on any distro. I'll fiddle around with my BIOS when I get home. Thanks again for your help. Oh, and thanks for showing me your website. I made a link to it.
cwwilson721, excellent tip! I've been looking for a good tip on how to set up XVideo. (I'm pretty new to Linux.) Your post actually made me realize that my X was using the VESA Framebuffer and not the i810 driver.
Could I just add something?
I think this section in xorg.conf needs to be changed too (change in bold):
The above are my settings. As you can see, I've commented out the "MonitorLayout" and the "DevicePresence" options. (With them in, my screen turned black, even though X made it through, giving me the introduction sound and all, and I had to do a CTRL+ALT+BACKSPACE to exit.)
What do both those options mean?
VideoRam worked for me. (I've got a 24-bit color depth, so I guess I needed it, even though I don't know what it really does.)
Once again, thanks for the tip!
(I can watch DivX movies without any problems in Xine now. A xine-check doesn't give me any complaints anymore.)
The MonitorLayout part is there because the Intel chip can drive two monitors at the same time. Change the 'LFP,CRT' to 'CRT,CRT'; LFP means 'Local Flat Panel' (I have a laptop... lol). Changing it to 'CRT,CRT' should work for you. If it's an LCD panel, look at man i810; it should say what to use.
The DevicePresence option is to check IF you have another monitor on the second port. You can comment it out if you want. I do occasionally hook up a CRT or another monitor on my external port, so I leave it in.
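For reference, the two options discussed above live in the i810 "Device" section of xorg.conf; a minimal sketch, with illustrative values (see man i810 for the exact syntax your driver version accepts):

```
Option "MonitorLayout"  "CRT,CRT"   # was "LFP,CRT" on the laptop
Option "DevicePresence" "true"      # probe for a monitor on the second port
```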
Sorry I did not post that. Look at my website, www.cwwilson721.dyndns.org/slack, and you can download my entire xorg.conf and the config file for my kernel build.
Could be because you are asking for more RAM than the video BIOS, or the system BIOS, will allow you to have. You do/can run into BIOS issues with DRI. Sometimes, you just can't do it.
Last edited by cwwilson721; 01-11-2006 at 11:16 AM.
Oh, there's a guide for this now.
I had to search like an idiot on Google just to find bits of info scattered across many websites.
I found a script that does the job for me, but it's only for Nvidia cards.
If anyone wants it, I'll be happy to host the file as soon as I'm finished configuring my computer.
I made the guide to help others who seem to have a problem with it. I also made it as general as possible so it may be able to help as many people as possible without getting into specific cards/BIOS/hardware (a LOT of pitfalls there).
But go ahead and give us the script. The more, the merrier.