Linux - Hardware
This forum is for Hardware issues. Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?
I'm looking to get a decent graphics card capable of running mid-range games, such as Starcraft 2 or Minecraft set at the "Far" render distance. I am planning on an AMD X6 1090T processor, an Asus AM3+ mobo, a 500+ watt PSU, 4GB RAM, and a 1TB HDD with WinXP and Fedora 15 dual-booted. I have been considering options such as the Radeon HD 6770, but I have heard that the drivers generally cause many problems and that ATI is to be avoided.
Also, I've considered an nVidia GeForce GTX 460, but I've heard of recent problems with them as well, with stuttering. What to do? And would these be powerful enough? I am lost when it comes to GPUs.
I would stick to nVidia cards. The drivers are much better in Linux than ATI's. I have a GTX 460 and I have experienced stuttering, but it's caused by the PowerMizer setting being set to "Adaptive". If you set it to "Prefer Maximum Performance", the stuttering goes away. Once you make that simple change the card works flawlessly.
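For reference, a sketch of how that setting can be flipped from a terminal, assuming the proprietary NVIDIA driver and its nvidia-settings tool are installed (the attribute name can vary between driver versions):

```shell
# Force PowerMizer to "Prefer Maximum Performance" on the first GPU.
# Mode 1 = prefer maximum performance; the default "Adaptive" mode is 0.
nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=1"
```

The setting does not survive a reboot on its own, so people typically add that line to their desktop session's autostart.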
500 watts won't be enough with that CPU and a decent GPU.
My 1090T + Nvidia GTS 450 draw just shy of 230 watts each. That would leave no headroom for other devices. An Nvidia GTX 460 draws even more power.
Look into a high quality 650+ watt PSU. Make sure the rails supply enough current to power both the CPU and the GPU. Some PSUs may claim 800 watts, but actually become somewhat useless with only 18-20 amps on each 12 V rail.
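As a sanity check on those rail numbers: the usable wattage of a rail is just amps times volts, so a 12 V rail rated for 18 A can deliver at most about 216 W regardless of the label on the box. A quick sketch:

```shell
# Usable power per rail: watts = amps x volts
amps=18
volts=12
rail_watts=$((amps * volts))
echo "An ${amps} A 12 V rail delivers at most ${rail_watts} W"
```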
As far as video cards go, ATI/AMD and nVidia are always watching what the other company does. So for any given price range, you tend to get about the same performance from nVidia and ATI/AMD.
I haven't had any serious issues with ATI/AMD cards on Linux for ages, and IMO they are actually better than nVidia cards for Linux (ATI/AMD supports open source, nVidia doesn't). But there is always a chorus of 'don't use ATI/AMD cards' whenever this question pops up. In my experience, even though a lot of the people posting 'get nVidia, don't get ATI/AMD' don't have current experience with ATI/AMD cards, the general consensus of 'get nVidia' is followed. Odd, but I can see why it happens...
As far as nVidia cards with a sane price go, you've only got a few choices- GT430, GTS450, GTX550, GTX460 (various models).
The GTX460 is the fastest of them, the GTX550 is a bit slower and a bit cheaper, the GTS450 is a bit slower and cheaper again... the GT430 is probably too slow for Starcraft 2 to play well (depending on resolution).
If you can get one for a decent price and in a 'good' model, GTX460 would be the pick of them. It will use a fair bit more power and create more heat than the GTS450 though.
Quote:
Originally Posted by disturbed1
500 watts won't be enough with that CPU and a decent GPU.
My 1090T + Nvidia GTS 450 draw just shy of 230 watts each. That would leave no headroom for other devices. An Nvidia GTX 460 draws even more power.
Look into a high quality 650+ watt PSU. Make sure the rails supply enough current to power both the CPU and the GPU. Some PSUs may claim 800 watts, but actually become somewhat useless with only 18-20 amps on each 12 V rail.
You are not going to need 650+ watts to run a GTS450 + 1090T. It's been a while since I saw the Tom's Hardware 'power consumption policy' (or whatever they called it) and finding it now isn't my idea of fun, so I can't 'prove' this, but IIRC they use a method that is at best flawed.
Have a look at xbitlabs for power consumption tests; they make a lot more sense. Here's the basic outline of how they do it (at the beginning of an article that IMO is also on the subject):
Quote:
Except for the maximum load simulation with OCCT, we measured power consumption in each mode for 60 seconds. We limit the run time of OCCT: GPU to 10 seconds to avoid overloading the graphics card's power circuitry.
Problems with AMD drivers -
I'm not touching this, as there is more than enough information out there. Even the AMD/ATI-biased Phoronix details many of the shortcomings AMD/ATI drivers have. Though, if you do want to play some games and want to use an open source driver, AMD/ATI is the only choice. Nvidia just has better drivers. IMO AMD/ATI makes better silicon.
Lastly, would people please stop saying that AMD/ATI open sourced their driver, commits code to the open source driver, or does anything at all with the open source driver!!!!! The only thing AMD/ATI did was give open source developers some of the white papers on some of their GPUs for some of the features.
BTW, Nvidia wrote and committed the NV open source driver. They also actively patch Xorg. Hell, they just fixed a long-standing Xrandr bug. Nvidia supports open source by and large. Nvidia is also a huge proponent of Android; after all, the cream of the Android tablet crop runs on Nvidia Tegra 2 silicon.
Using a wall power meter does not allow for inefficiencies in the power supply.
BTW, all your links show 'system power consumption', which is not 'CPU power consumption'. (The xbitlabs link does as well, but it's not from a wall meter.) When you consider the different idle power used by motherboards, RAM, HDDs, optical drives and GPUs, and possibly different wall power meters, the differences are probably explainable.
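To make the wall-meter point concrete: a meter reads the AC power drawn from the outlet, while the components only receive the PSU's DC output, so the reading has to be scaled by the supply's efficiency before comparing it to component ratings. A sketch using an assumed (hypothetical) 82% efficient unit:

```shell
# DC power actually delivered = wall reading x PSU efficiency
wall_watts=300
efficiency=0.82   # assumed figure; real units vary roughly 70-90%
dc_watts=$(awk -v w="$wall_watts" -v e="$efficiency" 'BEGIN { printf "%.0f", w * e }')
echo "${wall_watts} W at the wall is only about ${dc_watts} W delivered to the components"
```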
Quote:
Originally Posted by disturbed1
Problems with AMD drivers -
I'm not touching this. As there is more than enough information out there. Even the AMD/ATI biased phoronix details many of the short comings AMD/ATI drivers have. Though, if you do want to play some games, and want to use an open source driver, AMD/ATI is the only choice. Nvidia just has better drivers. IMO AMD/ATI makes better silicon
'I'm not touching this', followed by some touching?
I'm not saying the ATI/AMD drivers are perfect, or the nVidia drivers are horrible.
You can play games with the nouveau driver if you've added the 3D support (though I'd guess there's got to be more than a few games that will play just fine without 3D support).
What I AM saying is that most of the people who do make comments on the ATI/AMD drivers, open and closed versions, sometimes have very little to base that idea on. Lots of times it's just repeating what they have heard or seen without evidence, or based on experiences from 2009...
Quote:
Originally Posted by disturbed1
Lastly, would people please stop saying that AMD/ATI open sourced their driver, commits code to the open source driver, or does anything at all with the open source driver!!!!! The only thing AMD/ATI did was give open source developers some of the white papers on some of their GPUs for some of the features.
But ATI/AMD has committed code to the open source driver, and has at least one person working on the open source drivers (and they were looking for more)-
AMD/ATI didn't just release some of the white papers... which is more than nVidia does/did anyway.
So, would people please check what they are saying, especially if they are going to use 5 exclamation marks?
Quote:
Originally Posted by disturbed1
BTW, Nvidia wrote and committed the NV open source driver. They also actively patch Xorg. Hell they just fixed a long standing Xrandr bug. Nvidia supports open source by and large. Nvidia is also a huge proponent of Android, after all, the cream of Android tablet crop runs on Nvidia tegra 2 silicon.
nVidia cut the NV driver-
Quote:
NVIDIA's open-source Linux efforts as it concerns their GPU support have historically been minimal. The xf86-video-nv driver has been around that provides very basic 2D acceleration and a crippled set of features besides that (no proper RandR 1.2/1.3, KMS, power management, etc) while the code has also been obfuscated to try to protect their intellectual property. However, NVIDIA has decided to deprecate this open-source driver of theirs. No, NVIDIA is not working on a new driver. No, NVIDIA is not going to support the Nouveau project. Instead, NVIDIA now just recommends its users use the X.Org VESA driver to get to NVIDIA.com when installing Linux so they can install their proprietary driver.
They provide no support for nouveau; all the work being done there is without nVidia helping at all (so it's mostly from reverse engineering).
They can patch Xorg as much as they want; if the community is left to develop the drivers without any support, that's not really 'supporting open source' IMO. You might have a different opinion.
Quote:
Originally Posted by disturbed1
Lastly would people please stop saying that AMD/ATI open sourced their driver, commits code to the open source driver, or does anything at all with the open source driver!!!!! The only thing AMD/ATI did was give open source developers some of the white papers on some of their GPUs for some of the features.
They do pay developers outside of AMD to work on the open source drivers (starting with developers at Novell to work on xf86-video-radeonhd). They have also had their own developers work on the open source driver. Michel Dänzer, Christian König and Alex Deucher are all employed by AMD to work on the open source driver stack.
Quote:
BTW, Nvidia wrote and committed the NV open source driver.
The obfuscated NV driver. Which they've also stopped developing, and which never supported 3D acceleration.
As for problems with the nvidia drivers, let's not forget that they do have their fair share of issues:
This forum is filled with people with problems: tearing, computer lockups, X crashes, poor opengl performance, more computer lockups, resume/suspend issues, HDMI audio problems... Well, the list goes on.
Quote:
Originally Posted by disturbed1
My 1090T + Nvidia GTS 450 draw just shy of 230 watts each.
That is more or less the power they draw together, not each. AMD rates the 1090T at 125W maximum; NVidia rates the GTS450 at 109W maximum (they recommend at least a 400W PSU).
So if you have a good 500W PSU you should be fine. I have been running my 1055T (125W max) and a GTX260 (182W max according to NVidia; I would think more, because it is an overclocked version) on a cheap (and old) 650W PSU for quite some time now.
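A back-of-the-envelope check of that claim, using the vendors' maximum ratings quoted above (real draw under a gaming load is usually lower than the sum of the maximums):

```shell
# Worst-case combined draw of the parts discussed above
cpu_tdp=125   # AMD 1090T maximum rating (W)
gpu_tdp=109   # NVIDIA GTS 450 maximum rating (W)
psu=500
total=$((cpu_tdp + gpu_tdp))
headroom=$((psu - total))
echo "CPU+GPU worst case: ${total} W, leaving ${headroom} W of a ${psu} W PSU for everything else"
```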
Woah... This is a lot to take in. So from the general standpoint, nVidia gtx 460 is the way to go?
The GTX460 is a good card, but whether it's 'the way to go' depends on pricing and what exactly you expect from a video card.
The GTS450 should be capable of playing the games you mentioned, though it's possible that at high resolutions Starcraft 2 might have a framerate slow enough to be 'bad' (less than 30 FPS).
Quote:
Originally Posted by bplis
Where do I get nVidia drivers from?
From your distro's repos, or from nvidia.com.
I'd suggest using the drivers in the repo, unless they cause problems.
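For example, on the Fedora 15 install mentioned earlier, the packaged nVidia driver comes from the third-party RPM Fusion repositories rather than Fedora itself. A sketch, assuming the RPM Fusion free and nonfree repos are already enabled:

```shell
# akmod-nvidia rebuilds the kernel module automatically on kernel updates
su -c 'yum install akmod-nvidia'
```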
Hey guys, I'm looking at prices here, and realized that the Radeon HD 6770 is about $50 cheaper. Will the card work well in the end? I'm on a tight budget and money is a big issue, but I don't want to spend money on something that doesn't work. Do they have similar speeds? If so I'll go with the HD 6770, as long as it will work.
It can also be noted that you do not need a hexacore processor for gaming; it will not increase performance at all unless the game was designed to utilize all 6 cores (SC2 and Minecraft DO NOT).
Go with the X4 955 BE. You'll save a little money, and it will be better suited for gaming as well as general computing.
The AMD Bulldozer processors should be announced in October, so wait a couple weeks for the current processor PRICE DROPS!
I don't just do gaming; I do a lot of compiling from source, am starting with programming, download many files, and like to do all of this simultaneously. I will use all of the processing cores, believe me. How much is Bulldozer expected to cost? I may just wait and go with that...