Old 09-22-2011, 11:42 PM   #1
bplis
Member
 
Registered: May 2011
Distribution: Fedora
Posts: 39

Rep: Reputation: Disabled
Decent graphics card?


I'm looking to get a decent graphics card capable of running mid-range games, such as StarCraft 2, or Minecraft set at the "Far" render distance. I am planning an AMD X6 1090T processor, an Asus AM3+ mobo, a 500+ watt PSU, 4GB of RAM, and a 1TB HDD with WinXP and Fedora 15 dual-booted. I have been considering options such as the Radeon HD 6770, but I have heard that the drivers generally cause many problems and that ATI is to be avoided.
Also, I've considered an Nvidia GeForce GTX 460, but I've heard of recent problems with them as well, with stuttering. What to do? And would these be powerful enough? I am lost when it comes to GPUs.
 
Old 09-23-2011, 01:42 AM   #2
Daedra
Senior Member
 
Registered: Dec 2005
Location: Springfield, MO
Distribution: Slackware64-15.0
Posts: 2,691

Rep: Reputation: 1377
I would stick to Nvidia cards. The drivers are much better on Linux than ATI's. I have a GTX 460 and I have experienced stuttering, but it's caused by the PowerMizer setting being set to "Adaptive". If you set it to "Prefer Maximum Performance" the stuttering goes away. Once you make that simple change the card works flawlessly.
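For reference, the same PowerMizer change can be scripted instead of made through the nvidia-settings GUI. This is a minimal sketch, not from the original post; it assumes the proprietary driver's nvidia-settings tool is installed and exposes the GPUPowerMizerMode attribute (0 = Adaptive, 1 = Prefer Maximum Performance), and the attribute name and values can vary between driver versions.

Code:
# Sketch: switch PowerMizer to "Prefer Maximum Performance" via nvidia-settings.
# Assumes the proprietary NVIDIA driver and its nvidia-settings utility are
# installed, and that the driver exposes GPUPowerMizerMode (0 = Adaptive,
# 1 = Prefer Maximum Performance). Run from inside the X session.
import subprocess

def prefer_max_performance(gpu_index=0):
    """Ask nvidia-settings to keep the given GPU at its highest clocks."""
    attr = "[gpu:%d]/GPUPowerMizerMode=1" % gpu_index
    subprocess.run(["nvidia-settings", "-a", attr], check=True)

if __name__ == "__main__":
    prefer_max_performance()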
 
Old 09-23-2011, 01:57 AM   #3
disturbed1
Senior Member
 
Registered: Mar 2005
Location: USA
Distribution: Slackware
Posts: 1,133
Blog Entries: 6

Rep: Reputation: 224
500 watts won't be enough with that CPU and a decent GPU.
My 1090T + Nvidia GTS 450 draw just shy of 230 watts each. That would leave no headroom for other devices. An Nvidia GTX 460 draws even more power.

Look into a high quality 650+ watt PSU. Make sure the 12V rails supply enough current to power both the CPU and the GPU. Some PSUs may claim 800 watts, but actually become somewhat useless with only 18-20 amps on each 12V rail.

http://www.tomshardware.com/reviews/...x,2613-13.html
http://www.tomshardware.com/reviews/...i,2684-13.html
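To put the per-rail numbers in perspective, here is a quick back-of-the-envelope check. It is not from the original post: the 18 amp figure is the low end of the example above, and the two-rail layout is an assumption for illustration.

Code:
# Why per-rail amperage matters more than the sticker wattage (illustrative).
amps_per_rail = 18          # low end of the 18-20 A example above
num_12v_rails = 2           # assumed layout for this sketch
usable_12v_watts = amps_per_rail * 12 * num_12v_rails
print("Claimed PSU rating:   800 W")
print("Actual 12 V capacity: %d W" % usable_12v_watts)   # 432 W
# The CPU and GPU draw almost entirely from the 12 V rails, so an "800 W"
# unit built like this has far less real headroom than the label suggests.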

Same results here as Daedra. Here's a snapshot to show you how easy the setting is to change.
Attached Thumbnails: snapshot36.png (screenshot of the nvidia-settings PowerMizer setting)
 
Old 09-23-2011, 02:26 AM   #4
Daedra
Senior Member
 
Registered: Dec 2005
Location: Springfield, MO
Distribution: Slackware64-15.0
Posts: 2,691

Rep: Reputation: 1377
I agree with disturbed1. 500 watts is pushing it, but it should work if it's a quality unit. You can use this power supply calculator, it's pretty accurate: http://extreme.outervision.com/psucalculatorlite.jsp

Last edited by Daedra; 09-23-2011 at 12:31 PM.
 
Old 09-23-2011, 04:36 AM   #5
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 935
As far as video cards go, ATI/AMD and nVidia are always watching what the other company does. So for any given price range, you tend to get about the same performance from nVidia and ATI/AMD.

I haven't had any serious issues with ATI/AMD cards with Linux for ages, and IMO they are actually better than nVidia cards for Linux (ATI/AMD support open source, nVidia doesn't). But there is always a chorus of 'don't use ATI/AMD cards' whenever this question pops up. In my experience, even though a lot of the people posting 'get nVidia, don't get ATI/AMD' don't have current experience with ATI/AMD cards, the general consensus of 'get nVidia' is followed. Odd, but I can see why it happens...

As far as nVidia cards with a sane price go, you've only got a few choices- GT430, GTS450, GTX550, GTX460 (various models).

The GTX460 is the fastest of them, the GTX550 is a bit slower and a bit cheaper, the GTS450 is a bit slower and cheaper again... the GT430 is probably too slow for StarCraft 2 to play well (depending on resolution).

If you can get one for a decent price and in a 'good' model, GTX460 would be the pick of them. It will use a fair bit more power and create more heat than the GTS450 though.

Quote:
Originally Posted by disturbed1 View Post
500 watts won't be enough with that CPU and a decent GPU.
My 1090T + Nvidia GTS 450 draw just shy of 230 watts each. That would leave no headroom for other devices. An Nvidia GTX 460 draws even more power.

Look into a high quality 650+ watt PSU. Make sure the 12V rails supply enough current to power both the CPU and the GPU. Some PSUs may claim 800 watts, but actually become somewhat useless with only 18-20 amps on each 12V rail.

http://www.tomshardware.com/reviews/...x,2613-13.html
http://www.tomshardware.com/reviews/...i,2684-13.html
You are not going to need 650 watts+ to run a GTS450 + 1090T. It's been a while since I saw the Tom's Hardware 'power consumption policy' (or whatever they called it) and finding it now isn't my idea of fun, so I can't 'prove' this, but IIRC they use a method that is at best flawed.

Have a look at Xbit Labs for power consumption tests, they make a lot more sense. Here's the basic outline of how they do it (at the beginning of an article that IMO is also worth reading on the subject):

http://www.xbitlabs.com/articles/cas...m-wattage.html

I'd believe Xbit Labs over some insane 230 watt figure for the GTS450:

http://www.xbitlabs.com/articles/gra...0_4.html#sect0

I'd also believe Xbit Labs over 230 watts for the 1090T (230 watts is less insane for a 1090T than for a GTS450, but it's still wrong IMO):

http://www.xbitlabs.com/articles/cpu...t_9.html#sect0

This is also worth a read if you have any interest in power consumption:

http://www.xbitlabs.com/articles/cpu...rclocking.html
 
1 member found this post helpful.
Old 09-23-2011, 06:14 AM   #6
disturbed1
Senior Member
 
Registered: Mar 2005
Location: USA
Distribution: Slackware
Posts: 1,133
Blog Entries: 6

Rep: Reputation: 224
Quote:
Originally Posted by cascade9 View Post
You are not going to need 650 watts+ to run a GTS450 + 1090T. It's been a while since I saw the Tom's Hardware 'power consumption policy' (or whatever they called it) and finding it now isn't my idea of fun, so I can't 'prove' this, but IIRC they use a method that is at best flawed.
Xbit Labs is the only place reporting that low a power draw for the 1090T:
http://www.techpowerup.com/reviews/A..._1090T/11.html 228 watts
http://www.legitreviews.com/article/1289/19/ 191 watts
http://www.anandtech.com/show/3674/a...5t-reviewed/10 201 watts
http://www.hardwarecanucks.com/forum...review-15.html 219 watts
http://techreport.com/articles.x/18799/4 210 watts
http://www.techspot.com/review/269-a...55T/page9.html 202 watts

Quote:
Except for the maximum load simulation with OCCT, we measured power consumption in each mode for 60 seconds. We limit the run time of OCCT: GPU to 10 seconds to avoid overloading the graphics card's power circuitry.
So they only test for 60 seconds? And then only for 10 seconds?
http://www.tomshardware.com/reviews/...ds,2849-5.html
^^ That's how Toms Hardware does their power consumption tests.

Problems with AMD drivers -
I'm not touching this, as there is more than enough information out there. Even the AMD/ATI-biased Phoronix details many of the shortcomings AMD/ATI drivers have. Though, if you do want to play some games and want to use an open source driver, AMD/ATI is the only choice. Nvidia just has better drivers. IMO AMD/ATI makes better silicon.

Lastly, would people please stop saying that AMD/ATI open sourced their driver, commits code to the open source driver, or does anything at all with the open source driver!!!!! The only thing AMD/ATI did was give open source developers some of the white papers on some of their GPUs for some of the features.

BTW, Nvidia wrote and committed the NV open source driver. They also actively patch Xorg. Hell, they just fixed a long-standing Xrandr bug. Nvidia supports open source by and large. Nvidia is also a huge proponent of Android; after all, the cream of the Android tablet crop runs on Nvidia Tegra 2 silicon.
 
Old 09-23-2011, 07:04 AM   #7
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 935
Quote:
Originally Posted by disturbed1 View Post
http://www.tomshardware.com/reviews/...ds,2849-5.html
^^ That's how Toms Hardware does their power consumption tests.
Using a wall power meter does not allow for inefficiencies in the power supply.

BTW, all your links show 'system power consumption', which is not 'CPU power consumption'. (The Xbit Labs link does as well, but it's not from a wall meter.) When you consider the different idle power used by motherboards, RAM, HDDs, optical drives and GPUs, and possibly different wall power meters, the differences are probably explainable.

Quote:
Originally Posted by disturbed1 View Post
Problems with AMD drivers -
I'm not touching this, as there is more than enough information out there. Even the AMD/ATI-biased Phoronix details many of the shortcomings AMD/ATI drivers have. Though, if you do want to play some games and want to use an open source driver, AMD/ATI is the only choice. Nvidia just has better drivers. IMO AMD/ATI makes better silicon.
'I'm not touching this', followed by some touching?

I'm not saying the ATI/AMD drivers are perfect, or the nVidia drivers are horrible.

You can play games with the nouveau driver if you've added the 3D support (though I'd guess there's got to be more than a few games that will play just fine without the 3D support).

What I AM saying is that most of the people who do make comments on the ATI/AMD drivers, open and closed versions, sometimes have very little to base that idea on. A lot of the time it's just repeating what they have heard or seen without evidence, or based on experiences from 2009...

Quote:
Originally Posted by disturbed1 View Post
Lastly, would people please stop saying that AMD/ATI open sourced their driver, commits code to the open source driver, or does anything at all with the open source driver!!!!! The only thing AMD/ATI did was give open source developers some of the white papers on some of their GPUs for some of the features.
But ATI/AMD has committed code to the open source driver, and has at least one person working on the open source drivers (and they were looking for more):

http://www.phoronix.com/scan.php?pag...item&px=OTQzNQ

AMD/ATI didn't just release some of the white papers... which is more than nVidia does/did anyway.

So, would people please check what they are saying, especially if they are going to use 5 exclamation marks?

Quote:
Originally Posted by disturbed1 View Post
BTW, Nvidia wrote and committed the NV open source driver. They also actively patch Xorg. Hell, they just fixed a long-standing Xrandr bug. Nvidia supports open source by and large. Nvidia is also a huge proponent of Android; after all, the cream of the Android tablet crop runs on Nvidia Tegra 2 silicon.
nVidia cut the NV driver-

Quote:
NVIDIA's open-source Linux efforts as it concerns their GPU support have historically been minimal. The xf86-video-nv driver has been around that provides very basic 2D acceleration and a crippled set of features besides that (no proper RandR 1.2/1.3, KMS, power management, etc) while the code has also been obfuscated to try to protect their intellectual property. However, NVIDIA has decided to deprecate this open-source driver of theirs. No, NVIDIA is not working on a new driver. No, NVIDIA is not going to support the Nouveau project. Instead, NVIDIA now just recommends its users use the X.Org VESA driver to get to NVIDIA.com when installing Linux so they can install their proprietary driver.
http://www.phoronix.com/scan.php?pag...kills_nv&num=1

They provide no support for nouveau; all the work being done there is without nVidia helping at all (so it's mostly from reverse engineering).

They can patch Xorg as much as they want, but if the community is left to develop the drivers without any support, that's not really 'supporting open source' IMO. You might have a different opinion.

Last edited by cascade9; 09-23-2011 at 07:06 AM.
 
Old 09-23-2011, 07:08 AM   #8
adamk75
Senior Member
 
Registered: May 2006
Posts: 3,091

Rep: Reputation: 399
Quote:
Originally Posted by disturbed1 View Post
Lastly, would people please stop saying that AMD/ATI open sourced their driver, commits code to the open source driver, or does anything at all with the open source driver!!!!! The only thing AMD/ATI did was give open source developers some of the white papers on some of their GPUs for some of the features.
They do pay developers outside of AMD to work on the open source drivers (starting with developers at Novell to work on xf86-video-radeonhd). They have also had their own developers work on the open source driver. Michel Dänzer, Christian König and Alex Deucher are all employed by AMD to work on the open source driver stack.

Quote:
BTW, Nvidia wrote and committed the NV open source driver.
The obfuscated NV driver. Which they've also stopped developing, and which never supported 3D acceleration.

As for problems with the nvidia drivers, let's not forget that they do have their fair share of issues:

http://nvnews.net/vbulletin/forumdisplay.php?f=14

This forum is filled with people with problems: tearing, computer lockups, X crashes, poor OpenGL performance, more computer lockups, resume/suspend issues, HDMI audio problems... Well, the list goes on.

Adam
 
1 member found this post helpful.
Old 09-23-2011, 10:07 AM   #9
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
Quote:
Originally Posted by disturbed1 View Post
My 1090T + Nvidia GTS 450 draw just shy of 230 watts each.
That is more or less the power they draw together, not each. AMD rates the 1090T at 125W maximum, NVidia rates the GTS450 at 109W maximum (they recommend at least a 400W PSU).
So if you have a good 500W PSU you should be fine. I have been running my 1055T (125W max) and a GTX260 (182W max according to NVidia; I would think more, because it is an overclocked version) off a cheap (and old) 650W PSU for quite some time now.
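Roughly, the arithmetic behind that conclusion looks like this. The CPU and GPU maximums are the figures quoted above; the allowance for the rest of the system is an assumption for illustration.

Code:
# Rough power budget for the proposed build (illustrative only).
cpu_max = 125          # AMD's rated maximum for the 1090T (per the post above)
gpu_max = 109          # NVidia's rated maximum for the GTS450 (per the post above)
rest_of_system = 100   # assumed allowance for board, RAM, drives, fans
total = cpu_max + gpu_max + rest_of_system
print("Estimated peak draw: %d W" % total)   # ~334 W
# Comfortably under a quality 500 W unit, which is why 500 W should be fine here.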
 
1 member found this post helpful.
Old 09-23-2011, 03:31 PM   #10
bplis
Member
 
Registered: May 2011
Distribution: Fedora
Posts: 39

Original Poster
Rep: Reputation: Disabled
Whoa... This is a lot to take in. So from a general standpoint, the nVidia GTX 460 is the way to go? Where do I get nVidia drivers from?
 
Old 09-24-2011, 05:07 AM   #11
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 935
Quote:
Originally Posted by bplis View Post
Whoa... This is a lot to take in. So from a general standpoint, the nVidia GTX 460 is the way to go?
The GTX460 is a good card, but whether it's 'the way to go' would depend on pricing and what exactly you expect from a video card.

The GTS450 should be capable of playing the games you mentioned, though it's possible that at high resolutions StarCraft 2 might have a framerate slow enough to be 'bad' (less than 30 FPS).

Quote:
Originally Posted by bplis View Post
Where do I get nVidia drivers from?
From your distro's repos, or directly from nVidia.

I'd suggest using the drivers in the repo, unless they cause problems.
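For Fedora specifically (the OP's distro), the proprietary driver is not in the stock repos; the usual route is the RPM Fusion nonfree repository. The snippet below is a minimal sketch, not from the original post: it assumes RPM Fusion nonfree is already enabled and that the packaged driver is called akmod-nvidia (package names can differ between Fedora releases).

Code:
# Sketch: install the packaged proprietary NVIDIA driver on Fedora.
# Assumes the RPM Fusion nonfree repository is enabled and the driver
# package is named akmod-nvidia (it should pull in the X driver packages
# via dependencies). Run as root; reboot afterwards so the kernel module
# gets built and loaded.
import subprocess

subprocess.run(["yum", "-y", "install", "akmod-nvidia"], check=True)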
 
Old 09-27-2011, 05:49 PM   #12
bplis
Member
 
Registered: May 2011
Distribution: Fedora
Posts: 39

Original Poster
Rep: Reputation: Disabled
Hey guys, I'm looking at prices here and realized that the Radeon HD 6770 is about $50 cheaper. Will that card work well in the end? I'm on a tight budget and money is a big issue, but I wouldn't want to spend money on something that doesn't work. Do they have similar speeds? If so, I'll go with the HD 6770, as long as it will work.
 
Old 09-28-2011, 05:47 AM   #13
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 935
HD6770 is slightly faster than a GTS450, but slower than a GTX460.
 
Old 09-28-2011, 03:56 PM   #14
snapshot15
LQ Newbie
 
Registered: Aug 2011
Distribution: Mint, Debian, CentOS
Posts: 7

Rep: Reputation: Disabled
It should also be noted that you do not need a hex-core processor for gaming; it will not increase performance at all unless the game was designed to utilize all 6 cores (SC2 and Minecraft DO NOT).

Go with the X4 955 BE. You'll save a little money, and it will be well suited for gaming as well as general computing.

The AMD Bulldozer processors should be announced in October, so wait a couple of weeks for the current processors' PRICE DROPS!
 
Old 09-29-2011, 06:05 PM   #15
bplis
Member
 
Registered: May 2011
Distribution: Fedora
Posts: 39

Original Poster
Rep: Reputation: Disabled
I don't just game; I do a lot of compiling from source, I'm starting to learn programming, I download many files, and I like to do all of this simultaneously. I will use all of the processor cores, believe me. How much is Bulldozer expected to cost? I may just wait and go with that...
 
  

