Workstation versus standard GPU (AMD)
I hope this is on-topic as most of the posts here seem to be about hardware issues rather than choosing hardware.
I'm looking at updating my dual monitor setup and moving to a single 4k, with a thought of possibly adding another 4k later if I really miss the dual monitor setup. My current GPU, which is an ancient new old stock workstation card, does not support 4k, so I'm looking at buying a new card.

I barely game. I've got Kerbal Space Program (on another PC) but have not run it in a while, for example, and first person shooters give me motion sickness. I do sometimes do a little photo editing in Gimp, but I don't do video editing, and occasionally I do some work in Kicad. So I don't exactly use much GPU horsepower. On the other hand, maybe that's only because I have insufficient GPU available to do much else anyway?

Looking at graphics cards, I can get an AMD Radeon Pro WX 2GB and a Radeon RX570 4GB for around the same price, or, jumping up a price point, 4 and 8GB equivalents of each. A lot of sites online seem to cover the marketing blurb reasons for picking one or the other, and maybe mention certification for applications I don't have and probably never will for the workstation GPU. Obviously I'm running Linux and would stick with the open source drivers for an easier life when updating kernels (I frequently build newer kernels than my distro). It seems, feature wise, that I get more sane socketry in general on workstation cards, e.g. all DisplayPort rather than the mixture of DisplayPort, HDMI and DVI-D you get on consumer cards, and maybe ECC memory and double precision maths, but at the cost of half the memory.

If a newer GPU makes VirtualBox / QEMU desktops more responsive that is a win, but I'm not sure that it will? I tried and failed with GPU passthrough.

So I wonder which is better, or, given my usage, does it not really matter? The other plan is just to buy the cheapest card on eBay that supports the resolution and has sufficient connectivity.
I Googled both of the cards you are considering.
The Radeon RX570 is much better than the Radeon Pro WX 2100. The former has a 256-bit memory interface and consumes 150W, which are typical of enthusiast graphics cards; I aim for those kinds of specs. The latter is very entry-level: a 64-bit memory interface and 35W power consumption. BTW, my six-year-old Radeon R7 265 drives my 4K monitor very well for desktop use. Desktop apps tend to do rendering on the CPU.
Ed
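To put the bus-width difference into numbers, here is a minimal sketch of theoretical memory bandwidth from bus width and effective memory data rate. The data rates below are assumed, roughly typical GDDR5 figures for illustration, not verified specs for these exact cards.

Code:
def mem_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    # bytes per transfer across the bus, times transfers per second (GT/s) -> GB/s
    return bus_width_bits / 8 * data_rate_gtps

cards = {
    "Radeon RX 570 (256-bit, ~7 GT/s assumed)": (256, 7.0),
    "Radeon Pro WX 2100 (64-bit, ~6 GT/s assumed)": (64, 6.0),
}

for name, (width, rate) in cards.items():
    print(f"{name}: ~{mem_bandwidth_gbs(width, rate):.0f} GB/s")

Even if the assumed clocks are a little off, the 4x difference in bus width dominates (roughly 224 GB/s vs 48 GB/s), which is why the two cards are in completely different classes for anything bandwidth-bound.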
Given your usage, I would say 'it does not matter' - with one caveat (I will get to it later). Basically, get the Radeon or whatever consumer card satisfies your needs (nVidia GeForce/Intel GMA also included here, vs Quadro/Tesla/Radeon Pro/FireGL/etc) - you aren't mentioning anything that's really tied to the 'pro' card, and I really don't see any reason to spend the extra money. If you're looking at eBay and finding some pro cards available super-duper cheap, that's another story, but bear in mind they may also be super-duper old. Wikipedia maintains lists of all graphics cards made by AMD and nVidia, and you can use this to line up a 'pro' card with its 'consumer' equivalent, which is very helpful given AMD's completely insane naming schema for their 'pro' cards.
As far as 4K output/video outputs in general, it depends on what the monitor/display supports - the reason for newer consumer cards to have HDMI is HDMI 2.0, which supports 4K60 among other resolutions (essentially 'just as good' as DisplayPort apart from MST, which almost nothing uses). In my experience, I'd rather have an HDMI/DVI port available, as DP cabling tends to be all over the place (amazing what an 'open' and 'royalty free' standard enables...) while HDMI at least has the 'certified' badging on nicer cables. What I mean specifically is: with DP cabling I've played roulette a few times just to find a cable that will do 4K60 or 1080p144 or similar, even from the same mfgr, because there's essentially no enforcement, so you get a lot of junk. With HDMI there is a lot of junk too, but there are also 'HDMI certified' cables that will do what they say (yes, they do cost more), and sometimes that's an easier way to go.

Finally, consider the kind of display you're using: a lot of modern 4K TVs do a great job as monitors (43" is a good size, for example), but will generally only accept HDMI 2.0, not DP 1.4. A lot of 4K 'monitors' offer a range of connections, not just DP. DP->something converters can also be a grab bag, so having a proper DVI or HDMI output can be useful there as well (DVI and HDMI can convert between themselves passively, but you cannot get HDMI 2.0 speeds or features over a DVI connector). No modern card does analog outputs, so if you need VGA for whatever reason, you'll need to get a somewhat older card (there are still new enough options that are 'modern' in terms of drivers and performance, but they'll be a few years old).

Now regarding the above note:

- ECC memory on the graphics card: I'd basically say 'so what?' and move on unless you have some use-case that you can argue with me about (in other words, 'if you have to ask, you probably don't need it...' applies here).

- FP64 is supported everywhere, but what you're probably thinking about is 'does it run with decent performance', and the answer is basically 'no, not on anything new', because both nVidia and AMD have figured out they can use that as a segmentation feature and chain it to their 'compute accelerator' products (which have great big pricetags), and the scientific/industrial users will fork over the cash. So if you need FP64 in OpenCL or CUDA or whatever, you can run it on your GeForce/Quadro/Radeon/Radeon Pro/FireGL/etc, but if you want 1/2 or 1/3 rate FP64 performance you will need either A) an older card or B) a compute accelerator. For example: the original nVidia GeForce Titan cards did 1/2 rate FP64, as did some of the early GCN Radeon Pro and Radeon R9 stuff (some of those were 1/3 or 1/4, but still a lot better than the modern 1/16 or 1/32, etc).

- Modern 'pro cards' are kind of an interesting bird - they don't really have the compute features anymore, because that's been segmented out to compute accelerators, and they're generally pretty identical to the 'consumer' cards apart from their drivers. But in Linux, you aren't usually getting the full-blown driver application with all the ISV certifications and whatnot that you get in Windows. The one big exception here, which I only know to be 'for certain' with nVidia, is that Quadro cards will enable Mosaic (both in Linux and Windows), which allows for GPU passthru and other things, while the GeForce cards will actively try to block this functionality.
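To make the HDMI 2.0 point above a bit more concrete, here is a back-of-the-envelope sketch of why 4K60 needs an HDMI 2.0 class link. The payload rates and the blanking figures are approximations for illustration only (link payload after line coding, CEA-style 4K60 timing), so treat the output as rough numbers rather than spec-exact ones.

Code:
# Can a link carry 3840x2160 @ 60 Hz at 8 bits per channel (24 bpp)?
H_TOTAL, V_TOTAL = 4400, 2250       # assumed total timing including blanking (CEA-style 4K60)
REFRESH_HZ, BITS_PER_PIXEL = 60, 24

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ        # ~594 MHz
required_gbps = pixel_clock_hz * BITS_PER_PIXEL / 1e9  # ~14.3 Gbit/s

links_gbps = {
    "HDMI 1.4 (approx. payload)": 8.16,
    "HDMI 2.0 (approx. payload)": 14.4,
    "DisplayPort 1.2 HBR2 (approx. payload)": 17.28,
}

print(f"4K60 @ 24 bpp needs roughly {required_gbps:.1f} Gbit/s")
for name, capacity in links_gbps.items():
    verdict = "OK" if capacity >= required_gbps else "not enough"
    print(f"  {name}: {capacity} Gbit/s -> {verdict}")

Which is the short version of why 4K60 over HDMI needs HDMI 2.0, while HDMI 1.4 era outputs top out at 4K30 at full colour depth.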
If memory serves, AMD has not quite 'caught up' here, so Radeon should work more properly, but I can't say that for certain (or across all generations).

Some generalized hardware recommendations, assuming you have both an x16 slot and sufficient power:

- For 4K output/support you will need an nVidia Maxwell 2 (GeForce 900 series) or Radeon GCN 3 (Radeon Fury series) or newer. Some older cards (especially on the AMD side) can support 4K via HDMI 1.4 or DP 1.x, but will not support HEVC/H.265, and their internal video scalers cannot work with 4K (e.g. the Radeon R9 290X can do 4K via HDMI 1.4 and DP 1.x, but cannot scale video/rendering to full 4K - it's limited to something like 3200x1800 internally - and it also has no HEVC support).

- There are a lot of 'cheap' or 'entry level' boards that will fit into the above and that will work fine in Linux (and/or Windows, if you ever need it). I know Radeon has open source drivers, but I also know there tends to be a nasty lag between new hardware and full driver support (it can be many months) - you can go dig through the 5700 XT launch for an example. This isn't to say Radeon is bad, just bear in mind that open source means open source, for better or worse. By contrast, the nVidia drivers tend to 'just work' if you can get them installed.

Having gone from a 2-3 monitor configuration (none higher than 1080p) to a single 4K 43" and lived with both for at least a year, I'm back to a multi-head configuration with smaller monitors. Playing 'window tetris' just isn't for me. I personally cannot stand 'small' 4K monitors (e.g. 28" 4K) and I just don't understand their appeal, so I can't give you any guidance there.

As far as a specific card to look for, since you don't really indicate you're doing much gaming or other 3D work, I'd probably go along with 'cheapest card on eBay that supports 4K and HEVC' as a goal - look for something from the RX 4xx or RX 5xx (Polaris) lines, or maybe even a Fury or Vega. If nVidia can be in consideration, there are usually a lot of cheap GeForce 900s available - something like a GTX 950 should be enough.

To EdGr's point: I agree. If you want a Polaris 'pro' card you'd need to move up the line, like the WX 7100. That said, if you aren't really doing gaming or heavy 3D work, why spend the money or deal with the size/cooling/power requirements of the beefier GPU? Generally the 'latest and greatest' in terms of video support, outputs, etc. is available on more 'entry level' parts these days (e.g. a GeForce GT 1030 is more or less a match for a GTX 1080 in that respect, until you fire up a game or CUDA app).
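One practical follow-up on the HEVC point: once a card is in the machine, you can sanity-check whether hardware HEVC decode is actually exposed by asking VA-API what profiles the driver advertises. This is just a sketch; it assumes the vainfo tool (from libva-utils) and the Mesa VA-API driver for the card are installed, and the exact output format can vary between driver versions.

Code:
# Quick check for hardware HEVC/H.265 decode exposed through VA-API.
import shutil
import subprocess

def hevc_decode_supported():
    if shutil.which("vainfo") is None:
        raise RuntimeError("vainfo not found - install libva-utils first")
    result = subprocess.run(["vainfo"], capture_output=True, text=True)
    # vainfo lists decode profiles as lines like "VAProfileHEVCMain : VAEntrypointVLD"
    return any("HEVC" in line and "VLD" in line
               for line in result.stdout.splitlines())

if __name__ == "__main__":
    print("HEVC decode via VA-API:", hevc_decode_supported())

If the profile is missing on a card that should support it, that usually points at a driver or firmware issue rather than the hardware itself.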
I did think at one point that a single 4k monitor at 32" was a good choice: it would give me a lot more screen area from a pixel perspective and about 50% greater resolution than I have now. But it looks like 27-28" is the sweet spot regarding value for 4k. I was also considering whether a 1440p ultrawide would be a good move; I could basically match the horizontal resolution that I have with two monitors, with 40% more height. That would be a step up. Back to 4K: at 27-28" I have enough physical space if I decide I do need to go back to a dual display setup. I'm not convinced that a 43" monitor/TV is a good choice for my setup.
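For what it's worth, the size trade-off is easy to put into numbers. Here is a minimal sketch of pixel density (PPI) and dot pitch from diagonal size and resolution; the candidate sizes are the ones discussed in the thread, except the 34" ultrawide, which is an assumed example since no size was given.

Code:
import math

def ppi(diag_inches, width_px, height_px):
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(width_px, height_px) / diag_inches

candidates = [
    ('27" 4K', 27, 3840, 2160),
    ('28" 4K', 28, 3840, 2160),
    ('32" 4K', 32, 3840, 2160),
    ('34" 3440x1440 ultrawide (assumed size)', 34, 3440, 1440),
]

for name, diag, w, h in candidates:
    density = ppi(diag, w, h)
    print(f"{name}: {density:.0f} PPI, dot pitch {25.4 / density:.3f} mm")

Roughly: 27" 4K works out to ~163 PPI, 28" to ~157 PPI, 32" to ~138 PPI, and a 34" 3440x1440 ultrawide to ~110 PPI, so the 32" 4K is the option closest to a conventional desktop density if you want to avoid HiDPI scaling.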
I can't remember exactly what the card in this machine is (I have two, note), but lspci shows: VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Turks PRO [Radeon HD 6570/7570/8550]. It seems that supports DVI: 2560 x 1600 / DisplayPort: 2560 x 1600 / VGA: 2048 x 1536 (https://www.cnet.com/products/amd-ra...0-1-gb-series/). Mine is a 2x DP, 1x DVI model (with analog VGA on the DVI).
Thanks for that, really useful. I'd not even thought of HEVC as being relevant, so that's good to know. I was aiming for AMD as the open source drivers make sense as I understand it; proprietary drivers just add complexity when updating the kernel, and I usually build kernels myself rather than use the distro one.
A 2700X processor and a new old stock 10-year-old workstation GPU. But then it depends what I am using it for. It's possible I'm missing something on desktop acceleration that I simply haven't noticed, owing to generally using older hardware on the graphics side. With a WX 7100 being £625 here (I only looked on one website), it would have to make my life very much better in some way, and I can't think what that way would be. Certainly I can't think how I could use it for the workloads I would like to speed up.
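On the driver point above: if you're building your own kernels, it's sometimes handy to confirm which kernel driver has actually bound to the card after an update (radeon vs amdgpu, for instance). A minimal sketch, assuming the usual /sys/class/drm layout; lspci -k gives the same information.

Code:
# Report which kernel driver is bound to each DRM GPU node (card0, card1, ...).
import glob
import os

for card in sorted(glob.glob("/sys/class/drm/card[0-9]")):
    driver_link = os.path.join(card, "device", "driver")
    if os.path.islink(driver_link):
        driver = os.path.basename(os.path.realpath(driver_link))
        print(f"{os.path.basename(card)}: {driver}")  # e.g. card0: amdgpu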
RX460 RIP
I can count my existing RX460 (on my old PC) out of the equation. It appears completely dead, and I am not going to risk installing it in my good machine.
The newest of these three is 7 years old:
Code:
# inxi -SGIxx

Note all four are running the upstream default DDX, modesetting. My GeForce 210, which I paid $30 for brand new, with VGA, HDMI & DVI outputs, only supports 1440 via dual-link DVI (or VGA?), max 1920x1200 on HDMI. It's why I started shopping GPUs on eBay instead of new.

I bought a 43" 4K LG TV once. It was capable of displaying from 4 DP/HDMI inputs simultaneously. I returned it to the vendor after about 3 weeks, for several different reasons, including that HiDPI is a PITA that is readily avoided with 2K displays, but not so easily with smaller 4K screen sizes. Hope this helps you save money. :)
Also note that GPUs being this power-friendly is a relatively recent development - even going back a few generations it isn't uncommon to find 100W+ idle states, so bear this in mind if you go looking around on eBay for a bargain.
This isn't to say nVidia is 'much better', but there's at least a bit more logic within a single generation (e.g. GTX 1080 is safely assumed 'better' than GTX 1070, and GTX 980 is safely assumed to be a different generation). Wikipedia can be your friend here: https://en.wikipedia.org/wiki/List_o...ocessing_units (copious use of Ctrl-F will help).
One thing to consider: I'm assuming you're going with 100% digital connections here - DVI, HDMI, or DP in any combination. All of these connections on modern displays and graphics cards will support HDCP. VGA and other analog connections will not support HDCP, which can cause problems with some DRM-protected content (e.g. online streaming video), regardless of the GPU supporting the feature. If HDCP does not matter for you, this distinction does not matter.
I would pass on the ultrawide or super-ultrawide (21:9 or >21:9) monitors, and here's why: basically nothing is produced for that form factor. So any video will be pillarboxed or windowboxed, and many games will even have issues at that resolution. If you're just running X applications (like browsers, word processors, terminal windows, etc), the price premium doesn't seem worthwhile vs getting 2+ standard monitors and having them side by side, because you're going to be playing window tetris to size things out. Basically, I would avoid things that aren't 'standard' - 1080p, 4K, and to a lesser extent 1440p are all pretty common, and therefore will be nicely compatible with lots of stuff, even if you're scaling (e.g. 1080p video scales nicely to 4K).
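As a rough illustration of the 'nothing is produced for that form factor' point, here is the arithmetic on how much of an ultrawide panel 16:9 content actually uses, and why 1080p maps so cleanly onto 4K. The 3440x1440 panel size is just an assumed example.

Code:
from fractions import Fraction

UW_W, UW_H = 3440, 1440              # assumed ultrawide panel resolution
content_w = UW_H * 16 // 9           # 16:9 content scaled to full panel height -> 2560 px wide
pillars = UW_W - content_w           # leftover width split into two black bars
print(f"16:9 video on {UW_W}x{UW_H}: {content_w} px wide, {pillars} px of pillarboxing")

scale = Fraction(3840, 1920)         # 1080p -> 4K scale factor
print(f"1080p -> 4K scale factor: {scale} (an exact integer, so scaling stays clean)")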
FWIW the version # looks pretty up to date - my Xubuntu install shows 440.64.
I fully agree on 2560x1600 monitors being insanely expensive - I have no idea why the prices have not fallen there. They were $1200-1600 (USD) in 2008, and that's roughly what they cost today. I don't understand it, especially when you can get a 1440p monitor for around a third of that. As far as <4K goes, most graphics cards back to at least the mid-2000s will support 1600p on down, so that may be easier to achieve, but I wouldn't explicitly rule out 4K support (or HEVC support) unless your budget is under $20.
Just from some quick looking at Wikipedia, something like the Radeon RX 560 would probably be a good candidate for you to consider - it should offer decent performance, and is fairly modern, but not so cutting edge that you have to worry about immature drivers (at least in theory). I think it does 3 outputs total, has HEVC and 4K support, etc. If you need more performance, the RX 590 (as above) would probably be an easy upgrade, and is/was a fairly popular model, so you should be able to find a range of them new or used, with various aftermarket cooling, and documentation from people who have used them in a variety of settings. You can see more here:
https://www.techpowerup.com/gpu-spec...n-rx-560.c2940
https://www.amd.com/en/products/graphics/radeon-rx-560

The RX 590 will basically be 'more of everything' at higher cost and higher power draw. If you want a 'pro' card, the WX 4100 (note the X) and WX 3200 look similar to the RX 560 in terms of generation, but both appear to cost quite a bit more (I quickly checked eBay), and in absolute terms are probably slower at some 3D tasks (like games).

Finally, an off-the-wall idea to consider: if your motherboard has video outputs, there are AM4 Ryzen APUs that feature Vega graphics. You could in theory replace the 2700X with one of those and have an all-in-one solution. I don't know if this would be a good or bad idea because I haven't followed the newer Ryzen chips very closely, so you may lose cores or some other feature to gain the GPU, but I know my older APUs are pretty competent as their own graphics cards (I have an A10-5800K in a desktop, and an A10-9620 in a laptop). One model I know of is the 'Ryzen 3400G'. Hope some of this at least helps. :)
Firstly I'd like to thank you, EdGr and MrMazda. This has all been very useful information.
In the end I went with a 4K monitor, an iiyama XUB2792UHSU-B1, and an RX590 GPU. I expect whichever way I went it would have been a decent solution. The iiyama seems very good value. From reviews it appeared that if I wanted better colour fidelity for my photography, which I don't do much of nowadays, I'd have to spend considerably more, and if I were a gamer I'd have gone for a higher-end card and a faster 1440p monitor, so this seems sensible. I'll find out when it arrives.

Regarding the GPU, I suspect my RX460 would have been fine were it not dead. I went down the new route as it seems that second-hand Polaris GPUs on eBay are barely cheaper than new ones, and an RX590 is no dearer than an RX580 and only marginally more expensive than an 8GB RX570. But then I've fallen for the marketers who push you to just spend that little bit more.
I realize that the trick on eBay is patience. However, I want a working card and monitor at the same time.
Also worth noting that the new 4K monitor is very likely cheaper than what I paid for the CRT; that's progress.
* well, it probably does a bit more, but I've not needed to check.
I did briefly wonder about monitors that can do picture-in-picture, or show multiple inputs side by side. But I expect you have to pay a lot for that, and unless you have a specific use, changing that configuration around is likely a pain.
On the APU front, my motherboard has no video outs. I find it quite amazing that, depending on workload, my 2-year-old 2700X is being pushed quite a way down the performance rankings by processors just a generation newer. So I no longer have the latest and greatest, but this is all good for the consumer. It looks like the Ryzen 4700G, an eight-core/sixteen-thread APU, is coming, but I'd have to start again with a new motherboard to use that. I suspect it is fantastic value if you need a powerful machine with modest graphics.
At normal viewing distance, I can see an individual pixel on my 28" 4K (157 DPI, 0.161mm dot pitch). 4K is still not as good as human vision. The display manufacturers know that. It is not until 8K that displays will be better than human vision.
Ed