Old 06-20-2020, 09:09 AM   #1
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 134

Rep: Reputation: Disabled
Workstation versus standard GPU (AMD)


I hope this is on-topic as most of the posts here seem to be about hardware issues rather than choosing hardware.

I'm looking at updating my dual-monitor setup and moving to a single 4K, with a thought of possibly adding another 4K later if I really miss the dual-monitor setup.

My current GPU, an ancient new-old-stock workstation card, does not support 4K, so I'm looking at buying a new card.

I barely game: I've got Kerbal Space Program (on another PC), for example, but haven't run it in a while, and first-person shooters give me motion sickness. I sometimes do a little photo editing in GIMP and occasionally some work in KiCad, but no video editing. So I don't exactly use much GPU horsepower. On the other hand, maybe that's only because my current GPU isn't capable of much more anyway?

Looking at graphics cards, I can get an AMD Radeon Pro WX 2100 2GB and a Radeon RX 570 4GB for around the same price, or, jumping up a price point, the 4GB and 8GB equivalents of each.

A lot of sites online seem to cover the marketing-blurb reasons for picking one or the other, and for the workstation GPU maybe mention certification for applications I don't have and probably never will.

Obviously I'm running Linux and would stick with the open-source drivers for an easier life when updating kernels (I frequently build newer kernels than my distro ships). Feature-wise, it seems that workstation cards give me saner socketry in general, e.g. all DisplayPort rather than a mixture of DisplayPort, HDMI and DVI-D, plus maybe ECC memory and double-precision maths, but at the cost of half the memory.

If a newer GPU makes VirtualBox/QEMU desktops more responsive, that is a win, but I'm not sure that it will. I tried and failed with GPU passthrough.

So I wonder which is better, or, given my usage, does it not really matter? The other plan is just to buy the cheapest card on eBay that supports the resolution and has sufficient connectivity.
 
Old 06-20-2020, 11:00 AM   #2
EdGr
Member
 
Registered: Dec 2010
Location: California, USA
Distribution: I run my own OS
Posts: 982

Rep: Reputation: 465
I Googled both of the cards you are considering.

The Radeon RX570 is much better than the Radeon Pro WX 2100. The former has a 256-bit memory interface and consumes 150W, which are typical of enthusiast graphics cards; I aim for those kinds of specs. The latter is very entry-level: 64-bit memory interface and 35W power consumption.

BTW, my six-year-old Radeon R7 265 drives my 4K monitor very well for desktop use. Desktop apps tend to do rendering on the CPU.
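(If you want to check what is actually doing the rendering on your own box, here is a quick sketch - it assumes pciutils and mesa's glxinfo are installed and you are in an X session:)
Code:
# kernel driver bound to the GPU, and the OpenGL renderer in use
lspci -k | grep -A 3 -i vga
glxinfo | grep -E "OpenGL (renderer|version)"
# an "llvmpipe" renderer means OpenGL is falling back to the CPU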
Ed
 
1 member found this post helpful.
Old 06-20-2020, 09:46 PM   #3
obobskivich
Member
 
Registered: Jun 2020
Posts: 596

Rep: Reputation: Disabled
Given your usage, I would say 'it does not matter' - with one caveat (I'll get to it later). Basically, get the Radeon or whatever consumer card satisfies your needs (nVidia GeForce/Intel GMA also included here vs Quadro/Tesla/Radeon Pro/FireGL/etc) - you aren't mentioning anything that's really tied to the 'pro' card, and I really don't see any reason to spend the extra money. If you're looking at eBay and finding some pro cards available super-duper cheap, that's another story, but bear in mind they may also be super-duper old. Wikipedia maintains lists of all graphics cards made by AMD and nVidia, and you can use those to line up a 'pro' card with its 'consumer' equivalent (which is very helpful given AMD's completely insane naming scheme for their 'pro' cards).

As far as 4K output/video outputs in general, it depends on what the monitor/display supports - the reason for newer consumer cards to have HDMI is HDMI 2.0, which supports 4K60 among other resolutions (essentially 'just as good' as DisplayPort apart from MST, which almost nothing uses). In my experience, I'd rather have an HDMI/DVI port available, as DP cabling tends to be all over the place (amazing what an 'open' and 'royalty free' standard enables...), while HDMI at least has 'certified' badging on nicer cables. What I mean specifically: with DP cabling I've played roulette a few times just to find a cable that will do 4K60 or 1080p144 or similar, even from the same manufacturer, because there's essentially no enforcement, so you get a lot of junk. With HDMI there is also a lot of junk, but there are 'HDMI certified' cables that will do what they say (yes, they cost more), and sometimes that's the easier way to go. Finally, consider the kind of display you're using: a lot of modern 4K TVs do a great job as monitors (43" is a good size, for example), but will generally only accept HDMI 2.0, not DP 1.4. A lot of 4K 'monitors' offer a range of connections, not just DP. DP->something converters can also be a grab bag, so having a proper DVI or HDMI port (the two can convert between themselves passively, but you cannot get HDMI 2.0 bandwidth or features over a DVI connector) can be useful there as well. No modern card does analog outputs, so if you need VGA for whatever reason, you'll need a somewhat older card (there are still options that are 'modern' in terms of drivers and performance, but they'll be a few years old).

Now regarding the above note:
- ECC memory on the graphics card, I'd basically say 'so what?' and move on unless you have some use-case that you can argue with me about (in other words, 'if you have to ask, you probably don't need it...' applies here).
- FP64 is supported everywhere, but what you're probably thinking about is 'does it run with decent performance' and the answer is basically 'no, not on anything new' because both nVidia and AMD have figured out they can use that as a segmentation feature, and chain it to their 'compute accelerator' products (which have great big pricetags) and the scientific/industrial users will fork over the cash. So if you need FP64 in OpenCL or CUDA or whatever, you can run it on your GeForce/Quadro/Radeon/Radeon Pro/FireGL/etc but if you want 1/2 or 1/3 FP64 performance you will need either A) an older card or B) a compute accelerator. For example: the original nVidia GeForce Titan cards did 1/2 FP64, as did some of the early GCN Radeon Pro and Radeon R9 stuff (some of those were 1/3 or 1/4 but still a lot better than modern 1/16 or 1/32 etc).
- Modern 'pro cards' are kind of an interesting bird - they don't really have the compute features anymore, because that's been segmented out to compute accelerators, and they're generally pretty identical to the 'consumer' cards apart from their drivers. But in Linux, you aren't usually getting the fullblown driver application with all the ISV-certifications and whatnot that you get in Windows. The one big exception here, which I only know to be 'for certain' with nVidia, is that Quadro cards will enable Mosaic (both in Linux and Windows), which allows for GPU passthru and other things, while the GeForce card will actively try to block this functionality. If memory serves, AMD has not quite 'caught up' here, so Radeon should work more properly, but I can't say that for certain (or across all generations).

Some generalized hardware recommendations, assuming you have both an x16 slot, and sufficient power:
- For 4K output/support you will need either an nVidia Maxwell 2 (GeForce 900 series) or Radeon GCN 3 (Radeon Fury series) or newer. Some older cards (especially on the AMD side) can support 4K via HDMI 1.4 or DP 1.x, but will not support HEVC/h265, and their internal video scalers cannot work with 4K (e.g. the Radeon R9 290X can do 4K via HDMI 1.4 and DP 1.x, but cannot scale video/rendering to full 4K - instead it's limited to something like 3200x1800 internally - and it has no HEVC support).
- There are a lot of 'cheap' or 'entry level' boards that will fit into the above and that will work fine in Linux (and/or Windows, if you ever need). I know Radeon has open source drivers, but I also know there tends to be a nasty lag between new hardware and full driver support (can be many months) - you can go dig through the 5700XT launch for an example. This isn't to say Radeon is bad, just bear in mind that open source means open source, for better or worse. By contrast, the nVidia drivers tend to 'just work' if you can get them installed.

Having gone from a 2-3 monitor configuration (none higher than 1080p) to a single 4K 43" and living with both for at least a year, I'm back to a multi-head configuration with smaller monitors. Playing 'window tetris' just isn't for me. I personally cannot stand 'small' 4K monitors (e.g. 28" 4K) and I just don't understand their appeal, so I can't give you any guidance there.

As far as a specific card to look for, since you don't really indicate you're doing much gaming or other 3D work, I'd probably go along with 'cheapest card on ebay that supports 4K and HEVC' as a goal - look for something from the Rx 4xx or Rx 5xx (Polaris) lines, or maybe even a Fury or Vega. If nVidia is able to be in the consideration, there's usually a lot of cheap GeForce 900s available - something like GTX 950 should be enough.

To EdGr's point: I agree. If you want a Polaris 'pro' card you'd need to move up the line, like WX 7100. That said, if you aren't really doing gaming or heavy 3D work, why spend the money or deal with the size/cooling/power requirements of the beefier GPU? Generally the 'latest and greatest' in terms of video support, output, etc is available on more 'entry level' parts these days (e.g. GeForce GT 1030 is more or less a match for GTX 1080, until you fire up a game or CUDA app).
 
1 member found this post helpful.
Old 06-21-2020, 06:48 AM   #4
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 134

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by EdGr View Post
I Googled both of the cards you are considering.

The Radeon RX570 is much better than the Radeon Pro WX 2100. The former has a 256-bit memory interface and consumes 150W, which are typical of enthusiast graphics cards. I aim for those kind of specs. The latter is very entry-level: 64-bit memory interface and 35W power consumption.
Thanks. It looks like I need a specific niche use to warrant the Pro card. My only other thought was power draw as this box is on 24/7. However, from further googling it seems that cards control their idle power draw such that I should not be concerned.


Quote:
BTW, my six-year-old Radeon R7 265 drives my 4K monitor very well for desktop use. Desktop apps tend to do rendering on the CPU.
Ed
That is interesting. My old machine, which used to be my main machine, has, I think, an RX440. However, 'RX440' does not seem to come up in searches. It's not showing on the PCI bus at the moment; I'll need to investigate, as that might do the job. However, the one in this machine does not appear to be up to it:
0a:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Turks PRO [Radeon HD 6570/7570/8550]
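(For pinning down exactly which cards are present, here is a rough sketch - it only assumes pciutils; the [vendor:device] IDs in the output can be matched against the Wikipedia lists mentioned elsewhere in the thread:)
Code:
# list all display adapters with their PCI IDs and the driver currently bound
lspci -nnk | grep -A 3 -E "VGA|3D|Display"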
 
Old 06-21-2020, 06:49 AM   #5
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 134

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by obobskivich View Post
Given your usage, I would say 'it does not matter' - with one caveat (will get to it later)...
Wow! I'll respond appropriately after I've digested this a little more (and digested some lunch).
 
Old 06-21-2020, 08:46 AM   #6
EdGr
Member
 
Registered: Dec 2010
Location: California, USA
Distribution: I run my own OS
Posts: 982

Rep: Reputation: 465
Quote:
Originally Posted by petejc View Post
That is interesting. My old machine which I mainly use to main machine has, I think, a RX440. However, 'RX440' does not seem to come up in searches. It's not showing on the PCI bus at the moment. I'll need to investigate as that might do the job. However, the one in this machine does not appear to be up to it:
0a:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Turks PRO [Radeon HD 6570/7570/8550]
That card has PCIe 2.0. Check whether your motherboard supports PCIe 3.0. If not, you may be better off getting a new computer.
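(Two rough ways to check, as a sketch - it assumes dmidecode and pciutils with root access; the first asks the motherboard what its slots are, the second shows what the card and slot have actually negotiated, where 8GT/s corresponds to PCIe 3.0 and 5GT/s to PCIe 2.0:)
Code:
sudo dmidecode -t slot | grep -E "Designation|Type"
sudo lspci -vv -s 0a:00.0 | grep -E "LnkCap:|LnkSta:"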
Ed
 
1 member found this post helpful.
Old 06-21-2020, 09:26 AM   #7
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 134

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by EdGr View Post
That card has PCIe 2.0. Check whether your motherboard supports PCIe 3.0. If not, you may be better off getting a new computer.
Ed
I updated the machine a bit over a year ago; my motherboard is an X470, so that is fine.
 
Old 06-21-2020, 11:01 AM   #8
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 134

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by obobskivich View Post
Given your usage, I would say 'it does not matter' - with one caveat (will get to it later). Basically, get the Radeon or whatever consumer card satisfies your needs (nVidia GeForce/Intel GMA also included here vs Quadro/Tesla/Radeon Pro/FireGL/etc) - you aren't mentioning anything that's really tied to the 'pro' card and really I don't see any reason to spend the extra money. If you're looking at ebay and finding some pro cards available super-duper cheap, that's another story, but bear in mind they may also be super-duper old (Wikipedia maintains lists of all graphics cards made by AMD and nVidia, and you can use this to line up a 'pro' card to its 'consumer' equivalent (which is very helpful with AMD's completely insane naming schema for their 'pro' cards).
On that basis it is hard to buy the 'wrong' card, in that a large number of them should be more than adequate for my needs, though I could spend money needlessly.

Quote:
As far as 4K output/video outputs in general, it depends on what the monitor/display supports - the reason for newer consumer cards to have HDMI is for HDMI 2.0, which supports 4K60 among other resolutions (essentially 'just as good' as DisplayPort apart from MST, which almost nothing uses). In my experience, I'd rather have an HDMI/DVI port available, as DP cabling tends to be all over the place (amazing what an 'open' and 'royalty free' standard enables...) while HDMI at least has the 'certified' badging on nicer cables (what I mean specifically here is: with DP cabling I've played roulette a few times just to find a cable that will do 4K60 or 1080p144 or similar, even from the same mfgr, because there's essentially no enforcement so you get a lot of junk - with HDMI there is a lot of junk, but there's also 'HDMI certified' cables that will do what they say (yes they do cost more) and sometimes that's an easier way to go). Finally, consider the kind of display you're using: a lot of modern 4K TVs do a great job as monitors (43" is a good size, for example), but will generally only accept HDMI 2.0, not DP 1.4. A lot of 4K 'monitors' offer a range of connections, not just DP. DP->something converters can also be a grab bag, so having a proper DVI or HDMI (they can convert between themselves passively, but you cannot get HDMI 2.0 from a DVI port, or HDMI 2.0 features over a DVI connector) can be useful there as well. No modern card does analog outputs, so if you need VGA for whatever, you'll need to get a somewhat older card (there are still new enough options that are 'modern' in terms of drivers and performance, but they'll be a few years old).
OK, thank you regarding HDMI 2.0 versus DP - a good point. I had been concerned that if I went dual monitor I might end up with one running off DP and another off HDMI or DVI-D, to the detriment of that monitor. Also it seemed untidy. I'm currently running 2x 1280x1024 displays; one (the new one) is an LCD but the other is a CRT. I'd never upgraded as I usually wait for things to break, but I'm noticing that I'm running out of vertical resolution/room at 1024. Though there have been lots of good-value full-HD monitor options, they don't offer any significant advantage in the vertical dimension.

I did think at one point that a single 4K monitor at 32" was a good choice: it would give me a lot more screen area from a pixel perspective and about 50% greater resolution than I have now. But it looks like 27-28" is the sweet spot regarding value for 4K. I was also considering whether a 1440p ultrawide would be a good move: I could basically match the horizontal resolution I have now with two monitors, with 40% more height, which would be a step up. Back to 4K: at 27-28" I have enough physical space if I decide I do need to go back to a dual-display setup. I'm not convinced that a 43" monitor/TV is a good choice for my setup.

Quote:
Now regarding the above note:
- ECC memory on the graphics card, I'd basically say 'so what?' and move on unless you have some use-case that you can argue with me about (in other words, 'if you have to ask, you probably don't need it...' applies here).
- FP64 is supported everywhere, but what you're probably thinking about is 'does it run with decent performance' and the answer is basically 'no, not on anything new' because both nVidia and AMD have figured out they can use that as a segmentation feature, and chain it to their 'compute accelerator' products (which have great big pricetags) and the scientific/industrial users will fork over the cash. So if you need FP64 in OpenCL or CUDA or whatever, you can run it on your GeForce/Quadro/Radeon/Radeon Pro/FireGL/etc but if you want 1/2 or 1/3 FP64 performance you will need either A) an older card or B) a compute accelerator. For example: the original nVidia GeForce Titan cards did 1/2 FP64, as did some of the early GCN Radeon Pro and Radeon R9 stuff (some of those were 1/3 or 1/4 but still a lot better than modern 1/16 or 1/32 etc).
- Modern 'pro cards' are kind of an interesting bird - they don't really have the compute features anymore, because that's been segmented out to compute accelerators, and they're generally pretty identical to the 'consumer' cards apart from their drivers. But in Linux, you aren't usually getting the fullblown driver application with all the ISV-certifications and whatnot that you get in Windows. The one big exception here, which I only know to be 'for certain' with nVidia, is that Quadro cards will enable Mosaic (both in Linux and Windows), which allows for GPU passthru and other things, while the GeForce card will actively try to block this functionality. If memory serves, AMD has not quite 'caught up' here, so Radeon should work more properly, but I can't say that for certain (or across all generations).
I spent likely too much on my motherboard to get one that appeared to work well with IOMMU so I could do GPU passthrough. However, when I tried it one year I found it - and two GPUs in the same machine in general - to be a pain. For reasons unknown, likely while trying to recover a machine that would not boot, or booted with no usable graphics, I corrupted root, and it took a while to recover. No data was lost owing to backups, but my setup is fairly complex and that part was not backed up so well. So I decided that GPU passthrough was not something I wanted to play with on my main machine.

Quote:
Some generalized hardware recommendations, assuming you have both an x16 slot, and sufficient power:
- For 4K output/support you will need either an nVidia Maxwell 2 (GeForce 900 series) or Radeon GCN 3 (Radeon Fury series) or newer. Some older cards (especially on the AMD side) can support 4K via HDMI 1.4 or DP1.x, but will not support HEVC/h265, and their internal video scalers cannot work with 4K (e.g. Radeon R9 290X can do 4K via HDMI 1.4 and DP1.x, but cannot scale video/rendering to full 4K, instead its limited to something like 3200x1800 internally - it also has no HEVC support).
- There are a lot of 'cheap' or 'entry level' boards that will fit into the above and that will work fine in Linux (and/or Windows, if you ever need). I know Radeon has open source drivers, but I also know there tends to be a nasty lag between new hardware and full driver support (can be many months) - you can go dig through the 5700XT launch for an example. This isn't to say Radeon is bad, just bear in mind that open source means open source, for better or worse. By contrast, the nVidia drivers tend to 'just work' if you can get them installed.
That is good stuff to know. I was looking at AMD as I run Slackware, so proprietary drivers are pretty much left to the user. I also use btrfs, so I like to build my own kernels to keep a fairly recent kernel for that (as recommended). Proprietary graphics drivers seem to add yet another layer of complexity to kernel builds.
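(For what it's worth, the symbols the open-source AMD driver needs in a self-built kernel are easy to sanity-check - a sketch only; it assumes the kernel exposes /proc/config.gz, otherwise grep the .config in the build tree:)
Code:
# check that the running (or newly built) kernel has the open-source AMD driver enabled
zgrep -E "CONFIG_DRM_AMDGPU|CONFIG_DRM_AMD_DC|CONFIG_FW_LOADER" /proc/config.gz
# expect CONFIG_DRM_AMDGPU=m (or =y) and CONFIG_DRM_AMD_DC=y;
# the card's microcode also has to be present under /lib/firmware/amdgpu/ (from linux-firmware)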

Quote:
Having gone from a 2-3 monitor configuration (none higher than 1080p) to a single 4K 43" and living with both for at least a year, I'm back to a multi-head configuration with smaller monitors. Playing 'window tetris' just isn't for me. I personally cannot stand 'small' 4K monitors (e.g. 28" 4K) and I just don't understand their appeal, so I can't give you any guidance there.
I've sort of gone into this previously. 1080 vertical resolution is insufficient, so I'm likely choosing between 1440 vertical resolution and 4K. 1600 vertical resolution does not offer good value if price is a factor. What resolution are you at with your dual-monitor setup? If a dual-1440 setup would be fine for me (and we may differ in usage and views), then a cheap - or even my existing - GPU that supports dual 1440 might be no more expensive than a 4K monitor plus a new GPU.

I can't remember exactly what the card in this machine is (I have two, note) but lspci shows:
VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Turks PRO [Radeon HD 6570/7570/8550]

It seems that it supports:
DVI: 2560 x 1600 / DisplayPort: 2560 x 1600 / VGA: 2048 x 1536

(https://www.cnet.com/products/amd-ra...0-1-gb-series/)

Mine is a 2x DP, 1x DVI model (with analog VGA available on the DVI).


Quote:
As far as a specific card to look for, since you don't really indicate you're doing much gaming or other 3D work, I'd probably go along with 'cheapest card on ebay that supports 4K and HEVC' as a goal - look for something from the Rx 4xx or Rx 5xx (Polaris) lines, or maybe even a Fury or Vega. If nVidia is able to be in the consideration, there's usually a lot of cheap GeForce 900s available - something like GTX 950 should be enough.
Not much gaming or 3D, though in some ways this is a little chicken-and-egg, as this machine is not really set up for gaming.
Thanks for that, really useful. I'd not even thought of HEVC as being relevant, so that is good to know. I was aiming for AMD as the open-source drivers make sense, I understand; proprietary drivers just add complexity when updating the kernel, and I usually update kernels myself rather than use the distro one.


Quote:
To EdGr's point: I agree. If you want a Polaris 'pro' card you'd need to move up the line, like WX 7100. That said, if you aren't really doing gaming or heavy 3D work, why spend the money or deal with the size/cooling/power requirements of the beefier GPU? Generally the 'latest and greatest' in terms of video support, output, etc is available on more 'entry level' parts these days (e.g. GeForce GT 1030 is more or less a match for GTX 1080, until you fire up a game or CUDA app).
I really was wondering whether I was missing something, as you could say my system was a tad unbalanced, with a 2700X processor and a new-old-stock, 10-year-old workstation GPU. But then it depends what I am using it for. I wondered whether I was missing something in desktop acceleration that I simply had not noticed, owing to generally using older hardware on the graphics side.

With a WX7100 being £625 here (I only looked on one website) it would have to make my life very much better in some way. I can't think what that way would be. Certainly I can't think how I could use it for the workloads I would like to speed up.
 
Old 06-21-2020, 01:14 PM   #9
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 134

Original Poster
Rep: Reputation: Disabled
RX460 RIP

I can count my existing RX460 (in my old PC) out of the equation. It appears completely dead, and I am not going to risk installing it in my good machine.
 
Old 06-21-2020, 01:56 PM   #10
EdGr
Member
 
Registered: Dec 2010
Location: California, USA
Distribution: I run my own OS
Posts: 982

Rep: Reputation: 465
Quote:
Originally Posted by petejc
I did think at one point that a single 4k monitor at 32" was a good choice, it would give me a lot more screen area from a pixel perspective and about 50% greater resolution than I have now. But it looks like 27-28" is the sweet spot regarding value for 4k.
I have 28" 4K. 28" offers a good size and viewing distance for a desk. It can display two pages side-by-side with everything readable and no scrolling. After getting used to 4K, I find 1080p monitors to be cramped and blocky.

Quote:
Originally Posted by petejc
I was looking at AMD as I run Slackware, so proprietary drivers are pretty much left to the user. I also use btrfs so I like to build my own kernels to keep a fairly recent kernel for that (recommended). Proprietary graphic drivers seem to add yet another layer of complexity to kernel builds.
I do that too. Proprietary drivers are out-of-the-question. Fortunately, AMD's open-source drivers work really well.
Ed

Last edited by EdGr; 06-21-2020 at 02:22 PM.
 
1 member found this post helpful.
Old 06-22-2020, 03:11 AM   #11
mrmazda
LQ Guru
 
Registered: Aug 2016
Location: SE USA
Distribution: openSUSE 24/7; Debian, Knoppix, Mageia, Fedora, others
Posts: 5,788
Blog Entries: 1

Rep: Reputation: 2065
The newest of these is 7 years old:
Code:
# inxi -SGIxx
System:    Host: ab85m Kernel: 5.6.18-300.fc32.x86_64 x86_64 bits: 64 compiler: gcc v: 10.1.1 Desktop: KDE Plasma 5.18.5 
           tk: Qt 5.14.2 wm: kwin_x11 dm: LightDM Distro: Fedora release 32 (Thirty Two) 
Graphics:  Device-1: Intel Xeon E3-1200 v3/4th Gen Core Processor Integrated Graphics vendor: ASUSTeK driver: i915 v: kernel 
           bus ID: 00:02.0 chip ID: 8086:0402 
           Display: x11 server: Fedora Project X.org 1.20.8 compositor: kwin_x11 driver: modesetting unloaded: fbdev,vesa 
           resolution: 1: 2560x1440~60Hz 2: 2560x1080~60Hz s-dpi: 120 
           OpenGL: renderer: Mesa DRI Intel HD Graphics (HSW GT1) v: 4.5 Mesa 20.0.7 compat-v: 3.0 direct render: Yes 
Info:      ...Shell: bash v: 5.0.17 running in: konsole inxi: 3.1.03 
# xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
Screen 0: minimum 320 x 200, current 2560 x 2520, maximum 16384 x 16384
HDMI-3 connected 2560x1080+0+0 (normal left inverted right x axis y axis) 673mm x 284mm
DP-1 connected primary 2560x1440+0+1080 (normal left inverted right x axis y axis) 598mm x 336mm
   2560x1440     59.95*+  74.92  
   2560x1080     60.00*+

# inxi -SGIxx
System:    Host: gx78b Kernel: 5.5.13-1-default x86_64 bits: 64 compiler: gcc v: 9.2.1 Desktop: Trinity R14.0.7 tk: Qt 3.5.0
           wm: Twin dm: startx Distro: openSUSE Tumbleweed 20200523
Graphics:  Device-1: Advanced Micro Devices [AMD/ATI] Caicos [Radeon HD 6450/7450/8450 / R5 230 OEM] vendor: Dell
           driver: radeon v: kernel bus ID: 01:00.0 chip ID: 1002:6779
           Display: x11 server: X.Org 1.20.8 driver: modesetting unloaded: fbdev,vesa alternate: ati resolution:
           1: 2560x1440~60Hz 2: 1920x1200~60Hz s-dpi: 120
           OpenGL: renderer: AMD CAICOS (DRM 2.50.0 / 5.5.13-1-default LLVM 10.0.0) v: 3.3 Mesa 20.0.7 compat-v: 3.1
           direct render: Yes
Info:      ... Shell: bash v: 5.0.17 running in: konsole inxi: 3.1.03
# xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
Screen 0: minimum 320 x 200, current 2560 x 2640, maximum 16384 x 16384
DVI-I-1 connected 1920x1200+0+0 (normal left inverted right x axis y axis) 519mm x 324mm
DP-1 connected primary 2560x1440+0+1200 (normal left inverted right x axis y axis) 598mm x 336mm
   2560x1440     59.95*+  74.92
   1920x1200     59.95*+

# inxi -SGIxx
System:    Host: p5bse Kernel: 4.15.0-106-generic x86_64 bits: 64 compiler: gcc v: 7.5.0 Desktop: Trinity R14.0.8 tk: Qt 3.5.0
           wm: Twin dm: TDM Distro: Ubuntu 18.04.4 LTS (Bionic Beaver)
Graphics:  Device-1: NVIDIA GF119 [NVS 310] vendor: Hewlett-Packard driver: nouveau v: kernel bus ID: 01:00.0
           chip ID: 10de:107d
           Display: server: X.Org 1.19.6 driver: modesetting alternate: fbdev,nouveau,vesa resolution: 1: 2560x1440~60Hz
           2: 1920x1200~60Hz s-dpi: 120
           OpenGL: renderer: NVD9 v: 4.3 Mesa 19.2.8 direct render: Yes
Info:      ...Shell: bash v: 4.4.20 running in: konsole inxi: 3.1.03
# xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
Screen 0: minimum 320 x 200, current 2560 x 2640, maximum 16384 x 16384
DP-2 connected 1920x1200+0+0 (normal left inverted right x axis y axis) 519mm x 324mm
DP-1 connected primary 2560x1440+0+1200 (normal left inverted right x axis y axis) 598mm x 336mm
   2560x1440     59.95*+  74.92
   1920x1200     59.95*+

# inxi -SGIxx
System:    Host: big31 Kernel: 3.16.0-10-amd64 x86_64 bits: 64 compiler: gcc v: 4.9.2 Desktop: Cinnamon 3.4.6 wm: muffin 
           dm: LightDM Distro: LMDE 3 Cindy base: Debian 9.3 stretch 
Graphics:  Device-1: Advanced Micro Devices [AMD/ATI] RV620 PRO [Radeon HD 3470] vendor: Dell driver: radeon v: kernel 
           bus ID: 01:00.0 chip ID: 1002:95c0 
           Display: x11 server: X.Org 1.16.4 driver: modesetting resolution: 1: 2560x1440~60Hz 2: 2560x1080~60Hz s-dpi: 96 
           OpenGL: renderer: Gallium 0.4 on llvmpipe (LLVM 3.9 128 bits) v: 3.0 Mesa 13.0.6 direct render: Yes 
Info:      ...Shell: bash v: 4.4.12 running in: gnome-terminal inxi: 3.1.03 
# xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
Screen 0: minimum 320 x 200, current 5120 x 1440, maximum 8192 x 8192
DisplayPort-1 connected 2560x1080+2560+0 673mm x 284mm
DisplayPort-0 connected primary 2560x1440+0+0 598mm x 336mm
   2560x1440     59.95*+  74.92  
   2560x1080     60.00*+
IOW, cheap old stuff works with FOSS and 1440 vertical resolution, unless maybe you're a serious gamer or video editor. The ATI & NVidia graphics cards represented here came from eBay for under $20USD each.

Note all four are running the upstream default DDX, modesetting.
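(To check which DDX your own X server loaded, something like this works - note the log may instead live under ~/.local/share/xorg/ when X runs rootless:)
Code:
# the driver modules X actually loaded, e.g. modesetting_drv.so
grep "Loading.*_drv\.so" /var/log/Xorg.0.log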

My GeForce 210, which I paid $30 for brand new, with VGA, HDMI & DVI outputs, only supports 1440 via dual-link DVI (or VGA?), and a maximum of 1920x1200 on HDMI. That's why I started shopping for GPUs on eBay instead of buying new.

I bought a 43" 4K LG TV once. It was capable of displaying 4 DP/HDMI inputs simultaneously. I returned it to the vendor after about 3 weeks, for several different reasons, including that HiDPI is a PITA that is readily avoided with 2K displays, but not so easily with the smaller 4K screen sizes.

Hope this helps you save money.
 
1 member found this post helpful.
Old 06-22-2020, 03:20 AM   #12
obobskivich
Member
 
Registered: Jun 2020
Posts: 596

Rep: Reputation: Disabled
Quote:
Originally Posted by petejc View Post
Thanks. It looks like I need a specific niche use to warrant the Pro card. My only other thought was power draw as this box is on 24/7. However, from further googling it seems that cards control their idle power draw such that I should not be concerned.
Modern graphics cards tend to 'idle down' when not doing much work, but I would still steer you towards something with the lowest overall wattage (TDP) that will suit your requirements. A real example: my GeForce GTX 1080 is reporting (through nvidia-smi) 48W at 'idle' (1% GPU utilisation and 752 MiB of memory used, about 1/8th) - this is driving four monitors simultaneously with no video/3D applications running (I'm running XFCE with compositing on; if I cut that back it would reduce memory usage and probably some power draw). By contrast, something like a GeForce GT 610 (which can only drive two monitors, generally no 4K, etc) is about 25W for the whole board. My point here is basically 'yes, you're right', but also there's no need to spend extra (either up-front purchase price or power budget) if you are never tapping into the performance. Hopefully that makes sense.

Also note that it is a relatively recent development for GPUs to be this power-friendly - even going back a few generations it isn't uncommon to find 100W+ idle states, so bear this in mind if you go looking around on eBay for a bargain.
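(If you want to watch this on your own card: nvidia-smi can report the draw directly on the proprietary driver, and amdgpu exposes a sensor through hwmon - the exact hwmon path varies by system, so treat this as a sketch:)
Code:
# proprietary nVidia driver
nvidia-smi --query-gpu=power.draw --format=csv
# amdgpu open-source driver (value is in microwatts)
cat /sys/class/drm/card0/device/hwmon/hwmon*/power1_average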



Quote:
That is interesting. My old machine which I mainly use to main machine has, I think, a RX440. However, 'RX440' does not seem to come up in searches. It's not showing on the PCI bus at the moment. I'll need to investigate as that might do the job. However, the one in this machine does not appear to be up to it:
0a:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Turks PRO [Radeon HD 6570/7570/8550]
AMD's naming scheme for graphics cards probably makes sense to someone who works at AMD, but otherwise is entirely arbitrary, especially in recent years - Radeon used to at least make some sense, but once they went to the R-something ### scheme it went crazy. The pro cards have never made sense between generations. Your best bet is to take that 'codename' ('Turks') and look for that - it probably exists as multiple separate SKUs across multiple 'generations' in a variety of memory sizes and output port configurations. An example off the top of my head is 'Hawaii' - which was sold as R9 290, 290X, 390, 390X, 295X2, 390X2, 395X2, along with some pro cards to boot. By contrast, R9 285 is actually a newer GPU (generationally) with better video decode, 4K support, etc but slightly lower performance. Makes total sense, right?

This isn't to say nVidia is 'much better' but there's at least a bit more logic within a single generation (e.g. GTX 1080 is safely assumed 'better' than GTX 1070, and GTX 980 is safely assumed to be a different generation). Wikipedia can be your friend here:
https://en.wikipedia.org/wiki/List_o...ocessing_units (copious use of Ctrl-F will help).
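(The least ambiguous thing to search on is usually the numeric PCI ID rather than the marketing name - a sketch, assuming pciutils; update-pciids refreshes the local pci.ids name database:)
Code:
sudo update-pciids                  # optional: refresh the local PCI ID database
lspci -nn | grep -E "VGA|Display"   # the [1002:xxxx] / [10de:xxxx] ID identifies the exact GPU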

Quote:
Originally Posted by petejc View Post
On that basis it is hard to buy the 'wrong' card in that a large number should be more than adequate for my needs, but I could spend money needlessly.
Right - the biggest 'gotcha' is 4K support. If that were off the table, you would have a lot more options from recent but somewhat more dated cards, especially entry-mid level stuff, but full 4K support is a relatively new feature for both nVidia and AMD. Bear in mind, there are some 'in-between' cards that can do 4K output, but don't support HEVC, don't support HDMI 2.0, or have some other catch that I'm explicitly not counting as 'supported' here (R9 290X being an example, as above) for simplicity's sake. If you know for certain that you don't care about h265/HEVC support, will never use HDMI 2.0 (or are fine dealing with adapters), etc that opens you up a little bit more, but cards in this category are usually older high-end models, so power draw should be considered.

Quote:
OK, thank you regarding HDMI 2.0 versus DP. A good point. I had been concerned that if I went dual monitor I might end up with one running of DP and another off of HDMI or DVI-D to the detriment of that monitor.
I've never had issues with any graphics card running 'mixed' outputs like that - my Radeon and GeForce cards do it regularly, without issue. This has been my experience for at least the last 10 years - the only 'conditionals' on this are cards that offered TV out (e.g. S-Video or YPbPr) could be weird if that was one of the monitors in a multi-head mode, and some of the Matrox cards from ~2000 tended to want all the connections the same. But that's ancient history.

One thing to consider: I'm assuming you're going 100% digital connections here - DVI, HDMI, or DP in any combination. All of these connections on modern displays and graphics cards will support HDCP. VGA and other analog connections will not support HDCP, which can cause problems with some DRM protected content (e.g. online streaming video), regardless of the GPU supporting the feature. If HDCP does not matter for you, this distinction does not matter.

Quote:
Also it seemed untidy. I'm currently running 2x 1280x1024 displays. One (the new one) is a LCD but the other is a CRT. But I'd never upgraded as I usually wait for things to break and also I'm noticing that I'm running out of vertical resolution/room at 1024. Though there have been lots of good value full HD monitor options they don't offer any significant advantage from the vertical dimension.
Generally I get where you're coming from - most LCDs over the last 15 years are somewhere between 1440x900 and 1920x1080; 2560x1600 and 2560x1440 are somewhat 'exotic' (although prices have come down a lot in the last few years), and 4K even more so. Have you ever considered 'stacked' monitors, or using a monitor in portrait mode, if you need vertical resolution? A 1080p monitor on its side is almost 2000px tall and makes a *great* reading display, as long as the viewing angles don't go nuts when you rotate it.
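(Rotation is a one-liner under X if you want to try it before buying a mount - a sketch; the output name will differ per system, and plain xrandr lists them:)
Code:
xrandr                                # list connected outputs and their modes
xrandr --output DP-1 --rotate left    # portrait; --rotate normal puts it back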

Quote:
I did think at one point that a single 4k monitor at 32" was a good choice, it would give me a lot more screen area from a pixel perspective and about 50% greater resolution than I have now. But it looks like 27-28" is the sweet spot regarding value for 4k. I was considering whether a 1440p ultrawide would be a good move, I can basically match the horizontal resolution that I have with two monitors with 40% more height. That would be a step up. Back to 4K, at 27-28" I have enough physical space if I decide I do need to go back to dual display setup. I'm not convinced that a 43" monitor/TV is a good choice for my setup.
Just my opinion/experience: I really don't understand the need for pixel pitch below ~0.220mm. I have tried most sizes/resolutions of monitors over the years - I've had 30" 1600p, 23" DCI 2K (2048x1152), 43" 4K, CRTs that go all the way up to QXGA, etc - and I consistently find myself coming back to multi-monitor 24" 1080p displays. A ~0.250mm pixel pitch gets you identical sizing to a 20.1" 1600x1200 monitor, and that's roughly what translates up to the 43" 4K. At 27-28" you're getting closer to IBM T220 density, which, again, I just do not personally understand the need for; if it works for you, great - there are a lot more affordable options than just a few years ago!

I would pass on the ultrawide or super-ultrawide (21:9 or >21:9) monitors, and here's why: basically nothing is produced for that form factor. So any video will be pillar boxed or window boxed, and many games will even have issues at that resolution. If you're just running X applications (like browsers, word processors, terminal windows, etc) the price premium doesn't seem worthwhile vs getting 2+ standard monitors and having them side by side, because you're going to be playing window tetris to size things out. Basically, I would avoid things that aren't 'standard' - 1080p, 4K, and to a lesser extent 1440p are all pretty common, and therefore will be nicely compatible with lots of stuff, even if you're scaling (e.g. 1080p video scales nicely to 4K).

Quote:

I spent likely too much on my motherboard to get one that appeared to work well with IOMMU so I could do GPU passthrough. However, when I tried that one year I found it, and two GPUs in general in the same machine to be a pain. For reasons unknown, likely trying to recover from a machine that would not boot, or booted with no usable graphics, I corrupted root and it took a while to recover that. No data was lost owing to backups, but my setup is fairly complex and that was not backed up so well. So I decided that playing around on my main machine with GPU passthrough was not something I wanted to play with on my main machine.
I've honestly never gotten GPU passthru to work myself, and from reading about it, it looks like a royal pain to deal with. That reading is also how I know about the nVidia limits - I don't know whether Radeon (the driver) imposes the same limits, but that would be worth investigating if this is a path you want to undertake. My understanding is that this 'works more easily' with official support in the card firmware/driver, as with Quadro (and I assume FireGL/Radeon Pro).

Quote:
That is good stuff to know. I was looking at AMD as I run Slackware, so proprietary drivers are pretty much left to the user. I also use btrfs so I like to build my own kernels to keep a fairly recent kernel for that (recommended). Proprietary graphic drivers seem to add yet another layer of complexity to kernel builds.
Admittedly I've never attempted nVidia drivers in Slackware - I have seen there is a package from Slackbuilds but as I said I've never tried it. See here: https://slackbuilds.org/repository/1...nvidia-driver/

FWIW the version # looks pretty up to date - my Xubuntu install shows 440.64.
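(Whichever driver route is taken, it's easy to confirm afterwards what actually got loaded - a sketch; the proprietary module is the one that reports a meaningful version string this way:)
Code:
lsmod | grep -E "nvidia|nouveau|amdgpu|radeon"
modinfo -F version nvidia    # e.g. 440.64 for the proprietary driver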

Quote:
I've sort of gone into this previously. 1080 vertical resolution is insufficient, so I'm likely looking at 1440 vertical resolution versus 4k if I don't got for the latter. 1600 vertical resolution does not give you good value if price is a variable. What resolution are you at with your dual monitor setup? If a dual 1440 set up is fine for me (and we may differ in usage and views) then if a cheap or even existing GPU supports dual 1440 that might be no more expensive than a 4k and a new GPU.
My desk has 6 1080p monitors (technically 5x 1080p and 1x 2K, but close enough) in a 3-wide x 2-high configuration. This replaced a single 43" 4K monitor and is effectively half again the total pixels. I do not have a single machine connected to all 6 displays at present, and usually only have the bottom 3 active for 'normal use' (because I just don't need the extra space a lot of the time), but if I have multiple machines running it's easier to be able to 'break up' my display space across the separate monitors.

I fully agree on 2560x1600 monitors being insanely expensive - I have no idea why the prices did not/have not fallen there. They were $1200-1600 (USD) in 2008, and that's roughly what they cost today. I don't understand it, especially when you can get a 1440p monitor for around a third of that.

As far as <4K - most graphics cards back to at least the mid-2000s will support 1600p on down, so that may be easier to achieve, but I wouldn't explicitly rule out 4K support (or HEVC support) unless your budget is under $20.

Quote:
I can't remember exactly what the card in this machine is ( I have two, note) but lspci shows
VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Turks PRO [Radeon HD 6570/7570/8550]

It seems that supports
DVI: 2560 x 1600 / DisplayPort: 2560 x 1600 / VGA: 2048 x 1536

(https://www.cnet.com/products/amd-ra...0-1-gb-series/)

Mine is a 2 DP, 1xDVI model. (with analog VGA on the DVI.)
That's pre-GCN, which is pretty dated - you can probably do better for not too many bucks if you're okay buying used hardware. Something else to note: a lot of newer cards support >2 monitors, so you may not need 2 cards depending on the monitor count.


Quote:
Not much gaming or 3d, though in some ways this is a little chicken and egg, as this machine is not really set up for gaming.
Thanks for that, really useful. I'd not even thought of HEVC as being relevant so that is useful. I was aiming for AMD as the open source drivers make sense, I understand. Proprietary drivers just add complexity when updating the kernel. I usually update kernels myself rather than use the distro one.
I think any of the 'newer' AMD cards should be a good candidate - Polaris or newer will get you HEVC, 4K support, modern HDMI and DP, and there's a lot of 'cheap' options. If the power isn't a problem (for the system's PSU), something like a Radeon RX 590 would be an easy 'slam dunk' in doing everything you could think up - probably overkill based on what you've described. One of the lower-tier 400/500 series midrange cards should be equivalent (note that I don't know for certain there aren't weird rebrands there - check Wikipedia to be sure) apart from the 3D performance.


Quote:
I really was wondering whether I was missing something, as you could say my system was a tad umbalanced with
a 2700X processor and a new old stock 10 year old workstaton GPU. But then it depends what I am using it for. If I was missing something on desktop acceleration that I simply had not noticed owing to generally using older hardware on the graphics side.

With a WX7100 being £625 here (I only looked on one website) it would have to make my life very much better in some way. I can't think what that way would be. Certainly I can't think how I could use it for the workloads I would like to speed up.
Honestly GPUs have gotten so good over the last few years, that even 'old' cards tend to be just fine unless you need some new feature (usually video decoding or display output). The other side of that coin is 'performance improvements' have been fairly gradual/minimal over the last few generations - GeForce 10 series came out in 2016, and is still 'relevant' in 2020, and the new 20 series is only a slight improvement in overall performance. The same is true for Radeon - 5700XT is only a minimal upgrade over Vega64 or Radeon VII. If you get out of the 'high end' cards, this is even less of a significant discussion because as EdGr pointed out, there is a relationship between power draw and performance, and a lot of the entry-level and mid-range cards target power savings over absolute performance - you can only do so much in a 20-50W TDP card.

Just from some quick looking at Wikipedia, something like the Radeon RX 560 would probably be a good candidate for you to consider - it should offer decent performance, is fairly modern, but not so cutting edge that you have to worry about immature drivers (at least in theory). I think it does 3 outputs total, has HEVC and 4K support, etc. If you need more performance, the RX 590 (as above) would probably be an easy upgrade, and is/was a fairly popular model so you should be able to find a range of them new or used, and with various aftermarket cooling, and documentation from people who have used them in a variety of settings.

Can see more here:
https://www.techpowerup.com/gpu-spec...n-rx-560.c2940
https://www.amd.com/en/products/graphics/radeon-rx-560

The RX 590 will basically be 'more of everything' at higher cost and higher power draw.

If you want a 'pro' card, the WX 4100 (note the X) and WX 3200 look similar to the RX 560 in terms of generation, but both appear to cost quite a bit more (I quickly checked ebay), and in absolute terms are probably slower at some 3D tasks (like games).

Finally, an off-the-wall idea to consider: if your motherboard has video outputs, there are AM4 Ryzen APUs that feature Vega graphics. You could in theory replace the 2700X with one of those, and have an all-in-one solution. I don't know if this would be a good or bad idea because I haven't followed the newer Ryzen chips very closely, so you may lose cores or some other feature to gain the GPU, but I know my older APUs are pretty competent as their own graphics cards (I have an A10-5800k in a desktop, and an A10-9620 in a laptop). One model I know of is the 'Ryzen 3400G.'

Hope some of this at least helps.
 
1 member found this post helpful.
Old 06-22-2020, 06:48 AM   #13
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 134

Original Poster
Rep: Reputation: Disabled
Firstly, I'd like to thank you, EdGr and mrmazda. This has all been very useful information.

In the end I went with a 4K monitor, an iiyama XUB2792UHSU-B1, and an RX 590 GPU. I expect whichever way I went it would have been a decent solution. The iiyama seems very good value: from reviews it appeared that if I wanted better colour fidelity for my photography, which I don't do much of nowadays, I'd have to spend considerably more, and if I were a gamer I'd have gone for a higher-end card and a faster 1440 monitor, so this seems sensible. I'll find out when it arrives. Regarding the GPU, I suspect my RX460 would have been fine were it not dead. I went down the new route as second-hand Polaris GPUs on eBay are barely cheaper than new ones, and an RX 590 is no dearer than an RX 580 and only marginally more expensive than an 8GB RX 570. But then I've fallen for the marketers, who push you to just spend that little bit more.



Quote:
Originally Posted by obobskivich View Post
Modern graphics cards tend to 'idle down' when not doing much work, but I would still steer you towards something that uses the lowest overall wattage (TDP) as will suit your requirements. Some real example: my GeForce GTX 1080 is reporting (thru NVIDIA-SMI) 48W at 'idle' (1% GPU-util and 752 MiB memory util (about 1/8th)) - this is driving four monitors simultaneously with no video/3D applications (I'm running XFCE with compositing 'on' - if I cut that back it would reduce memory usage and probably some power draw). By contrast something like a GeForce GT 610 (which can only drive two monitors, generally no 4K, etc) is like 25W for the whole board. My point here is basically 'yes you're right' but also there's no need to spend extra (either up-front purchase price or power budget) if you are never tapping into the performance. Hopefully that makes sense.

Also note that this is a relatively recent feature for GPUs to be this power friendly - even going back a few generations it isn't uncommon to find 100W+ idle states, so bear this in mind if you go looking around on ebay for a bargain.
I'm not sure what is going on on eBay right now, though current times are strange. I can't remember exactly how much I paid for my RX460 - I think it was around £55 - but they seem to be on there for £50 to £85, apart from one at £9 and some with stupidly high prices for reasons I cannot guess, unless they are hoping people will buy them in error.

I realize that the trick on eBay is patience. However, I want a working card and monitor at the same time.




Quote:

I've never had issues with any graphics card running 'mixed' outputs like that - my Radeon and GeForce cards do it regularly, without issue. This has been my experience for at least the last 10 years - the only 'conditionals' on this are cards that offered TV out (e.g. S-Video or YPbPr) could be weird if that was one of the monitors in a multi-head mode, and some of the Matrox cards from ~2000 tended to want all the connections the same. But that's ancient history.

One thing to consider: I'm assuming you're going 100% digital connections here - DVI, HDMI, or DP in any combination. All of these connections on modern displays and graphics cards will support HDCP. VGA and other analog connections will not support HDCP, which can cause problems with some DRM protected content (e.g. online streaming video), regardless of the GPU supporting the feature. If HDCP does not matter for you, this distinction does not matter.
I'm going to ditch the CRT, so analogue is irrelevant. A three-headed setup with a 4K, a 1280x1024 and a CRT would be an oddball, and frankly the CRT takes up too much space; the fact that its video lead is captive does not help with moving things around. Good point on the HDCP.

Also worth noting that the new 4K monitor is very likely cheaper than what I paid for the CRT - that's progress.

Quote:
Generally I get where you're coming from - most LCDs over the last 15 years are somewhere between 1440x900 and 1920x1080 - 2560x1600 and 2560x1440 are somewhat 'exotic' (although prices have come down a lot in the last few years), and 4K even moreso. Have you ever considered having 'stacked' monitors or using a monitor in portrait mode, if you need vertical resolution? A 1080p monitor on its side is almost 2000px tall, and makes a *great* reading display as long as the viewing angles don't go nuts when you rotate it.
I noticed that the market got stuck at 1920x1080 for years; before that, things were progressing smoothly. Not surprising, I suppose, as full HD was a high resolution when it came out, and then the price of 1920x1080 dropped so fast, likely owing to economies of scale, that any other resolution was going to be much more expensive.

Quote:
Just my opinion/experience: I really don't understand the need for pixel pitch below ~.220mm. I have tried most sizes/resolutions of monitors over the years - I've had 30" 1600p, 23" DCI 2K (2048x1152), 43" 4K, CRTs that go all the way up to QXGA, etc and I consistently find myself coming back to multi-monitor 24" 1080p displays. The ~.250mm pixel pitch will get you identical sizing to a 20.1" 1600x1200 monitor, and thats roughly what translates up to the 43" 4K. At the 27-28" you're getting closer to IBM T220 density, which again, I just do not personally understand the need for; if it works for you, great, there's a lot more affordable options than just a few years ago!
Well, I've made my mind up; it was a jump to either 4K or 1440. I did see your point about rotating a standard HD display, though, but thought it might be a bit narrow.

Quote:
I would pass on the ultrawide or super-ultrawide (21:9 or >21:9) monitors, and here's why: basically nothing is produced for that form factor. So any video will be pillar boxed or window boxed, and many games will even have issues at that resolution. If you're just running X applications (like browsers, word processors, terminal windows, etc) the price premium doesn't seem worthwhile vs getting 2+ standard monitors and having them side by side, because you're going to be playing window tetris to size things out. Basically, I would avoid things that aren't 'standard' - 1080p, 4K, and to a lesser extent 1440p are all pretty common, and therefore will be nicely compatible with lots of stuff, even if you're scaling (e.g. 1080p video scales nicely to 4K).
That is useful information. It also gives me the option to switch the second monitor to my old machine, which I currently use as (almost) a NAS. I'll probably keep my existing LCD for that.

Quote:
FWIW the version # looks pretty up to date - my Xubuntu install shows 440.64.
Though I mainly use KDE4 in Slackware, I often use XFCE, especially when KDE plays up. I find XFCE just does what you want, no more*, no less, with no fuss.

* well, it probably does a bit more, but I've not needed to check.

Quote:
My desk has 6 1080p monitors (technically 5x 1080p and 1x 2K, but close enough) in a 3 wide x 2 high configuration. This replaced a single 43" 4K monitor, and is effectively half-over again the total pixels. I do not have a single machine connected to all 6 displays presently, and usually only have the bottom 3 active for 'normal use' (because I just don't need the extra space a lot of the time), but if I have multiple machines running its easier to be able to 'break up' my display space with all the separate monitors.
OK. I bet you did not try that when all we had were CRTs! Years back I bought an ancient and probably very high-end CRT that apparently came from an architect's practice. I had to move it on immediately as it took up far too much space.

I did briefly wonder about monitors that can do picture-in-picture, or place multiple inputs side by side. But I expect you have to pay a lot for that, and unless you have a specific use, changing that configuration around is likely a pain.

Quote:

I think any of the 'newer' AMD cards should be a good candidate - Polaris or newer will get you HEVC, 4K support, modern HDMI and DP, and there's a lot of 'cheap' options. If the power isn't a problem (for the system's PSU), something like a Radeon RX 590 would be an easy 'slam dunk' in doing everything you could think up - probably overkill based on what you've described. One of the lower-tier 400/500 series midrange cards should be equivalent (note that I don't know for certain there aren't weird rebrands there - check Wikipedia to be sure) apart from the 3D performance.
Yep, went with the RX 590 in the end.

Quote:
Finally, an off-the-wall idea to consider: if your motherboard has video outputs, there are AM4 Ryzen APUs that feature Vega graphics. You could in theory replace the 2700X with one of those, and have an all-in-one solution. I don't know if this would be a good or bad idea because I haven't followed the newer Ryzen chips very closely, so you may lose cores or some other feature to gain the GPU, but I know my older APUs are pretty competent as their own graphics cards (I have an A10-5800k in a desktop, and an A10-9620 in a laptop). One model I know of is the 'Ryzen 3400G.'

Hope some of this at least helps.
Yes, it helped a lot. I've probably overcooked it with an RX 590, but I think overall I've got good value, and it does give me some headroom to play with more GPU-intensive stuff. I'll find out how it plays out when it all shows up.

On the APU front: my motherboard has no video outs. I find it quite amazing that, depending on workload, my two-year-old 2700X is being pushed quite a way down the performance rankings by processors just a generation newer. So I no longer have the latest and greatest, but this is all good for the consumer. It looks like the Ryzen 4700G, an eight-core/sixteen-thread APU, is coming, but I'd have to start again with a new motherboard to use that. I suspect it will be fantastic value if you need a powerful machine with modest graphics.
 
Old 06-22-2020, 08:46 AM   #14
EdGr
Member
 
Registered: Dec 2010
Location: California, USA
Distribution: I run my own OS
Posts: 982

Rep: Reputation: 465
Quote:
Originally Posted by petejc
In the end I went with a 4k monitor, iiyama XUB2792UHSU-B1 and an RX590 GPU.
You are going to be very happy with that setup. You will wonder how you got by without it.

Quote:
Originally Posted by obobskivich
Just my opinion/experience: I really don't understand the need for pixel pitch below ~.220mm. I have tried most sizes/resolutions of monitors over the years - I've had 30" 1600p, 23" DCI 2K (2048x1152), 43" 4K, CRTs that go all the way up to QXGA, etc and I consistently find myself coming back to multi-monitor 24" 1080p displays. The ~.250mm pixel pitch will get you identical sizing to a 20.1" 1600x1200 monitor, and thats roughly what translates up to the 43" 4K. At the 27-28" you're getting closer to IBM T220 density, which again, I just do not personally understand the need for; if it works for you, great, there's a lot more affordable options than just a few years ago!
I think that experience varies with vision.

At normal viewing distance, I can see an individual pixel on my 28" 4K (157 DPI, 0.161mm dot pitch). 4K is still not as good as human vision. The display manufacturers know that. It is not until 8K that displays will be better than human vision.
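(The arithmetic behind those numbers, for anyone who wants to plug in their own panel - plain awk, assuming only a 3840x2160 panel measured 28" on the diagonal:)
Code:
awk 'BEGIN { w=3840; h=2160; d=28;            # pixels wide/high, diagonal in inches
             dpi = sqrt(w*w + h*h) / d;
             printf "%.0f DPI, %.3f mm pitch\n", dpi, 25.4/dpi }'
# prints: 157 DPI, 0.161 mm pitch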
Ed
 
Old 06-22-2020, 04:13 PM   #15
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 134

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by EdGr View Post
You are going to be very happy with that setup. You will wonder how you got by without it.
Yes, certainly. Just concerned that whilst my home setup will be good, that is the only one I can influence...


Quote:

I think that experience varies with vision.

At normal viewing distance, I can see an individual pixel on my 28" 4K (157 DPI, 0.161mm dot pitch). 4K is still not as good as human vision. The display manufacturers know that. It is not until 8K that displays will be better than human vision.
Ed
Yes, I did the sums myself. Even with SD TV, the point was that you sized the set, at your given viewing distance, such that you could not see the lines. At the very worst with this setup, if 4K really is overcooking it, I'd just have to make the fonts and icons larger. Mind you, over 40 my eyesight did start getting worse.
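(If it does come to that, the blunt instrument under X is the font DPI - a sketch; KDE and XFCE also have their own scaling settings that may be the better place to do it:)
Code:
# scale fonts to 1.5x the default 96 DPI for the current X session
echo "Xft.dpi: 144" | xrdb -merge -
# put the same line in ~/.Xresources to make it permanent, then restart the applications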
 
  

