LinuxQuestions.org > Forums > Linux Forums > Linux - Hardware

Linux - Hardware: This forum is for Hardware issues. Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?
Old 06-22-2020, 08:55 PM   #16
obobskivich
Member
 
Registered: Jun 2020
Distribution: Xubuntu / Slackware
Posts: 35

Rep: Reputation: Disabled

Quote:
Originally Posted by petejc View Post
Firstly I'd like to thank yourself, EdGr and MrMazda. This has all been very useful information.

In the end I went with a 4k monitor, iiyama XUB2792UHSU-B1 and an RX590 GPU. I expect whichever way I went it would have been a decent solution. The Iiyama seems very good value. From reviews it appeared that if I wanted better colour fidelity for my photograph, which I don't do much of nowadays, I'd have to spend considerably more and if a gamer I'd have gone for a higher end card, and a faster 1440 monitor, so this seems sensible. I'll find out when it arrives. Regarding the GPU I suspect my RX460 would have been fine were it not dead. I went down the new route as it seems that second hand Polaris GPUs on eBay are barely cheaper than new ones and a RX590 is no dearer than an RX580 and only marginally more expensive than an 8GB RX570. But then I've fallen for the marketers who are pushing you to just spend that little bit more.
I've never had an Iiyama display myself, but I've always read good things about them. I think overall you made some good picks - RX 590 is a mature (in a good way) card and will connect to most anything. It should have enough performance to handle 4K not just for desktop use (of course) but even some games and other 3D applications (there was a time, not so long ago, where it was not 'mid range' in the performance it offered, but closer to the top of the heap). Just to clarify: RX 590 is just a slight refresh of the 580, so they're otherwise basically the same (most 590s come with somewhat higher maximum clockspeed as well (like a few %)).

Quote:

I'm not sure what is going on on eBay right now, though current times are strange. I can't remember exactly how much I paid for my RX460, I think it was around 55, but they seem to be on there for 50 to 85, apart from one at 9 and a few with stupidly high prices for reasons I cannot guess, unless they are hoping people will buy them in error.

I realize that the trick on eBay is patience. However, I want a working card and monitor at the same time.
Who knows - some years ago these Radeon cards were very popular for cryptocurrency, but I think that has mostly died down now.



Quote:

I'm going to ditch the CRT, so analogue is irrelevant. A 3 headed setup with a 4k, a 1280x1024 and a CRT would be an oddball and frankly the CRT takes up too much space. The fact that its video lead is captive does not help moving things around. Good point on the HDCP.

Also worth noting that the new 4k monitor is very likely cheaper than what I paid for the CRT, that's progress.


I noticed that the market got stuck at 1920x1080 for years; before that, things were progressing smoothly. Not surprising, I suppose, as full HD was a high resolution when it came out, and then the price of 1920x1080 panels dropped so fast, likely owing to economies of scale, that any other resolution was going to be much more expensive.


Well, I made my mind up. I had to jump to 4K or 1440. I did see your point about rotating a standard HD display, though, but thought it might be a bit narrow.


That is useful information. It also gives me the option to switch the second monitor to my old machine that I currently use as (almost) a NAS. I'll probably keep my existing LCD for that.


Though I mainly use KDE4 in Slackware, I often use XFCE, especially when KDE plays up. I find XFCE just does what you want, no more*, no less, with no fuss.

* well, it probably does a bit more, but I've not needed to check.
I would agree with most of this - why I don't use CRTs anymore, XFCE being nice, etc.

Quote:
Ok. Bet you did not try that when all we had were CRTs! Years back I bought an ancient and probably very high-end CRT that apparently came from an architects' practice. I had to move it on immediately as it took up far too much space.

I did briefly wonder about monitors that could do picture-in-picture, or place multiple pictures side by side. But I expect that you have to pay a lot for that, and unless you have a specific use, changing that configuration around is likely a pain.
'Back in the day' I had 4x CRT in a 4-wide configuration at my desk, it worked quite well. They were 17" models, and I tended to run them at 1152x864 @ 70Hz on the desktop, and whatever resolution the computer(s) could handle for 3D. I miss the 'easier' variable resolution on CRTs, but otherwise I like not dealing with the depth and weight. I also, more recently, had a big 21" Sun CRT integrated with the LCDs, but it (sadly) failed a few years ago, after probably 20 years of overall service.

Picture-in-picture is an odd feature for a computer monitor - a lot of TVs and monitors I've seen with the feature don't tell you that the PIP window is also down-scaled in resolution. I've seen some newer LG monitors that can do it at native resolution on 4K or Ultrawide panels (you give up a good portion of the 'main' display too), which may be useful, but the 'standard' PIP feature is usually unusable with the second signal because of how much down-rezzing takes place (text is very hard to read IME).


Quote:
Yes, it helped a lot. I've probably overcooked it with an RX590, but I think overall I've got good value. And that does give me some space to play with more GPU-intensive stuff. I'll find out how it plays out when it all shows up.
It will probably be just fine - even if it spends most of its life idling, that's fine too. But you've got some extra processing power on-tap for whatever you may want to try - GPGPU, gaming, video encode/decode, etc - never hurts to have 'a little extra' like that.

Quote:
On the APU front: my motherboard has no video outs. I find it quite amazing that, dependent on workload, my two-year-old 2700X is being pushed quite a way down the performance rankings by processors just a generation newer. So I no longer have the latest and greatest, but this is all good for the consumer. It looks like the Ryzen 4700G, an eight-core/sixteen-thread APU, is coming. But I'd have to start again with a new motherboard to use that. I suspect it is fantastic value if you need a powerful machine with modest graphics.
I've seen those 'new benchmarks show huge improvements' tests too - and my (older than yours) AMD processors still keep going just fine with the programs, OSes, etc. that I actually want to run. I think we're starting to get to the place with 'general purpose' benchmarks that we arrived at years ago with 3D benchmarks - because processors and OSes and so forth have gotten so good, generally speaking, the synthetics have to 'keep up' to be able to show any degree of substantial difference. With 3D benchmarks I remember back to the mid-late 2000s when 3DMark, Unigine, etc. became so much more complex and 'heavy' than most real-world games as to be worthless in helping predict whether a real-world game would be a good candidate for the machine (because generally, if the machine could run 3DMark or Unigine or whatever 'very well', there was no question about the game - it would run maxed out).



Quote:
Originally Posted by EdGr View Post
I think that experience varies with vision.

At normal viewing distance, I can see an individual pixel on my 28" 4K (157 DPI, 0.161mm dot pitch). 4K is still not as good as human vision. The display manufacturers know that. It is not until 8K that displays will be better than human vision.
Ed
I'm not trying to be 'hostile' here, just wanted to add:

I think you're right, but I've also heard arguments like this before - 'some day high enough resolution will negate the need for anti-aliasing, will exceed the capacity of human vision, just in the next upgrade it will come!' - and I've seen that target be punted from UXGA to QXGA to 9MP to Full HD to 4K to 8K and so on over the years. I'm writing this reply on a 17" laptop with a 1080p display (around 130 ppi), and just as you say, I 'can see individual pixels' as well. What I'm getting at though is: 8K monitors have been out for some time (Dell makes one, for example), and I will not be surprised, once 8K becomes more mainstream, if we hear 'It is not until 16K that displays will be better than human vision.' My skepticism there is more towards the makers of such displays - they obviously have an interest in selling us more monitors, so why wouldn't they tout their latest as 'the best ever?'

So I think you're onto something with the 'personal' aspect too - it's really a combination of use-case, workflow, preference, and personal needs as to what will work best. For me, I just cannot get my head around why I would want to go back to T221-style high DPI and having to upscale/uprez everything just to be compatible (I remember this with 21" CRTs, for what it's worth). I've played around with the newer 5K iMacs at various places, and sure, they look very, very nice with text documents (your example of side-by-side text documents being a really good one), but whenever you're switching to 3D there's probably scaling going on (as they often lack the power to do much at 5K), and any video content will assuredly be scaled as well.

So none of that stuff is 'benefiting' from the fancy display, and what they can output at 5K is being DPI-scaled so that it isn't super tiny on-screen, which means the 'working area' is probably about similar to the older 27-30" 1440-1600p monitors (in terms of how many actual inches across a window will be, for example), just at higher DPI. Fortunately, a lot of this stuff has become affordable and well supported enough over the last decade that we can even have the debate of 'what should I get?' There was a time when 'high DPI' was basically the exclusive domain of institutional users with expense accounts (if memory serves, the T221 ran something like $12,000 US when introduced, and I know those hi-rez Sun and Sony 21" CRTs were into the $1000+ range as well).

* If someone is curious about the T221 statement, this is the kind of thing I'm talking about (and yes I'm well aware modern computers with more modern OS and so forth do a 'better job' with this):
https://en.wikipedia.org/wiki/IBM_T2...e:IBM_T221.jpg

That's a standard 80x24 terminal window - and that's a 22" monitor (with 9MP resolution - slightly higher than 4K owing to its 16:10 aspect ratio). The picture of course is not a perfect demonstration of this, but you get the idea.
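For anyone who wants to check the pixel-density figures quoted in this thread, PPI falls straight out of the resolution and the diagonal size; a quick sketch:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from pixel dimensions and diagonal size in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# 28" 4K monitor, as discussed above
print(round(ppi(3840, 2160, 28)))   # 157
# 22" IBM T221 (3840x2400, 16:10)
print(round(ppi(3840, 2400, 22)))   # 206
# 17" laptop at 1080p
print(round(ppi(1920, 1080, 17)))   # 130
```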
 
1 member found this post helpful.
Old 06-23-2020, 02:45 AM   #17
mrmazda
Senior Member
 
Registered: Aug 2016
Location: USA
Distribution: openSUSE, Debian, Knoppix, Mageia, Fedora, others
Posts: 2,519

Rep: Reputation: 829
Quote:
Originally Posted by obobskivich View Post
the 'standard' PIP feature is usually unusable with the second signal because of how much down-rezzing takes place (Text is very hard to read IME).
I've been a PIP (and later PBP/POP) user for 32 years. I seriously doubt PIP as currently implemented on PC displays carries the same expected use as it had when invented for TV. IMO current implementations are more likely to be used with cameras to detect motion, such as whether the 3-month-old is still asleep or squirming for a fresh diaper, or seeing who is at the doorbell. I'm still using a 12-year-old TV because I can't find a new one that both fits the space I have available and has PIP and POP, or 8 total video inputs, as my old one has. I can't remember how many PC displays I have with PIP or PBP, because other than when checking that functionality when acquired, I've never found the functional loss of resolution useful. I think the count is 3 out of 7 larger than 19" (not counting TVs, all of which do have PIP and/or POP).
 
1 member found this post helpful.
Old 06-23-2020, 04:53 PM   #18
petejc
Member
 
Registered: Apr 2019
Distribution: Slackware
Posts: 72

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by obobskivich View Post
I've never had an Iiyama display myself, but I've always read good things about them. I think overall you made some good picks - RX 590 is a mature (in a good way) card and will connect to most anything. It should have enough performance to handle 4K not just for desktop use (of course) but even some games and other 3D applications (there was a time, not so long ago, where it was not 'mid range' in the performance it offered, but closer to the top of the heap). Just to clarify: RX 590 is just a slight refresh of the 580, so they're otherwise basically the same (most 590s come with somewhat higher maximum clockspeed as well (like a few %)).
The old CRT was an Iiyama Vision Master Pro. I did not enjoy carrying that down the stairs.

Quote:
Who knows - some years ago these Radeon cards were very popular for cryptocurrency, but I think that has mostly died down now.
You don't hear as much about it. It looks like Bitcoin is just under $10,000 right now, and the difficulty has been jumping up and down for a while. I've not really tried it, as when I looked I needed faster hardware, and I was not going to buy hardware specially.


Quote:
I would agree with most of this - why I don't use CRTs anymore, XFCE being nice, etc.

Picture-in-picture is an odd feature for a computer monitor - a lot of TVs and monitors I've seen with the feature don't tell you that the PIP window is also down-scaled in resolution. I've seen some newer LG monitors that can do it at native resolution on 4K or Ultrawide panels (you give up a good portion of the 'main' display too), which may be useful, but the 'standard' PIP feature is usually unusable with the second signal because of how much down-rezzing takes place (text is very hard to read IME).
I'd seen reviews of monitors that featured it on YouTube. For example, ultrawides that could put two full HD pictures from different sources side by side, or 4K monitors that could present four HD pictures from four sources. So no downscaling. I was just curious.

Quote:
It will probably be just fine - even if it spends most of its life idling, that's fine too. But you've got some extra processing power on-tap for whatever you may want to try - GPGPU, gaming, video encode/decode, etc - never hurts to have 'a little extra' like that.
I thought it too late in the evening to swap the GPU (must be getting old), so I plugged the 4K monitor into the old Turks-based GPU. It is driving the monitor at 4K, higher than its maximum specified resolution. However, there is some noise/tearing on the right-hand side of the screen; I hope it is the GPU and not the monitor. BTW, don't try a dual-head setup with a 4K monitor and a 1280x1024 one. It just looks a mess.
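For anyone wanting to experiment with a mixed-resolution dual-head layout like that, xrandr can scale the smaller panel so the two desktops line up less jarringly. A sketch (the output names `DP-1` and `DVI-0` are placeholders; check `xrandr --query` for yours):

```shell
# List connected outputs and the modes each one advertises
xrandr --query

# Run the 4K panel at its native mode...
xrandr --output DP-1 --mode 3840x2160

# ...and scale the 1280x1024 panel up 2x so window sizes roughly match
# the 4K panel's pixel density (text gets blurrier, but proportionate):
xrandr --output DVI-0 --mode 1280x1024 --scale 2x2 --right-of DP-1
```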

Quote:

I've seen those 'new benchmarks show huge improvements' tests too - and my (older than yours) AMD processors still keep going just fine with the programs, OSes, etc. that I actually want to run. I think we're starting to get to the place with 'general purpose' benchmarks that we arrived at years ago with 3D benchmarks - because processors and OSes and so forth have gotten so good, generally speaking, the synthetics have to 'keep up' to be able to show any degree of substantial difference. With 3D benchmarks I remember back to the mid-late 2000s when 3DMark, Unigine, etc. became so much more complex and 'heavy' than most real-world games as to be worthless in helping predict whether a real-world game would be a good candidate for the machine (because generally, if the machine could run 3DMark or Unigine or whatever 'very well', there was no question about the game - it would run maxed out).
For me it is difficult to leverage this power in, say, Python or PHP. Trying to create anything multi-threaded just hits a cliff face of complexity. I presume using OpenCL for the GPU power has a similar vertical learning curve.
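For what it's worth, the cliff face has gotten a little less sheer in recent Python versions; a minimal sketch using the standard-library `concurrent.futures` module to spread a CPU-bound task across cores (the `busy_work` function is just a hypothetical stand-in):

```python
from concurrent.futures import ProcessPoolExecutor

def busy_work(n: int) -> int:
    """Stand-in CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [100_000, 200_000, 300_000, 400_000]
    # Each input runs in its own process, sidestepping the GIL,
    # so an eight-core CPU can actually chew on all of them at once.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(busy_work, inputs))
    print(len(results))
```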


Quote:
I'm not trying to be 'hostile' here, just wanted to add:

I think you're right, but I've also heard arguments like this before - 'some day high enough resolution will negate the need for anti-aliasing, will exceed the capacity of human vision, just in the next upgrade it will come!' - and I've seen that target be punted from UXGA to QXGA to 9MP to Full HD to 4K to 8K and so on over the years. I'm writing this reply on a 17" laptop with a 1080p display (around 130 ppi), and just as you say, I 'can see individual pixels' as well. What I'm getting at though is: 8K monitors have been out for some time (Dell makes one, for example), and I will not be surprised, once 8K becomes more mainstream, if we hear 'It is not until 16K that displays will be better than human vision.' My skepticism there is more towards the makers of such displays - they obviously have an interest in selling us more monitors, so why wouldn't they tout their latest as 'the best ever?'

So I think you're onto something with the 'personal' aspect too - it's really a combination of use-case, workflow, preference, and personal needs as to what will work best. For me, I just cannot get my head around why I would want to go back to T221-style high DPI and having to upscale/uprez everything just to be compatible (I remember this with 21" CRTs, for what it's worth). I've played around with the newer 5K iMacs at various places, and sure, they look very, very nice with text documents (your example of side-by-side text documents being a really good one), but whenever you're switching to 3D there's probably scaling going on (as they often lack the power to do much at 5K), and any video content will assuredly be scaled as well.

So none of that stuff is 'benefiting' from the fancy display, and what they can output at 5K is being DPI-scaled so that it isn't super tiny on-screen, which means the 'working area' is probably about similar to the older 27-30" 1440-1600p monitors (in terms of how many actual inches across a window will be, for example), just at higher DPI. Fortunately, a lot of this stuff has become affordable and well supported enough over the last decade that we can even have the debate of 'what should I get?' There was a time when 'high DPI' was basically the exclusive domain of institutional users with expense accounts (if memory serves, the T221 ran something like $12,000 US when introduced, and I know those hi-rez Sun and Sony 21" CRTs were into the $1000+ range as well).

* If someone is curious about the T221 statement, this is the kind of thing I'm talking about (and yes I'm well aware modern computers with more modern OS and so forth do a 'better job' with this):
https://en.wikipedia.org/wiki/IBM_T2...e:IBM_T221.jpg
That's a standard 80x24 terminal window - and that's a 22" monitor (with 9MP resolution - slightly higher than 4K owing to its 16:10 aspect ratio). The picture of course is not a perfect demonstration of this, but you get the idea.
I wonder if we will hit a limit where it really does not matter? On the non-enthusiast end we've probably hit that already. Perhaps we are there now: physical size is what is key, and resolution an afterthought for the few who are concerned about it. OTOH, such people just lag the latest and greatest. With this monitor I'm finding I need to make fonts and icons larger, but they look really nice. I followed your link; I think I've seen that monitor reviewed on YouTube. I think the xterm on this monitor looks similar.
 
Old 06-24-2020, 09:48 AM   #19
obobskivich
Quote:
Originally Posted by petejc View Post
The old CRT was a Iiyama Vision Master Pro. I did not enjoy carrying that down the stairs.
And I do remember reading about their CRTs as well - certainly a company that's been around for a while, if that counts for anything. Moving big CRTs is definitely something I was glad to 'leave behind' with the transition to LCDs...

Quote:

You don't hear as much about it. It looks like Bitcoin is just under $10,000 right now, and the difficulty has been jumping up and down for a while. I've not really tried it, as when I looked I needed faster hardware, and I was not going to buy hardware specially.
I think the complexity got beyond what a GPU could do, which is why the interest in GPUs faded - but I could be wrong on that. I'm just glad things seem to have 'settled' as it makes hardware availability better (there was a time where getting GPUs and some other components was interesting, to say the least).


Quote:
I'd seen reviews of monitors that featured it on YouTube. For example, ultrawides that could put two full HD pictures from different sources side by side, or 4K monitors that could present four HD pictures from four sources. So no downscaling. I was just curious.
I've also seen demos and reviews of those monitors, but never owned one myself - the PIP features I've experienced are either on TVs, or very old monitors (e.g. the Gateway XHD3000 had a PIP feature, which I think was effectively like a 720x480 viewport but could accept up to 1080p for scaling) and it was mostly a 'gimmick' for me. But with the non-scaled approach it does look pretty slick in a demo - especially those LGs that can roll up 4 sources into a single 4K output, since that's essentially a 'baby video wall' with no bezels (if it works as advertised).

Quote:
I thought it too late in the evening to swap the GPU (must be getting old), so I plugged the 4K monitor into the old Turks-based GPU. It is driving the monitor at 4K, higher than its maximum specified resolution. However, there is some noise/tearing on the right-hand side of the screen; I hope it is the GPU and not the monitor. BTW, don't try a dual-head setup with a 4K monitor and a 1280x1024 one. It just looks a mess.
Interesting. From what I understand the older pre-GCN AMD GPUs' video controller/scaler does not support 4K, which is why so many older Radeon and FirePro cards cannot do 4K. But I don't have a card that's in that 'in-between' generation to test/play with (I jumped from Radeon 4800 to Radeon R9 290 - skipping some 3-4 generations at the time). If it works, I would expect it probably is doing it at a relatively low refresh rate (this seems to be the case with many older nVidia cards doing 4K at least - they will run it, but at 24 or 30 Hz), maybe that's where the tearing is from? (24Hz is usually pretty awful for 'desktop work' IME)
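The 24/30 Hz ceiling on older cards usually traces back to pixel-clock/link bandwidth rather than the panel itself. A back-of-envelope sketch (the ~20% blanking overhead is an approximation in the spirit of CVT timings; exact figures vary by mode):

```python
def approx_pixel_clock_mhz(w: int, h: int, hz: float,
                           blanking_overhead: float = 1.2) -> float:
    """Very rough pixel clock estimate in MHz, including blanking."""
    return w * h * hz * blanking_overhead / 1e6

# 4K at 30 Hz squeaks under HDMI 1.4's ~340 MHz TMDS clock limit...
print(round(approx_pixel_clock_mhz(3840, 2160, 30)))  # ~299
# ...but 4K at 60 Hz needs roughly double, hence HDMI 2.0/DisplayPort
print(round(approx_pixel_clock_mhz(3840, 2160, 60)))  # ~597
```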




Quote:
I wonder if we will hit a limit where it really does not matter? On the non-enthusiast end we've probably hit that already. Perhaps we are there now: physical size is what is key, and resolution an afterthought for the few who are concerned about it. OTOH, such people just lag the latest and greatest. With this monitor I'm finding I need to make fonts and icons larger, but they look really nice. I followed your link; I think I've seen that monitor reviewed on YouTube. I think the xterm on this monitor looks similar.
I remember hearing years ago that the average user notices changes in size much more readily than resolution - something like most users would prefer a 100" projector playing VHS or DVD at ~200-300 effective lines over a 20" UXGA display with native content. I don't know if that has any basis in reality, though, especially when a lot of 'average users' are probably using smartphones or other mobile devices, which tend to have VERY high DPI - and when higher resolution, rather than ever-increasing size, seems to be one of the 'big selling points' for new mobile devices. I also notice this trend with laptops (which haven't really surpassed the 16-17" size, but have increased resolution quite a bit over the last 15 years - I think you can even have 4K at that size - and I still remember 16" laptops with 1280x800 displays). So maybe it is just that 'desktop monitors' are lagging - who knows? And just to clarify - I have no problem with ever-higher resolutions; it's more the idea of super-high DPI with no scaling that really 'bugs' me - the 4K/43" thing was absolutely fantastic until I needed to have multiple concurrent sources/devices with different connectivity.

On the 'other side' I know I've read about a lot of teething-pains with trying to transition to 8K - both in terms of the bandwidth required to connect it up, but also in terms of the computational requirements on the other end (e.g. gaming or other 3D work at 8K is probably still a ways off). So that may end up being a 'limit' regardless of what the display tech itself can do (I'm also thinking about how long 9MP and 4K formats existed prior to becoming viable as 'typical' data displays - that IBM dates back to the early 2000s, for example).

Last edited by obobskivich; 06-24-2020 at 09:53 AM.
 
Old 06-24-2020, 10:39 AM   #20
EdGr
Member
 
Registered: Dec 2010
Location: California, USA
Distribution: I run my own OS
Posts: 374

Rep: Reputation: 165
Quote:
Originally Posted by petejc View Post
I wonder if we will hit a limit where it really does not matter? On the non-enthusiast end we've probably hit that already. Perhaps we are there now: physical size is what is key, and resolution an afterthought for the few who are concerned about it. OTOH, such people just lag the latest and greatest. With this monitor I'm finding I need to make fonts and icons larger, but they look really nice. I followed your link; I think I've seen that monitor reviewed on YouTube. I think the xterm on this monitor looks similar.
Displays and video will follow the same path as digital audio did 2-3 decades earlier.

CD-quality audio (16-bits sampled at 44.1kHz) is good enough for most people. Attempts to introduce high-resolution audio formats (SACD and DVD-Audio) failed in the early 2000s. Today, resolutions better than 16-bit/44.1kHz are used in recording studios, and the results are downsampled for consumers.
Ed
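EdGr's 'good enough' point about CD audio can be put in numbers: 44.1 kHz sampling captures everything up to its Nyquist frequency, just past the ~20 kHz limit of human hearing, and each bit of sample depth buys roughly 6 dB of dynamic range. A sketch of the arithmetic:

```python
import math

def nyquist_khz(sample_rate_hz: float) -> float:
    """Highest representable frequency (kHz) for a given sample rate."""
    return sample_rate_hz / 2 / 1000

def dynamic_range_db(bits: int) -> float:
    """Theoretical SNR of ideal PCM quantization: ~6.02*bits + 1.76 dB."""
    return 20 * math.log10(2 ** bits) + 1.76

print(nyquist_khz(44_100))           # 22.05 - above the ~20 kHz hearing limit
print(round(dynamic_range_db(16)))   # ~98 dB - beyond most listening rooms
print(round(dynamic_range_db(24)))   # ~146 dB - useful headroom in studios
```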
 
Old 06-24-2020, 11:25 AM   #21
mrmazda
Quote:
Originally Posted by obobskivich View Post
...with the non-scaled approach it does look pretty slick in a demo - especially those LGs that can roll up 4 sources into a single 4K output, since that's essentially a 'baby video wall' with no bezels (if it works as advertised).
I had a 43" LG for almost a month. All four inputs output at once worked fine for me.

It didn't know what black meant. Brightness and contrast were inexplicably poor, but the major problem for me was that the required remote control used various Vizio codes, incompatible with the two irreplaceable Vizios already in the room. The second Vizio has a Bluetooth remote, so no interference between those two.
 
Old 06-25-2020, 09:03 AM   #22
obobskivich
Quote:
Originally Posted by mrmazda View Post
I had a 43" LG for almost a month. All four inputs output at once worked fine for me.

It didn't know what black meant. Brightness and contrast were inexplicably poor, but the major problem for me was that the required remote control used various Vizio codes, incompatible with the two irreplaceable Vizios already in the room. The second Vizio has a Bluetooth remote, so no interference between those two.
I'm curious - what do you mean by 'it didn't know what black meant'?

I can completely understand the issue with the remote - I remember that being one of my chief issues with the XHD3000 years ago as well. Monitors that require remotes are a bit of an odd duck to be sure.
 
Old 06-25-2020, 09:30 AM   #23
mrmazda
The only way to see a black screen was to turn it off, as if there was no black level. "Black" was always just dark gray.
 
Old 06-25-2020, 10:11 PM   #24
obobskivich
Quote:
Originally Posted by mrmazda View Post
The only way to see a black screen was to turn it off, as if there was no black level. "Black" was always just dark gray.
Ah - here I had thought that was mostly a 'thing of the past' with modern LCDs. That's unfortunate to hear.
 
Old 06-29-2020, 07:58 AM   #25
petejc
Original Poster
Just to let you know, it is working fine. However, to get it working I ended up reinstalling the OS, after kernel updates and various other guides did not do the trick; with slackware-current it just worked. I did, however, hose a couple of my filesystems trying to sort out an unrelated issue.

The extra screen area is appreciated - I'm just getting used to it. With hindsight, however, I think I should have listened to the suggestion of getting a larger monitor. Yes, I could get a second 28" monitor on my desk. However, the 28" 4K display is only about 2" taller than my old 1280x1024. So I have more than double the resolution, but the vertical size is not much greater. The picture, however, is very clear.
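That 'only about 2 inches taller' observation checks out if you run the aspect ratios through. A quick sketch (the 19" size for the old 5:4 monitor is an assumption, since it wasn't stated in the thread):

```python
import math

def panel_height_in(diagonal_in: float, aspect_w: int, aspect_h: int) -> float:
    """Physical height of a panel from its diagonal and aspect ratio."""
    return diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

h_4k = panel_height_in(28, 16, 9)    # ~13.7" tall
h_sxga = panel_height_in(19, 5, 4)   # ~11.9" tall
print(round(h_4k - h_sxga, 1))       # ~1.9" - about 2 inches taller
```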
 
Old 06-29-2020, 11:58 AM   #26
EdGr
The top of the screen should be at or slightly below eye level. This is standard ergonomics.

Incidentally, monitor manufacturers seem to be engaged in a "brightness war". I turn down the brightness to near minimum.
Ed
 
Old 06-29-2020, 01:23 PM   #27
petejc
Original Poster
Quote:
Originally Posted by EdGr View Post
The top of the screen should be at or slightly below eye level. This is standard ergonomics.

Incidentally, monitor manufacturers seem to be engaged in a "brightness war". I turn down the brightness to near minimum.
Ed
If I sit bolt upright, my eyes are about level with the screen top, partly as a result of a monitor 'shelf' I made to give me more desk space. I will probably attach a VESA arm to this, which will give me an opportunity to adjust things a little.

Provided it is not compulsory, I could see it being useful. However, I get your point WRT eyestrain. There is a splash screen on starting up this monitor, which thankfully you can turn off. Otherwise you need to be wearing a welding visor just to turn the thing on.
 
Old 06-30-2020, 11:27 PM   #28
obobskivich
Quote:
Originally Posted by EdGr View Post
Incidentally, monitor manufacturers seem to be engaged in a "brightness war". I turn down the brightness to near minimum.
The HDR standard requires fairly high peak brightness from modern displays, especially if they are seeking HDR1000 certification. This doesn't mean they will run at this peak brightness continuously (generally that generates too much heat), but that is a factor in why many modern displays are capable of being so 'bright' (and I would suspect even models that aren't seeking HDR certification, still provide high peak brightness levels to attain better contrast ratio #s). High refresh rate displays with backlight strobing (like BenQ's DyAc) also tend to need higher peak brightness to compensate for the feature. For 'standard desktop use' this is all generally overkill, but there are other use-cases where HDR or 'just' good DCR support is actually worthwhile (like video). I agree with you on turning down brightness (I think most of my monitors are set somewhere between 5 and 10...out of 100) - and usually HDR/DCR/strobing features can be turned off as well, fortunately.
 
  


Tags: gaming, gpu, workstation