Old 08-25-2023, 09:34 PM   #1
ordealbyfire83
Member
 
Registered: Oct 2006
Location: Leiden, Netherlands
Distribution: LFS, Ubuntu Hardy
Posts: 305

Rep: Reputation: 89
How to get better contrast out of Intel GPUs?


Is it at all possible to increase the contrast on a laptop with integrated Intel graphics (probably GM45)? I have a Lenovo Thinkpad T500 and have done some measurements with a colorimeter. The physical LCD panel is an LG-Philips; I forget the actual model number and would have to take the case apart again to read the label. The colorimeter reports a contrast of around 50:1. But if I take the panel out and put it in a Dell laptop with an NVIDIA GPU, the colorimeter shows 245:1.

It seems laptop manufacturers simply pass off the LCD manufacturer's theoretical specs as the contrast figure, instead of saying what will actually be achieved in that particular laptop.

I've done several measurements at different brightness levels. The measured contrast stays the same, but the eye perceives it to be better at higher brightness. It is not: if the backlight raises the white and black levels in proportion, the ratio between them does not change. The problem is that the black point is simply too bright. As a result it is really not feasible to look at digital photographs, and I have to plug in an external display for that purpose.

It seems that the GPU drives the backlight with pulse-width modulation. Does anyone know of any hack for contrast using intel_reg?

At first I thought it was just my laptop, but I have now checked three T500s and a Thinkpad Edge. They all do the same, although the Edge is even worse: it shows a contrast of 44:1 with a white point of 7300 K. Seriously, no photograph could ever look good on that.
 
Old 08-26-2023, 09:03 AM   #2
business_kid
LQ Guru
 
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,632

Rep: Reputation: 2635
That's an anything-but-new laptop, and LCD is the worst form of screen technology. Possible causes include:
  • An aging display, especially with LCD.
  • Complications from aging hardware.
  • Power limits if it's running on battery only.
About your only adjustment is to run it on the external power supply and set the brightness & contrast for best effect. If you connect an external monitor and that looks good, it's a screen problem. I have retired a better laptop - better CPU, better but still sucky Intel integrated graphics. At least the screen works as well as it ever did.

I'd also suggest an attitude correction. You have a laptop for situations where you don't have the space or mains supply for a desktop. It's not perfect. If you have the means and want to, upgrade the thing. If not, don't spend too much time trying to make it something it isn't.

Last edited by business_kid; 08-26-2023 at 09:07 AM.
 
Old 08-26-2023, 09:21 AM   #3
Emerson
LQ Sage
 
Registered: Nov 2004
Location: Saint Amant, Acadiana
Distribution: Gentoo ~amd64
Posts: 7,675

Rep: Reputation: Disabled
Have you tried these: ddcutil, ddccontrol? These utilities talk to the display itself, not to the graphics card.
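Something like this should show quickly whether your panel answers DDC/CI at all (sketch only; the feature values are just examples, and many internal laptop panels won't respond):

Code:
ddcutil detect            # list displays that respond to DDC/CI
ddcutil getvcp 10         # read brightness (VCP feature 0x10)
ddcutil getvcp 12         # read contrast (VCP feature 0x12)
ddcutil setvcp 12 75      # try setting contrast to 75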

Another interesting tool is redshift; if it can adjust your brightness, then there must be a way to adjust the video output properties. You just have to figure it out.
Edit: I guess that is it, you can do it in xorg.conf.
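Untested sketch of what I mean ("LVDS1" is just an example identifier, it has to match your internal panel; keep in mind gamma only reshapes the curve, it does not move the black point):

Code:
Section "Monitor"
    Identifier "LVDS1"
    # per-channel gamma: red green blue
    Gamma 1.0 0.95 0.95
EndSection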

Last edited by Emerson; 08-26-2023 at 09:29 AM.
 
Old 08-26-2023, 11:08 AM   #4
ordealbyfire83
Member
 
Registered: Oct 2006
Location: Leiden, Netherlands
Distribution: LFS, Ubuntu Hardy
Posts: 305

Original Poster
Rep: Reputation: 89
The age and the kind of hardware are not the issue here. The Thinkpad T500 can also accept LED screens, provided that you use the appropriate inverter. I have measured no fewer than four screens of both types and every measurement came out at approximately 50:1. The problem is that when the brightness is increased, the black point gets brighter in step with the overall brightness. On laptops with other GPUs the black point only moves up marginally with the overall brightness.

I've read in both the Coreboot and Libreboot documentation (some of which has been edited over the years) that Intel uses pulse-width modulation to control the brightness. I have no idea whether AMD and NVIDIA do this as well. The PWM frequency is set independently of the display refresh rate, but it has to be kept in phase with (i.e. a multiple of) the refresh rate, otherwise the two beat against each other. Most of the tweaks that I've seen use intel_reg to increase the total brightness. The problem is that between these pulses the display never goes dark enough.

There is also a variant of the T500 that has switchable Intel and AMD GPUs. I have not done the measurements personally, but I loaned my colorimeter to a friend with this laptop, who reported about 50:1 on Intel and about 250:1 on AMD *with the same screen*.

If I can borrow the Thinkpad Edge again (which also had an unused Windows partition), I may try to boot Windows and measure the contrast there. That will tell whether this is a Linux driver issue.

Sure, I know some will think I'm expecting too much here. But I also have an equally old Acer Iconia (Radeon graphics), bought for a fraction of the price, that gives almost six times the contrast.

Not to mention that the Dell laptop I tested this screen in, which gave five times the contrast, was from 2006.

No, xgamma won't fix the problem. All that does is reshape the curves between the black and white points; it does not fundamentally move the black or white points themselves. In other words, while the eye will notice some difference, the colorimeter will not. This kind of adjustment also comes at the expense of color correctness.
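To be clear, this is the kind of adjustment I'm talking about (it only rescales the response curves, per channel if you want, which is exactly why the end points don't move; values are examples only):

Code:
xgamma -gamma 0.9                            # whole-screen gamma
xgamma -rgamma 1.0 -ggamma 0.95 -bgamma 0.9  # per-channel version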

Also, ddcutil and ddccontrol don't work on internal laptop screens, as per their documentation.
 
Old 08-27-2023, 05:51 AM   #5
business_kid
LQ Guru
 
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,632

Rep: Reputation: 2635
Quote:
Originally Posted by ordealbyfire83
There is also a variant of the T500 that has switchable Intel and AMD GPUs. I have not done the measurements personally, but I loaned my colorimeter to a friend with this laptop, who reported about 50:1 on Intel and about 250:1 on AMD *with the same screen*.
That sounds like your problem solved right there. If all tests are being done with the external power attached, that's your problem. Good work, and mark this solved.

It's important to understand the difference between brightness & contrast. With minimum contrast there will be minimal distinction between blacks and whites, or any other colour; everything appears as mid-tones. Brightness adjusts what colour you see.

As you increase contrast, whites will appear whiter, reds redder, etc., until you hit the limits of your screen's performance.

Black on screen is actually a fiction; it's simply lack of signal. The colour you see is the screen's own colour, which appears black in dark lighting. So brightness, when set too high, will illuminate the black areas.

Operating the screen in sunlight will play havoc with screen performance. It is (or was) possible to have too much contrast: the eye is analogue, and very sharp, square-wave-like light transitions produce a ringing effect and eyestrain. It was an occasional issue in the days of thermionic CRTs, but I haven't seen it with today's lower-power stuff.
 
Old 08-27-2023, 01:42 PM   #6
ordealbyfire83
Member
 
Registered: Oct 2006
Location: Leiden, Netherlands
Distribution: LFS, Ubuntu Hardy
Posts: 305

Original Poster
Rep: Reputation: 89
Yes, but at this point the actual contrast as measured by the colorimeter (white luminance / black luminance, both in cd/m^2) is more important than the contrast as seen by the eye under various ambient lighting conditions. First of all, without sufficient contrast it is impossible to generate any ICC profile that actually works. Manipulating the color or apparent color temperature of the white point is all done at the expense of contrast, and if the hardware only permits 50:1 at full brightness, that's a serious problem. Likewise, all xgamma would essentially do is adjust how much detail sits in the shadows or highlights, provided that R/G/B are all adjusted linearly. The Intel GPUs allow a gamma correction to be done in hardware, but that too is limited by the physical capabilities of the GPU and in this case would not help.

After reading through the manual, it looks like the only thing that can be done is to tweak the pulse-width modulation. This lives in register 0x00061254. The register contains (in hex) two fields: the period of the PWM and a divisor (i.e. the duty cycle). This might sound contradictory, but the "period" itself is really some multiple of 128 of the GPU's system clock. When you change the brightness from the keyboard (or a panel applet), the divisor changes. The upper two bytes of the register need to be chosen carefully because they determine how much flicker you see. The panel refreshes at 60 Hz, and if the PWM frequency is not in phase with that, you have created a strobe light. According to some documentation this can trigger epileptic seizures, or eye strain at the very least. Even if you can't perceive it with the eye, try looking at the screen through a video camera. Then the combination of the two fields needs to be adjusted further, and if it is set the other way you will hear ringing. I've tried different values for this period, but the effect is always the same: the display never loses its dull, gray appearance.
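For anyone who wants to experiment with the same register, this is roughly what I've been doing with intel_reg from intel-gpu-tools (run as root; the values below are only examples, and a bad value can blank the backlight entirely):

Code:
intel_reg read 0x61254             # read the backlight PWM control register
# upper 16 bits = PWM period, lower 16 bits = duty cycle (divisor);
# example: period 0x129e with a duty cycle of roughly half of it
intel_reg write 0x61254 0x129e094f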

Lowest / highest brightness settings:

Black level = 0.0965 / 2.8652 cd/m^2
50% level = 0.94 / 32.65 cd/m^2
White level = 3.98 / 141.08 cd/m^2
Aprox. gamma = 2.09 / 2.11
Contrast ratio = 41:1 / 49:1
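(For clarity, the contrast ratio is just the white level divided by the black level at the same backlight setting, e.g. at full brightness 141.08 / 2.8652 ≈ 49, hence 49:1.)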

Unless there is a way to keep the lamp off for longer than it is on, or to flash the lamp at something other than full brightness, it will not go dark enough; the lamp is simply not that fast. So apparently the Radeon GPU does not use PWM, given that a higher contrast can be achieved on the same panel.

For reference, I have taken measurements on three laptops with Intel, NVIDIA, and AMD graphics, with the white brightness set in ArgyllCMS to 10 cd/m^2. From a distance the Intel GPU looks a lot more "gray"; the grayness only subsides near full brightness, and even then the contrast never goes over 50:1.
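In case anyone wants to reproduce the measurements, this is roughly the ArgyllCMS invocation I mean (sketch only; "panel" is just an example base name):

Code:
dispcal -R              # report the panel's native response: black/white level, gamma, contrast
dispcal -v -b 10 panel  # calibrate towards a 10 cd/m^2 white target, writes panel.cal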

So for the sake of one's eyesight it is probably best to avoid PWM graphics.

Last edited by ordealbyfire83; 08-27-2023 at 02:14 PM.
 
Old 08-28-2023, 06:19 AM   #7
business_kid
LQ Guru
 
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,632

Rep: Reputation: 2635
I have asked several questions in this thread, and none of them have been addressed, so I can't really comment. Enjoy yourself.
 
  

