08-25-2023, 09:34 PM | #1
ordealbyfire83
Member
Registered: Oct 2006
Location: Leiden, Netherlands
Distribution: LFS, Ubuntu Hardy
Posts: 305
How to get better contrast out of Intel GPUs?
Is it at all possible to increase the contrast on a laptop with integrated Intel graphics (probably GM45)? I have a Lenovo Thinkpad T500 and have done some measurements with a colorimeter. The physical LCD panel is an LG-Philips, but I forget the actual model number; I'd have to take the case apart again to see the label. The colorimeter reports a contrast of around 50:1. But if I take the panel out and put it in a Dell laptop with an NVIDIA GPU, the colorimeter shows 245:1.
It seems laptop manufacturers simply pass off the LCD manufacturer's theoretical specs as their contrast figures, instead of saying what results will actually be achieved in that laptop.
I've done several measurements at different brightness levels. The measured contrast stays the same, though the eye perceives the contrast to be better at higher brightness. But it is not. The problem is that the black point is simply too bright. As a result it is really not feasible to look at digital photographs, and I have to plug in an external display for this purpose.
It seems that the GPU uses pulse width modulation (PWM) to control the backlight brightness. Does anyone know any hack for contrast using intel_reg?
At first I thought it was just my laptop, but I have now checked three T500s and a Thinkpad Edge. They all do the same, although the Edge is even worse: it shows a contrast of 44:1 with a white point of 7300 K. Seriously, no photograph could ever look good on that.
08-26-2023, 09:03 AM | #2
business_kid
LQ Guru
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,632
That's an anything-but-new laptop, and LCD is the worst form of screen technology. Possible causes include:
- Aging display, especially with LCD.
- Complications from ageing hardware.
- Power limits if it's on battery only.
About your only adjustment is to run it on the external power supply and set the brightness & contrast for best effect. If you connect an external monitor and that looks good, it's a screen problem. I have retired a better laptop - better CPU, better but still sucky Intel integrated graphics - and at least its screen works as well as it ever did.
I'd also suggest an attitude correction. You have a laptop to run where you haven't the space or mains supply for a desktop. It's not perfect. If you have the means and want to, upgrade the thing. If not, don't spend too much time trying to make it something it isn't.
Last edited by business_kid; 08-26-2023 at 09:07 AM.
08-26-2023, 09:21 AM | #3
Emerson
LQ Sage
Registered: Nov 2004
Location: Saint Amant, Acadiana
Distribution: Gentoo ~amd64
Posts: 7,675
Have you tried ddcutil or ddccontrol? These utilities talk to your display, not to the graphics card.
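Basic usage looks like this (a sketch; it assumes the display actually answers DDC/CI, using the standard MCCS feature codes 0x10/0x12 for brightness/contrast):
Code:
ddcutil detect         # list displays that answer DDC/CI
ddcutil getvcp 10      # VCP 0x10 = brightness
ddcutil getvcp 12      # VCP 0x12 = contrast
ddcutil setvcp 12 75   # set contrast to 75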
Another interesting tool is redshift; if it can adjust your brightness, then there must be a way to adjust the video output properties. You just have to figure it out.
Edit: I guess that is it - you can do it in xorg.conf.
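Something along these lines in a Monitor section (a sketch; the identifier and values are examples, match the output name your Xorg log or xrandr reports):
Code:
# e.g. /etc/X11/xorg.conf.d/10-monitor.conf (path varies by distro)
Section "Monitor"
    Identifier "LVDS1"        # output name as reported by xrandr
    Gamma      0.9 0.9 0.9    # per-channel gamma (R G B)
EndSection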
Last edited by Emerson; 08-26-2023 at 09:29 AM.
08-26-2023, 11:08 AM | #4
ordealbyfire83
Member (Original Poster)
Registered: Oct 2006
Location: Leiden, Netherlands
Distribution: LFS, Ubuntu Hardy
Posts: 305
The age and the kind of hardware are not the issues here. The Thinkpad T500 can also accept LED screens, provided that you use the appropriate inverter. I have measured no fewer than four screens of both types, and every measurement gave approximately 50:1. The problem is that when the brightness is increased, the black point gets brighter in tandem with the overall brightness. On laptops with other GPUs the black point moves up only marginally with the overall brightness.
I've read in both the Coreboot and Libreboot documentation (some of which has been edited over the years) that Intel uses pulse width modulation to control the brightness. I have no idea whether AMD and NVIDIA do this as well. The PWM is independent of the display refresh rate, yet it has to be in phase with (i.e. a multiple of) the refresh rate. Most of the tweaks that I've seen use intel_reg to increase the total brightness. The problem is that between the pulses the display never goes dark enough.
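To put numbers on that: with a 60 Hz panel, a PWM frequency of 1200 Hz (20 x 60 Hz) stays in phase, while something like 1000 Hz is not an integer multiple of the refresh and beats against it.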
There is also a variant of the T500 that has switchable Intel and AMD GPUs. I have not done the measurements personally, but I loaned my colorimeter to a friend with this laptop, who reported about 50:1 on Intel and about 250:1 on AMD, *with the same screen*.
If I can borrow the Thinkpad Edge again (which also had an unused Windows partition), I may try to boot Windows and measure the contrast there. That will tell whether this is a Linux driver issue.
Sure, I know that some will think I'm expecting too much here. But I also have an equally old Acer Iconia (Radeon graphics), bought for a fraction of the price, that gives almost six times the contrast. Not to mention that the Dell laptop I tested this screen in, which gave five times the contrast, was from 2006.
No, xgamma won't fix the problem. All that does is adjust the curve between the black and white points differently; it does not fundamentally move the black or white points. In other words, while the eye will notice some difference, the colorimeter will not. This kind of adjustment also comes at the sacrifice of color correctness.
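For completeness, that adjustment amounts to this; the endpoints stay where they are:
Code:
# reshape the transfer curve uniformly between the existing endpoints
xgamma -gamma 0.8
# or per channel, sacrificing color correctness
xgamma -rgamma 0.9 -ggamma 0.9 -bgamma 1.1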
Also, ddcutil and ddccontrol don't work on laptop screens, as per their documentation.
08-27-2023, 05:51 AM | #5
business_kid
LQ Guru
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,632
Quote:
Originally Posted by ordealbyfire83
There is also a variant of the T500 that has switchable Intel and AMD GPUs. I have not done the measurements personally, but I loaned my colorimeter to a friend with this laptop, who reported about 50:1 on Intel and about 250:1 on AMD, *with the same screen*.
That sounds like your problem solved right there. If all those tests were done with external power attached, then the Intel graphics is your problem. Good work, and mark this solved.
It's important to understand the difference between brightness & contrast. With minimum contrast there will be minimal distinction between blacks, whites, or any other colour; only midway tones are seen. Brightness will adjust what colour you see.
As you increase contrast, whites will appear whiter, reds redder, etc., until you hit the limits of your screen's performance.
Black on screen is actually a fiction: it's simply lack of signal. The colour you see is the screen's own colour, which appears black in dark lighting. So brightness set too high will illuminate black areas.
Operating the screen in sunlight will play havoc with screen performance. It is (or was) possible to have too much contrast: the eye is analogue, and very sharp, square-wave-like light changes produce a ringing effect and eyestrain. It was an occasional issue in the days of thermionic CRTs, but I haven't seen it with today's lower-power stuff.
08-27-2023, 01:42 PM | #6
ordealbyfire83
Member (Original Poster)
Registered: Oct 2006
Location: Leiden, Netherlands
Distribution: LFS, Ubuntu Hardy
Posts: 305
Yes, but at this point the actual contrast as measured by the colorimeter (white brightness / black brightness, both in cd/m^2) is more important than the contrast as seen by the eye under various ambient lighting conditions. First of all, without sufficient contrast it is impossible to generate an ICC profile that actually works. Manipulating the color or apparent color temperature of the white point is all done at the expense of contrast, and if the hardware only permits 50:1 at full brightness, that's a serious problem. Likewise, all xgamma would essentially do is adjust how much detail sits in the shadows or highlights, provided that R/G/B are all adjusted linearly. The Intel GPUs allow a gamma correction to be done in hardware, but that too is limited by the physical capabilities of the GPU and in this case would not help.
After reading through the manual, it looks like the only thing that can be done is to tweak the pulse width modulation. This lives in register 0x00061254, which contains (in hex format) two parts: the period of the PWM and a divisor (i.e. the duty cycle). That might sound contradictory, but the "period" itself is really some multiple of 128 of the GPU's system clock. When you change the brightness from the keyboard (or panel applets), the divisor changes. The upper two bytes of the register need to be chosen carefully, because they determine how much flicker you see. The panel refreshes at 60 Hz, and if the PWM frequency is not in phase with that, you have created a strobe light. According to some documentation this can cause epilepsy, or at the very least eye strain; even if you can't perceive it with the eye, try looking at the screen through a video camera. Then the combination of the two fields needs to be adjusted further. And if it is set the other way, you will hear ringing. I've tried different values for this period, but the effect is always the same: the display always has a dull, gray appearance and never goes dark enough.
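For anyone who wants to reproduce the poking, it goes roughly like this (intel_reg is part of intel-gpu-tools and needs root; the written value below is purely illustrative, start from whatever your own readout shows):
Code:
# read the backlight PWM control register
# (BLC_PWM_CTL on these chipsets, if I read the docs right)
intel_reg read 0x61254
# upper two bytes: PWM period; lower two bytes: the duty-cycle
# divisor that the brightness keys change. Write back a modified
# value (0x12f40dac is an example only):
intel_reg write 0x61254 0x12f40dac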
Lowest / highest brightness settings:
Black level = 0.0965 / 2.8652 cd/m^2
50% level = 0.94 / 32.65 cd/m^2
White level = 3.98 / 141.08 cd/m^2
Approx. gamma = 2.09 / 2.11
Contrast ratio = 41:1 / 49:1
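(The contrast ratio is simply white level over black level: 3.98 / 0.0965 ≈ 41 at the lowest setting and 141.08 / 2.8652 ≈ 49 at the highest, hence 41:1 and 49:1.)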
Unless there is a way to keep the lamp off for a longer duration than it is on, or to flash the lamp at something other than full brightness, it will not go dark enough; the lamp is simply not that fast. So apparently the Radeon GPU does not use PWM, if a higher contrast can be achieved on the same panel.
For reference, I have taken measurements of three laptops with Intel, NVIDIA, and AMD graphics, with the white brightness set in ArgyllCMS to 10 cd/m^2. From a distance the Intel GPU looks a lot more "gray", and the grayness subsides only near full brightness; even then the contrast never goes over 50:1.
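(Roughly, the ArgyllCMS steps were these; instrument-specific flags omitted, and the output name is just an example:)
Code:
dispcal -r                  # report uncalibrated black/white levels and contrast
dispcal -b 10 t500-intel    # calibrate towards a 10 cd/m^2 white target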
So for the sake of one's eyesight it is probably best to avoid PWM graphics.
Last edited by ordealbyfire83; 08-27-2023 at 02:14 PM.
08-28-2023, 06:19 AM | #7
business_kid
LQ Guru
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,632
I have asked several questions in this thread, and none of them have been addressed, so I can't really comment. Enjoy yourself.