External display pixelated after upgrade to 4.11 kernel
Appending drm.debug=0xe to the end of the kernel command line is fine and easy. You might also want to remove quiet so the init messages can be seen whizzing by. Maybe there's some clue you could spot and later look up in dmesg or the system journal.
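For anyone unsure where those parameters go: a minimal sketch, assuming a GRUB-based distro, done on a throwaway copy so nothing on the real system is touched (`/tmp/grub.demo` and its contents are just an illustration):

```shell
# Make a sample copy of the kind of line found in /etc/default/grub.
cat > /tmp/grub.demo <<'EOF'
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
EOF

# Drop "quiet" and append drm.debug=0xe inside the quotes.
sed -i -e 's/quiet //' \
       -e 's/^\(GRUB_CMDLINE_LINUX_DEFAULT="[^"]*\)"/\1 drm.debug=0xe"/' /tmp/grub.demo

cat /tmp/grub.demo
# On a real system, edit /etc/default/grub itself, then run
# sudo update-grub (or grub2-mkconfig -o /boot/grub2/grub.cfg) and reboot.
```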
It looks like those logs got cut off (they start at 11 seconds). Did you run "dmesg > file" to collect the full output into the file? You can attach the full logs to the bug report along with a short description of your problem; I'm not sure what specifically in the logs would show the problem, but I assume the people at freedesktop.org know what they're looking for.
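A quick sketch of collecting the complete kernel log, not just the tail that scrolled past, into a file to attach to the report (file names here are arbitrary; dmesg may need sudo on locked-down systems):

```shell
# Capture the whole kernel ring buffer with human-readable timestamps.
dmesg --ctime > /tmp/dmesg-full.txt 2>/dev/null || true   # prepend sudo if restricted

# With drm.debug=0xe on the kernel command line, the mode-setting chatter
# around the HDMI connector is the interesting part:
grep -iE 'drm|hdmi|edid' /tmp/dmesg-full.txt > /tmp/dmesg-drm.txt || true

# journalctl -k -b 0 gives the same kernel messages from the persistent journal,
# including earlier boots via -b -1, -b -2, and so on.
```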
I don't think the microcode is related, but good to know that Kaby Lake is fixed now. I'm not able to disable hyperthreading in the UEFI on my computer, so that was kinda scary. I had been checking periodically but I didn't know it had been fixed yet.
Why are you people trying to use 1080p on a 1360x768 native TV? 1080 on a TV is 1080i, which is why the "720p" TV can play all standard HDTV modes. 1080p would have to be an interpolated mode on a 1360x768 native device, and not expected to look good, at least not via (digital) HDMI (as AwesomeMachine suggested). VGA is analog, which works differently, and can be used to fake a higher resolution than a display's physics support.
One of my displays supports 2560x1080. Via VGA, DisplayPort and dual link DVI I can get 2560x1080 from it, but the (digital) DVI and (digital) DisplayPort quality in that mode is clearly far superior to VGA. Connecting to it via HDMI or single link DVI it's limited to 1920x1080, the limit of HDMI standards before 4k support was needed, and near the single link 1920x1200 DVI maximum.
Why a regression seems apparent in 4.11 kernels I can't answer, but I'm guessing before 4.11, specified 1080 progressive mode may have been falling silently back to interlaced, lying about what it was really doing. Vizio is not quite the equivalent of Sony or Samsung; it's known to take occasional shortcuts.
Quote:
Originally Posted by mrmazda
Why are you people trying to use 1080p on a 1360x768 native TV?
Neither of us is doing that. I'm trying to run mine at 1360x768 over HDMI, which used to work, and Skewered is trying to run theirs at 1920x1080 (native for that TV) over HDMI, which also used to work. I said I can force mine to do 1080, but it has always looked bad, so I never use it like that.
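If a working mode like 1360x768 stops being offered after the kernel upgrade, it can usually be re-added by hand under X. A hedged sketch, assuming an X session and that the output is named HDMI-1 (check `xrandr -q` for the real name on your machine); the timing numbers are what `cvt 1360 768 60` computes:

```shell
# CVT timings for 1360x768 at 60 Hz: pixel clock (MHz), then the
# horizontal and vertical timing values and sync polarities.
MODE='1360x768_60.00'
TIMINGS='84.75 1360 1432 1568 1776 768 771 781 798 -hsync +vsync'

# Only attempt this on a live X session where xrandr is available.
if [ -n "$DISPLAY" ] && command -v xrandr >/dev/null; then
  xrandr --newmode "$MODE" $TIMINGS       # define the mode
  xrandr --addmode HDMI-1 "$MODE"         # attach it to the output
  xrandr --output HDMI-1 --mode "$MODE"   # switch to it
fi
```

Note the mode name is kept separate from the timings so the literal quote marks that cvt prints around the name don't end up as part of it, a common tutorial pitfall.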
Quote:
Originally Posted by mrmazda
Why a regression seems apparent in 4.11 kernels I can't answer, but I'm guessing before 4.11, specified 1080 progressive mode may have been falling silently back to interlaced, lying about what it was really doing. Vizio is not quite the equivalent of Sony or Samsung; it's known to take occasional shortcuts.
If you look at the commit causing the issue from post #8, it seems to have something to do with RGB colorimetry and YQ bits(?). No idea what that means exactly, but they did put a note saying "Perhaps there are sinks that don't ignore the YQ as they should for RGB?" so maybe that applies to me? I'm not sure if by "sink" they mean the HDMI output on my computer, or the HDMI input on my TV. I noticed yours says "displayport adapted to HDMI." Are you using an actual DisplayPort out with an HDMI adapter, or a straight HDMI out?
In terms of Vizio cutting corners though... I absolutely can't argue there. I've been ready to throw this thing out the window for years, but it kills me to spend money replacing hardware because of issues caused by software, especially when it was working fine before.
Quote:
Originally Posted by Skewered
I'm not sure if by "sink" they mean the HDMI output on my computer, or the HDMI input on my TV.
Interesting how difficult it is to Google a definition of sink in the context of HDMI, but think of it in terms of noun definition 32 at http://www.dictionary.com/browse/sink?s=t, and more specifically 37. A sink is a recipient: municipal or well water from a faucet into a kitchen or bathroom basin, rain and runoff into a retention pond, heat from a CPU into a chunk of finned aluminum, or electronic output signals into speakers, a TV, a PC display, a 4-way HDMI splitter, or an HDMI-to-composite converter. "Recipient" can also be inferred from some of the contexts Googling finds, such as a provider of EDID data (a PC display). So the sink is your TV's HDMI input.
Quote:
I noticed yours says "displayport adapted to HDMI."
Intel video as implemented by PC and motherboard manufacturers commonly provides exactly two physical connectors, one VGA and one digital, and regardless of the actual connector type declares the digital CRTC signal to be HDMI. That PC, like another I have, has only VGA and DisplayPort connectors, while yet another has only VGA and DVI-D (shipped with a DVI-to-HDMI adapter). My Vizio TV, like most HDTVs, has only HDMI and analog inputs.