Linux - Desktop: This forum is for the discussion of all Linux software used in a desktop context.
Running Kubuntu: Plasma Desktop Shell 4.11.11 on KDE 4.14.3 on Ubuntu 14.04.6 LTS on an AMD CPU with built-in graphics, monitor connected through DVI.
When I had to loan out my Samsung TV/monitor with 1920x1200 resolution, I connected a smaller, lower-resolution ViewSonic monitor. When I unplugged the big monitor and plugged the small one into the running system, it automagically scaled the display so that all the original pixels were shown on the smaller screen. You could see a little fuzziness, but it was very usable.
However, after a reboot it comes up in 1680x1050 resolution and I can't get it to scale using xrandr. I can get the original-size virtual desktop with
But I have to pan around because the scaling simply and silently doesn't happen. Of course I didn't think to get the xrandr output while it was scaling.
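For reference, the usual way to get a scaled-down 1920x1200 virtual desktop on a 1680x1050 panel is xrandr's --scale together with --fb. The command below is my reconstruction, not the command from the original post; the output name DVI-0 is assumed from the modeline commands later in the thread:

```shell
# Hypothetical reconstruction -- the original command wasn't captured.
# Assumes the output is named DVI-0, as later in this thread.
NATIVE_W=1680; NATIVE_H=1050   # panel's preferred mode
VIRT_W=1920;  VIRT_H=1200      # desired virtual desktop
# xrandr's --scale factors are virtual size / native size, per axis:
SX=$(awk "BEGIN{printf \"%.4f\", $VIRT_W/$NATIVE_W}")
SY=$(awk "BEGIN{printf \"%.4f\", $VIRT_H/$NATIVE_H}")
# Print the command; run it yourself against a live X session:
echo "xrandr --output DVI-0 --mode ${NATIVE_W}x${NATIVE_H} --fb ${VIRT_W}x${VIRT_H} --scale ${SX}x${SY}"
```

On some driver/xrandr combinations --scale silently does nothing, which would match the "pan instead of scale" symptom described above.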
What else can I try? What other info do you need to help?
Thanks
Last edited by LinuxOnly; 05-05-2020 at 04:20 PM.
Reason: corrected a typo in the code segment
Distribution: openSuSE Tumbleweed-KDE, Mint 21, MX-21, Manjaro
Oh dear. Your software is so old it is almost venerable. I use Plasma 5.18.4 with KDE 5.69.0 and Kernel 5.6.8...
Well. There is a system-settings thingy in Plasma. Dunno what your oldie looks like, but in one of its menus/windows there are the screen settings. Try it there...
# find parameters; need to reduce VSync to 50Hz or it gets blurry. 45Hz makes a black screen with "Out of Range".
gtf 1920 1200 50
# output of gtf is a modeline, define it with xrandr:
xrandr --newmode "1920x1200" 158.08 1920 2032 2240 2560 1200 1201 1204 1235 -HSync +Vsync
# add the new mode to our display:
xrandr --addmode DVI-0 1920x1200
# run this mode
xrandr --output DVI-0 --mode 1920x1200
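A quick sanity check on that modeline: the refresh rate is the pixel clock divided by the total horizontal and vertical counts (including blanking), which are the last numbers in each timing group:

```shell
# Sanity-check the gtf modeline: refresh = pixel clock / (htotal * vtotal)
CLOCK_HZ=158080000   # 158.08 MHz from the modeline
HTOTAL=2560          # last horizontal timing value
VTOTAL=1235          # last vertical timing value
REFRESH=$(awk "BEGIN{printf \"%.1f\", $CLOCK_HZ/($HTOTAL*$VTOTAL)}")
echo "$REFRESH Hz"   # matches the 50 Hz requested from gtf
```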
And I still don't know why xrandr --scale is a no-op.
For a long time scaling was broken. 14.04 would likely fit that time frame.
What is the native/preferred mode and/or model of your Viewsonic? I would not expect nice results from forcing 1920x1200 on a display with native mode 1680x1050.
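For anyone following along: in xrandr's listing the active mode is flagged with '*' and the preferred (usually native) mode with '+', so the preferred mode can be read straight off the output. The listing below is a fabricated sample for illustration, not LinuxOnly's actual xrandr output:

```shell
# Made-up sample of an xrandr listing for one connected output:
SAMPLE='DVI-0 connected 1680x1050+0+0
   1680x1050     59.95*+
   1280x1024     75.02
   1024x768      60.00'
# Mode lines have the refresh rate as the second field; the preferred
# mode's rate is suffixed with '+'. On a live system pipe xrandr itself.
PREFERRED=$(printf '%s\n' "$SAMPLE" | awk '$2 ~ /\+$/ {print $1; exit}')
echo "$PREFERRED"
```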
The monitor is a Viewsonic VA2026W, and xrandr says its preferred mode, presumably native, is 1680x1050. I am attaching a screenshot to show it's quite legible.
A screenshot captures what is sent to the display, not what the display produces. AFAIK, only a camera shot or very high quality video can show us what it's showing you. Physical pixel pitch is usually matched to the native resolution, so a higher mode can only be simulated. But as long as you're happy with what you see, nothing else should matter, unless you're driving it beyond its design specifications, in which case its expected lifetime might be diminished.
Thanks for not snickering about my screenshot. You are right, of course: an X11 screenshot shows you only the ideal bits, although on my screen I do see the display artifacts when looking at the jpg.
I can't take a photo now since I got my big monitor back but here is a comment about risking the display lifetime.
The actual LCD panel is always displaying 1680x1050 = 1.8 Mpixels. It can fake 30% more pixels (1920x1200 = 2.3 M) only if there are enough "empty" spaces between fine features in the picture, so that one out of every eight pixels in each direction can be swallowed (1920/1680 = 8/7). The panel itself and its analog drivers actually see less than nominal stress, because the good-looking mode runs at 50 Hz, not 60.

The only things in the monitor that run at a roughly 9% higher clock rate (1920x1200x50 = 115 MHz vs. 1680x1050x60 = 106 MHz) are the DVI interface and the scaler engine. The DVI interface has to pick 1s and 0s out of the noise and crosstalk on the DVI cable, and if the voltages change too fast it makes errors and eventually fails to find the right timing altogether. The scaler is a digital signal processor that squeezes the 2.3 Mpixels per frame down to the 1.8 Mpixels the panel needs, meaning that its input runs at 115 Mpix/s and its output at 106 Mpix/s. If the input side gets clocked too fast, its pipeline flip-flops will suffer setup-and-hold violations and it will send the wrong bits to the display. But only part of the chip gets overclocked, if at all. I suspect they may have used a scaler chip meant for a larger panel and programmed it to serve the smaller one; otherwise, why bother allowing me to squeeze a larger screen into the smaller panel? It just costs more Si real estate.
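The figures in the post above check out; here is the arithmetic spelled out. These are visible-pixel rates, ignoring blanking intervals, which is how the original numbers were computed:

```shell
# Verify the pixel-count and pixel-rate claims (blanking ignored):
NATIVE=$((1680*1050))   # 1,764,000 pixels, ~1.8 Mpixel
VIRT=$((1920*1200))     # 2,304,000 pixels, ~2.3 Mpixel
EXTRA=$(awk "BEGIN{printf \"%.1f\", ($VIRT/$NATIVE-1)*100}")     # ~30% more pixels
RATE_50=$((VIRT*50))    # 115,200,000 px/s at 50 Hz -> ~115 Mpix/s
RATE_60=$((NATIVE*60))  # 105,840,000 px/s at 60 Hz -> ~106 Mpix/s
OVER=$(awk "BEGIN{printf \"%.1f\", ($RATE_50/$RATE_60-1)*100}")  # ~9% higher rate
echo "$EXTRA% more pixels, $OVER% higher pixel rate"
```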
Things are very different in a CRT monitor where everything scales with the pixel clock, including the highest-power circuit in the box, the horizontal deflection driver. There you definitely worry about lifetime reduction from overclocking.