I've used only Raspberry Pis for several years, so I didn't even think to mention it. They're mostly the same as Linux on an i386. This one happens to be a Pi 4 with 8 GB, running Debian 12 Bookworm with an LXDE desktop.
This, I think, is really an issue with xrandr transforms, which are related to xinput transforms, and both come back to the identity matrix (https://en.wikipedia.org/wiki/Identity_matrix). It's linear algebra, and I was a Calculus dropout.
It requires feeding xrandr a line like:
Code:
xrandr --output "HDMI-1" --fb 1920x1080 --transform 1,-48,0,-48,0,48,0,48,0
and understanding what the numbers mean. A cookbook approach would probably work fine ("this number means X, that number means Y") if I could find it described clearly. What I need the matrix for is to scale up the data I have (maybe 1800x950 pixels) to fit a 1920x1080 screen.
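For whatever it's worth, my reading of the xrandr man page is that the nine numbers are a 3x3 matrix given row by row, with the diagonal doing the scaling. I haven't verified which direction xrandr applies the scale on this setup (it may want the reciprocals), so treat the factors below as a sketch, not gospel:
Code:
```shell
# Cookbook for --transform a,b,c,d,e,f,g,h,i (read row by row):
#   | a b c |   a,e = X,Y scale    c,f = X,Y translation (pixels)
#   | d e f |   b,d = shear/rotation terms
#   | g h i |   g,h = perspective terms, i is normally 1
# Identity (no change at all): 1,0,0,0,1,0,0,0,1
# Candidate factors to stretch 1800x950 up to 1920x1080
# (direction unverified; xrandr may want 1800/1920 and 950/1080):
SX=$(awk 'BEGIN { printf "%.4f", 1920/1800 }')
SY=$(awk 'BEGIN { printf "%.4f", 1080/950 }')
echo "xrandr --output HDMI-1 --fb 1920x1080 --transform $SX,0,0,0,$SY,0,0,0,1"
```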
The scale function doesn't work for this: I can play with --scale, but the picture on the screen always keeps a half-inch black border. I made a set of scripts like
Code:
xrandr --output HDMI-1 --scale 1.2x1.2
and
Code:
xrandr --output HDMI-1 --scale 0.8x0.8
but the black border doesn't go away; other things change instead. If you run xrandr with no arguments, its default action is to list the current parameters, and the current size may change from 1920 to 2500 or so, but the picture is still surrounded by black. Trying arandr right now.
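If I understand it right, that growing number is just the virtual framebuffer being multiplied by the scale factor, not the panel changing, which would explain why the border stays put. A quick sanity check of the arithmetic (1.3 is only an assumed example factor):
Code:
```shell
# --scale S multiplies the virtual framebuffer size, so the figure in
# the xrandr listing grows with S while the physical panel stays
# 1920x1080. At an example scale of 1.3 the listing would show about
# 2496x1404, which is in the "2500 or so" neighborhood.
awk 'BEGIN { printf "%dx%d\n", 1920*1.3, 1080*1.3 }'
```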
I thought I had a bright idea: take a screenshot, load it into Gimp, and measure. What happened was that I got a 1920x1080 image, but since I don't see the whole screen, I can't measure to the edges of what's actually visible. arandr doesn't have very good documentation, but it's what gave me the screenshot idea. Having an actual measurement of my visible screen area (not just the nominal 1920x1080) would be useful.
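Another possible route to a real-world measurement: the "connected" line in the xrandr listing usually carries the panel's physical size in millimetres from EDID. Those numbers plus the pixel count give the true pixels-per-inch, which Gimp can be told to use so its rulers match a tape measure. The 598 mm below is purely an illustrative value, not from my panel:
Code:
```shell
# Convert an EDID-reported panel width (598 mm is only an example;
# substitute the number from your own xrandr "connected" line) into
# real pixels per inch for a 1920-pixel-wide screen.
awk 'BEGIN { printf "%.1f dpi\n", 1920 / (598 / 25.4) }'
```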
https://imgur.com/onRtQJz