[SOLVED] Black Screen when trying to use interlaced resolutions
Linux - Hardware
This forum is for Hardware issues. Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?
Distribution: Garuda, Fedora, Ubuntu Server, Debian, Pop OS, Batocera, Zorin
Posts: 5
Rep:
Hello!
I am new to LinuxQuestions - please correct me if I didn't include something or posted to the wrong forum.
I am running a system with a R9 270 GPU connected to a CRT monitor with a passive DVI-I to VGA adapter. Everything works just fine until I try to use xrandr to switch to an interlaced resolution. The command executes without any errors but my screen turns black and stays there until I switch back to the progressive mode I was using before.
I know the issue isn't with the adapter since I have used it before with an Nvidia GPU and the same monitor, and it was able to output an interlaced signal.
I have no idea how to get this working. Any help would be greatly appreciated!
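For reference, a typical attempt looks roughly like this. The mode name and timings below are purely illustrative (a standard CEA 1920x1080i modeline, not my exact values), and the xrandr lines are shown as comments since they need a live X session:

```shell
# Illustrative only -- a standard CEA 1920x1080i@60 modeline, not my exact
# timings. The workflow is: define the mode, attach it to the output, switch.
MODENAME="1920x1080i"
MODELINE='74.25 1920 2008 2052 2200 1080 1084 1094 1125 interlace +hsync +vsync'
echo "Modeline \"$MODENAME\" $MODELINE"
# These require a running X session, so they are shown but not executed here:
# xrandr --newmode "$MODENAME" $MODELINE
# xrandr --addmode DVI-0 "$MODENAME"
# xrandr --output DVI-0 --mode "$MODENAME"
```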
Assuming you did nothing wrong, something in the video chain does not support that mode.
Quote:
Originally Posted by wpeckham
Assuming you did nothing wrong, Something in the video chain does not support that mode.
Thank you for the reply. I guess it would have to be the GPU then. I was convinced that the R9 200 series can do interlaced over DVI, but maybe I'm mistaken. Or maybe there's something wrong with the AMD driver on Linux. I tried installing Windows to verify that, but unfortunately I was unable to install the AMD software there, and adding custom resolutions with third-party programs doesn't seem to work either.
Could xrandr be causing the problem because of an error in your command or a missing parameter? Please provide an example command that results in a black screen, and another that doesn't.
The adapter could be disallowing EDID to come through. A config file in /etc/X11/xorg.conf.d/ might be a solution, like from this modified default 50-monitor.conf file from many moons ago:
Code:
Section "Monitor"
	Identifier   "Default Monitor"
	## If your monitor doesn't support DDC you may override the
	## defaults here
	HorizSync    28-85
	VertRefresh  50-100
	Option       "PreferredMode" "1600x1200"
	## Add your mode lines here if necessary, use e.g. the cvt tool
EndSection
Conform the values above to your monitor and give it a try. Ignore the modelines; Xorg is just as good a mode calculator as any external tool, given appropriate config file values. See man xorg.conf.
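A quick way to check whether EDID is coming through at all is to look for an EDID block in the output's properties. The sample text below is made up for illustration; on your system, run `xrandr --props` and look under DVI-0:

```shell
# Sketch of checking whether the monitor's EDID reaches the driver. On a live
# system you would run `xrandr --props` and look for an EDID block under DVI-0;
# here a canned (hypothetical) sample is grepped to show what to look for.
sample='DVI-0 connected 1600x1200+0+0
    EDID:
        00ffffffffffff00...
    non-desktop: 0'
if printf '%s\n' "$sample" | grep -q 'EDID:'; then
    echo "EDID present"
else
    echo "EDID missing -- monitor ranges must come from xorg.conf"
fi
```

If no EDID block shows up, Xorg falls back to conservative defaults, which is exactly the case the HorizSync/VertRefresh ranges above are meant to cover.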
Distribution: Garuda, Fedora, Ubuntu Server, Debian, Pop OS, Batocera, Zorin
Posts: 5
Original Poster
Rep:
Quote:
Originally Posted by mrmazda
Could xrandr be causing the problem because of an error in your command or a missing parameter? Please provide an example command that results in a black screen, and another that doesn't.
The adapter could be disallowing EDID to come through. A config file in /etc/X11/xorg.conf.d/ might be a solution, like from this modified default 50-monitor.conf file from many moons ago:
Code:
Section "Monitor"
	Identifier   "Default Monitor"
	## If your monitor doesn't support DDC you may override the
	## defaults here
	HorizSync    28-85
	VertRefresh  50-100
	Option       "PreferredMode" "1600x1200"
	## Add your mode lines here if necessary, use e.g. the cvt tool
EndSection
Conform the values above to your monitor and give it a try. Ignore the modelines; Xorg is just as good a mode calculator as any external tool, given appropriate config file values. See man xorg.conf.
Thank you for the suggestion! I can't get this xorg.conf method to work; how can the config output a new resolution without me specifying a modeline? I must be missing something, because it doesn't do anything unless I put in a mode that xrandr already supports out of the box. However, while playing around I found that if I try to interlace a very high resolution such as 1600x1200, the monitor actually outputs an image! So the GPU is capable of interlacing, just not at lower resolutions, I suppose.
I read somewhere online that some Nvidia GPUs cannot output any signal at lower resolutions because there is not enough bandwidth going out of them. Maybe the same is true for my AMD card? If that's the case, I'll probably have to figure out how to send more signal out without displaying these high resolutions.
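Doing the rough math on that theory (the numbers below are the classic 640x480@60 totals, just to illustrate, not my actual mode):

```shell
# Rough estimate: pixel clock = htotal * vtotal * refresh. The totals below
# are the classic 640x480@60 values (800x525), purely illustrative.
htotal=800; vtotal=525; refresh=60
progressive=$((htotal * vtotal * refresh))   # pixel clock in Hz
interlaced=$((progressive / 2))              # interlacing roughly halves it
echo "progressive: $((progressive / 1000000)) MHz"
echo "interlaced:  $((interlaced / 1000000)) MHz"
```

If the card's minimum dot clock sits somewhere above that interlaced figure, a low interlaced resolution would fall below the floor while something like interlaced 1600x1200 wouldn't, which would line up with what I'm seeing.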
Quote:
how can the config output a new resolution without me specifying the modeline?
That's why I suggested the man page. I haven't messed with CRT configuration in probably a decade; I got rid of my last CRTs three years ago.
What does xrandr output when using an xorg.conf file, and is there any difference from without one? Did you actually try using xrandr with an xorg.conf in place?
I'm thinking the passive adapter must be the most likely obstacle. If you had to switch to an active adapter, your better way forward would be to switch to a flat screen and start saving on your electric and heating bills. Cheap route: put your old NVidia in the new PC.
What difference did you actually observe between progressive and interlaced when you were using the NVidia?
What distro and version are you using? A different one might make a difference.
Quote:
Originally Posted by mrmazda
That's why I suggested the man page. I haven't messed with CRT configuration in probably a decade; I got rid of my last CRTs three years ago.
What does xrandr output when using an xorg.conf file, and is there any difference from without one? Did you actually try using xrandr with an xorg.conf in place?
I'm thinking the passive adapter must be the most likely obstacle. If you had to switch to an active adapter, your better way forward would be to switch to a flat screen and start saving on your electric and heating bills. Cheap route: put your old NVidia in the new PC.
What difference did you actually observe between progressive and interlaced when you were using the NVidia?
What distro and version are you using? A different one might make a difference.
With the xorg.conf, xrandr shows it's using the mode I specified in the config, but only for resolutions that were already available beforehand. When I try other resolutions it just outputs the same thing it always does; no change, really. But perhaps I just don't understand how to format that file correctly... maybe you can tell me what it should look like if my horizontal range is 30-96 kHz, the vertical refresh is 50-160 Hz and my video output is DVI-0 (targeting 600i, for example).
I also have an active DVI-D to VGA adapter, but surprisingly I am getting the same behavior with it (no interlacing possible); the only difference is that more resolutions are available in xrandr by default. I tried a couple of different distros to check if that would change something (it didn't), but I am mostly testing on Mint right now.

As for the Nvidia GPU: it displayed both interlaced and progressive without issues as long as I stayed within my monitor's supported frequency ranges. However, I had another problem with it, where it constantly gave me annoying micro-stutters in games, specifically with the proprietary 470 driver and this CRT (the open-source Nouveau driver ran just fine but obviously performed terribly, and the issue also didn't occur with different monitors). In summary, this particular CRT is giving me a lot of headaches, and I honestly wonder if I should just swap it for another one, but I really hate throwing away working electronics; it just makes the tech enthusiast inside me cry.
Quote:
In summary this particular CRT is giving me a lot of headaches and yeah I honestly am wondering if I should just change it out for another one but I really hate throwing away working electronics, it just makes the tech enthusiast inside me cry.
I agonized over that for several years because all of mine were large and working well, but they just can't compare to a quality flat panel with digital connectivity. I put most out by the curb as junk and gave my best one to an antique PC collector. It's also nice to have the space back, plus the extra width of digital desktop space.
OK, wow, I think I have finally found a solution. Not a good one, but it works. I searched the internet for other people's modelines and found someone who had posted an interlaced modeline he used on a 15 kHz monitor, with different parameters than what I had generated with cvt. I copied those values into my xrandr command and kept changing the dot clock frequency to different values. After many attempts, one mode suddenly worked. I now have a great-looking interlaced resolution and everything seems to be working perfectly. However,
DISCLAIMER: to anyone reading this and trying to copy what I did - DON'T... or at least educate yourself first.
I am obviously no expert on the matter, but according to what I read online, messing with modelines like this without knowing what you're doing may damage your monitor. I only did this because I had given up hope and was going to throw away the CRT anyway. At the very least, do your own research.
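With that said, the trial-and-error looked roughly like this. The timings below are placeholders, NOT the modeline I ended up with, and the script only prints the commands instead of running them:

```shell
# Placeholder timings -- NOT my final modeline. Keep the geometry fixed and
# sweep the dot clock. Printed rather than executed, since out-of-spec modes
# can damage a CRT.
TIMINGS='640 664 720 800 480 483 486 525 interlace -hsync -vsync'
for clock in 12.0 12.5 13.0 13.5 14.0; do
    echo xrandr --newmode "test-${clock}" "$clock" $TIMINGS
    echo xrandr --addmode DVI-0 "test-${clock}"
done
```

Each candidate then has to be tried with `xrandr --output DVI-0 --mode test-<clock>` by hand, switching back immediately if the screen goes black.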
Thank you guys for commenting your thoughts and suggestions, it's always nice to know you're not alone when trying to solve frustrating hardware issues like this one.