[SOLVED] Can I get rid of screen flickering in this situation?
Slackware — This Forum is for the discussion of Slackware Linux.
I run Slackware64 14.2 with an NVIDIA GeForce video card and the proprietary blob. Sometimes I play a game in windowed mode, and while the game is running, if I switch to another workspace to take notes in gvim, the screen flickers so badly that the text becomes hard to read. Without the game running there is no flickering at all. Is there a video driver or X server setting I could tune to get rid of this flickering?
If your NVIDIA card is new enough to use the modern (non-legacy) drivers, you will very likely benefit from this article: Linux nVidia VSync, New Feature. While it deals more with screen tearing, I suspect yours is just an extreme example of that. It might also help to research how to fetch your monitor's EDID and load it by default, so that you get the most effective "Screen" parameters for your specific "Monitor".
Incidentally, as in all GUI work, the checkbox is only a front end for a CLI command. If your card is too old for the newest drivers, you may still be able to accomplish the same thing with the underlying command, like this:
Code:
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceCompositionPipeline = On }"
If your card is too old or too weak, it will cost you roughly $75-$100 USD to get a newer, stronger one of decent quality. If that is the case, trust me, it would be money well spent. Most people are surprised how much a more capable graphics card improves the entire PC experience.
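Regarding the EDID suggestion above, a sketch of how to fetch and decode it, assuming the read-edid package (available from SlackBuilds.org, not part of stock Slackware) is installed; the sysfs connector name below is only an example and varies per machine:

```shell
# Dump and decode the monitor's EDID with the read-edid tools
# (get-edid usually needs to run as root):
get-edid | parse-edid

# On drivers that use kernel modesetting, the EDID is also exposed
# in sysfs; the connector name (card0-HDMI-A-1 here) will differ:
parse-edid < /sys/class/drm/card0-HDMI-A-1/edid
```

parse-edid prints a ready-made Monitor section, modelines included, that can be pasted into xorg.conf.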
My apologies, I did not include the specs in the first post. I have an NVIDIA GeForce GT 730. I will read the article a bit later and post feedback here.
It might also help to research how to fetch your monitor's EDID and load it by default, so that you get the most effective "Screen" parameters for your specific "Monitor".
I've already been through that during the initial monitor setup. I believe I'm running the right modeline; it includes a '-hsync +vsync' part, if that matters.
Man, honestly, I think you're asking too much from the Linux gods.
I don't think things will end well when you run an application that makes heavy use of 3D, like a game, while also expecting your desktop's 3D effects to keep playing well.
Those things simply don't go well together; they interfere with each other.
For example, I have a GK110B, and I would guess you have a GK107 or GK208?
Since you are using an older NVIDIA card, have you considered trying nouveau? The caveats: a recent kernel and recent versions of libdrm and Mesa significantly improve its usefulness, and it will always be a somewhat imperfect driver, since NVIDIA does not release documentation or firmware in a timely manner. There will be some performance loss compared to the nvidia blob, but with nouveau's reclocking feature (which needs at least a 4.10 kernel for best support, I think) and your specific needs, you may not notice. Additionally, talking with the nouveau developers about potential issues is easy on IRC and/or their issue tracker, while effectively reporting issues to NVIDIA is nearly impossible unless you have connections.
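For anyone wanting to try that, a rough sketch of the switch, assuming the proprietary driver was installed with NVIDIA's .run installer (a distro-packaged driver would be removed with the package tools instead); verify the modprobe.d file name on your own system before deleting anything:

```shell
# Remove the proprietary driver; the .run installer ships an uninstaller:
nvidia-installer --uninstall

# The installer typically blacklists nouveau with a file like this one;
# remove it so the nouveau kernel module can load again:
rm -f /etc/modprobe.d/nvidia-installer-disable-nouveau.conf

# With no nvidia-specific Device section left in xorg.conf, the X server
# will autodetect and use the nouveau driver on the next boot.
```

On Slackware the nouveau kernel module and the xf86-video-nouveau X driver are already part of the stock installation, so nothing extra needs installing.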
I don't think things will end well when you run an application that makes heavy use of 3D, like a game, while also expecting your desktop's 3D effects to keep playing well.
You mean things not ending well as in side effects like flickering, or more serious issues like eventually getting my hardware toasted?
I think big DEs like KDE make use of 3D effects (I'm not sure about it), but do simple window managers do that as well? I'm a humble dwm user, ever since the days when I could fill up all my RAM with running apps and tried to save every megabyte of memory; nowadays I haven't managed to do that (yet).
This suggestion helped: the flickering seems to be gone after I enabled Force Full Composition Pipeline in nvidia-settings. I will watch it closely for the next few days, and if all goes well, I'll make the change persistent by putting that command-line call somewhere. Marking the thread as solved.
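For persistence, one common approach is to set the metamode in an xorg.conf Screen section so the X server applies it at startup. A sketch, assuming a single screen and the usual /etc/X11/xorg.conf location; the Identifier is a placeholder. Note that Force Full Composition Pipeline (the checkbox ticked here) is a stricter variant of the ForceCompositionPipeline option shown earlier in the thread:

```
Section "Screen"
    Identifier "Screen0"
    # Applied by the nvidia driver when X starts; equivalent to the
    # "Force Full Composition Pipeline" checkbox in nvidia-settings.
    Option "metamodes" "nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
EndSection
```

Alternatively, the nvidia-settings command from earlier in the thread can simply be added to ~/.xinitrc before the line that execs dwm.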
As usual, not only did I manage to fix the issue with the help of this very user-friendly community, I also learned a few new things in the process. Thank you.
You mean things not ending well as in side effects like flickering, or more serious issues like eventually getting my hardware toasted?
I'd say it's all about side effects. Even in the simple case of a 3D game and a desktop both pushing the video driver, some kind of 3D acceleration is in use.
Quote:
Originally Posted by FlinchX
I think big DEs like KDE make use of 3D effects (I'm not sure about it), but do simple window managers do that as well? I'm a humble dwm user, ever since the days when I could fill up all my RAM with running apps and tried to save every megabyte of memory; nowadays I haven't managed to do that (yet).
I do not think dwm uses 3D effects, but any driver other than the pure modesetting driver uses some kind of hardware acceleration.
And a hint: NVIDIA cards have no dedicated 2D acceleration engine, and never have had one; they use the 3D engine for 2D as well.
Hence your video driver will touch the 3D engine, as demonstrated by "Force Full Composition Pipeline" affecting your game's flickering.