LinuxQuestions.org > Forums > Linux Forums > Linux - Software
Old 05-14-2013, 04:59 PM   #1
thebombzen
Member
 
Registered: Dec 2010
Location: Noneya Business
Distribution: Linux Mint
Posts: 56

Rep: Reputation: 5
Disable NVidia PowerMizer on a Linux Desktop (Why's it there in the first place?)


If I sound irritated, it's because I am! So be warned.

I recently learned that NVidia's power-saving feature, PowerMizer, is enabled on my desktop, and this irks me. Power saving should not be a priority on a desktop: desktops are always plugged into AC power and have good cooling compared to laptops.

That being said, here's a screenshot of my NVidia X Server's PowerMizer settings:

http://i.imgur.com/A7LbLm5.png

Notice that adaptive clocking is enabled, performance level is 0, and the PowerMizer settings dropdown is set to "Adaptive."

What's even more egregious is that these settings won't save. If I change the dropdown to "Prefer Maximum Performance" and then restart NVidia X Server Settings, it just resets to "Adaptive." I can't figure out how to turn off a power-saving setting on my desktop.

I searched this problem on Google and tried many of the fixes, but none of them worked, perhaps because most of them seemed targeted at laptops. I tried this:
Quote:
Create a file called /etc/modprobe.d/nvidia.conf with this in it:

options nvidia NVreg_RegistryDwords="PerfLevelSrc=0x2222"
I tried adding all of these to my xorg.conf under "Display" (at different times, of course):
Quote:
Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x3322; PowerMizerDefaultAC=0x1"
Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefaultAC=0x1"
Option "RegistryDwords" "PowerMizerEnable=0x1; PowerMizerLevel=0x1; PowerMizerLevelAC=0x1"
Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x3322; PowerMizerDefaultAC=0x1; PowerMizerLevel=0x1; PowerMizerLevelAC=0x1"
Option "RegistryDwords" "PowerMizerEnable=0x0; PerfLevelSrc=0x3322; PowerMizerDefaultAC=0x1; PowerMizerLevel=0x1; PowerMizerLevelAC=0x1"
Option "RegistryDwords" "PowerMizerEnable=0x0"
But none of these worked: the performance level was still locked at 0, adaptive clocking was still enabled, and the dropdown was still set to "Adaptive."
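For reference, NVIDIA's driver README documents "RegistryDwords" as an option of the xorg.conf Device section, not the Display subsection, so that may be worth rechecking. A minimal sketch of the placement (the Identifier string is just a placeholder; BusID and other entries omitted):

```
Section "Device"
    # Placeholder identifier; keep whatever your xorg.conf already uses
    Identifier "Nvidia Card"
    Driver     "nvidia"
    Option     "RegistryDwords" "PerfLevelSrc=0x2222"
EndSection
```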

Does anyone know how I can fix this stupid stupid power-saving load of garbage for my desktop? Thanks in advance for your reply.

Last edited by thebombzen; 05-14-2013 at 05:08 PM.
 
Old 05-14-2013, 05:27 PM   #2
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
What issues is it causing you, may I ask? I've not noticed any problems leaving it on "Adaptive".
However, one thing you could try is adding
Code:
Option "Coolbits" "1"
to your xorg.conf; since this enables overclocking, it may also make your choice stick.
The other thing to try is to "Save Current Configuration" before closing nvidia-settings.
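If editing xorg.conf itself keeps getting overwritten, a drop-in snippet may hold better: recent X servers also read files from /etc/X11/xorg.conf.d/ (path varies by distro, so this is an assumption to verify for Mint). A sketch, written to the current directory for illustration; on a real system you'd copy it to /etc/X11/xorg.conf.d/ as root:

```shell
# Create a small Device-section snippet instead of editing xorg.conf wholesale.
# "Nvidia Card" is a placeholder identifier; match your existing config.
cat > 20-nvidia-coolbits.conf <<'EOF'
Section "Device"
    Identifier "Nvidia Card"
    Driver     "nvidia"
    Option     "Coolbits" "1"
EndSection
EOF

# Show what was written, with line numbers
grep -n 'Coolbits' 20-nvidia-coolbits.conf
```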
By the way, if you leave your PC on for hours, running your card at full speed could make it very hot and cost you a few dollars more in electricity. The 8600 probably isn't a fast enough card to burst into flames like its stablemates have been known to, but I would still be even more careful about dust buildup if you're running it at 100% constantly.
 
1 member found this post helpful.
Old 05-15-2013, 01:02 PM   #3
thebombzen
Member
 
Registered: Dec 2010
Location: Noneya Business
Distribution: Linux Mint
Posts: 56

Original Poster
Rep: Reputation: 5
Quote:
Originally Posted by 273 View Post
What issues is it causing you, may I ask? I've not noticed any problems leaving it on "Adaptive".
I think it's slowing down the graphics on my machine. However, I don't know what the alternative looks like, because apparently this has been on the whole time.

Quote:
Originally Posted by 273 View Post
However, one thing you could try is adding
Code:
Option "Coolbits" "1"
to your xorg.conf; since this enables overclocking, it may also make your choice stick.
The other thing to try is to "Save Current Configuration" before closing nvidia-settings.
I tried both of these, to no avail.

Quote:
Originally Posted by 273 View Post
By the way, if you leave your PC on for hours, running your card at full speed could make it very hot and cost you a few dollars more in electricity. The 8600 probably isn't a fast enough card to burst into flames like its stablemates have been known to, but I would still be even more careful about dust buildup if you're running it at 100% constantly.
I'm not interested in running my card at full speed; I'm just interested in having it not clock down when I'm doing less-intensive stuff.
 
Old 05-15-2013, 01:09 PM   #4
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
I'm not sure that the card slowing down would affect anything. If you're running something that uses the card, it should stay at full speed until you stop; if it doesn't, I would call that a fault. If you're not running games or high-definition video, you don't need the card's full speed, so it runs slow.
I don't see a use case where it slowing down would cause problems.
 
Old 05-17-2013, 02:41 PM   #5
thebombzen
Member
 
Registered: Dec 2010
Location: Noneya Business
Distribution: Linux Mint
Posts: 56

Original Poster
Rep: Reputation: 5
I did a quick Google Images search for "NVidia X Server Settings PowerMizer" and got other people's screenshots, which appear to show multiple performance levels. My screen only has one performance level, which reads as my GPU's clock speed. So it's possible I've been an idiot: it's been effectively disabled this entire time, and the Adaptive/Maximum Performance setting does nothing because I only have one performance level. Is this likely?
 
Old 05-17-2013, 02:47 PM   #6
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
That is entirely possible, yes. Just check that the clock speed your card is rated at is the one it's shown to be running at. I'm not sure when the nVIDIA power settings came in, but thinking about it, my 9800 makes a big deal of it, so perhaps that was the first generation to have it in desktop cards.
 
Old 05-18-2013, 08:44 AM   #7
10110111
Member
 
Registered: Jun 2008
Location: St.-Petersburg, Russia
Distribution: (B)LFS, Ubuntu, SliTaz
Posts: 403

Rep: Reputation: 51
You indeed have only one performance level. See how it looks on my card, which has 4 levels: link. If I change from Adaptive to Maximum Performance, it goes to level 3 and stays there. On Adaptive, windows are jumpy when dragged, which improves as the level goes higher.
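You can also check from the command line: nvidia-settings exposes the available levels in its GPUPerfModes attribute as a semicolon-separated string. The mode string below is a fabricated example for a 4-level card, just to show how to count the levels:

```shell
# On a real system the string would come from:
#   nvidia-settings -q '[gpu:0]/GPUPerfModes' -t
# The value below is a made-up example for a card with 4 levels.
modes='perf=0, nvclock=169, memclock=100 ; perf=1, nvclock=275, memclock=301 ; perf=2, nvclock=400, memclock=600 ; perf=3, nvclock=550, memclock=900'

# Each ';'-separated field is one performance level, so count the fields:
echo "$modes" | awk -F';' '{print NF}'   # → 4
```

If that prints 1, the Adaptive/Maximum Performance dropdown has nothing to switch between.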
 
Old 05-18-2013, 10:43 AM   #8
thebombzen
Member
 
Registered: Dec 2010
Location: Noneya Business
Distribution: Linux Mint
Posts: 56

Original Poster
Rep: Reputation: 5
Quote:
Originally Posted by 10110111 View Post
You indeed have only one performance level. See how it looks on my card, which has 4 levels: link. If I change from Adaptive to Maximum Performance, it goes to level 3 and stays there. On Adaptive, windows are jumpy when dragged, which improves as the level goes higher.
Well now I just feel stupid. Anyway, marked as solved.
 
Old 05-18-2013, 10:46 AM   #9
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
The GUI, being the same for all cards, isn't all that clear on the point. I think it would have taken me a while to work it out.
 
  


Tags
nvidia, settings, xorg