How's this for weird.........
My system:
i5-3570K CPU
Nvidia 670 GPU
32 GB of DDR3 RAM
SSD
My PC is plugged into a UPS (along with a media server). When I'm in Windows 7, Linux Mint 18.1, Manjaro 17.0.1, or Ubuntu 16.04.2, just sitting at the desktop, the GPU temp is around 30 °C and the UPS wattage readout is around 270-300 W.
When I load Insurgency and start a map (I only see this problem in Linux), the GPU temp climbs almost instantly to 50+ °C, the wattage jumps from roughly 270 W to 470 W, and the UPS fan kicks in. If I minimize the game, the wattage drops off, the UPS fan stops, and everything is quiet. Go back into the game and the UPS fan kicks in again because the load climbs back into the 470 W range. I'm sure this is the GPU drawing more power.
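(For the record, the card can also be watched from a terminal while the game runs, as long as the proprietary driver is loaded. This is just a quick monitoring sketch; on a GeForce 600-series card the power.draw field may come back as "[Not Supported]", but temperature, clocks and the performance state should still report.)

# Poll temperature, SM clock, performance state, utilization and power draw once a second
nvidia-smi --query-gpu=timestamp,temperature.gpu,clocks.sm,pstate,utilization.gpu,power.draw --format=csv -l 1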
So in a nutshell, while in Windows and playing Insurgency:
GPU temp = 30-40 °C
Wattage = 311 W
UPS fan is off
Normal
In Mint/Manjaro/Ubuntu playing Insurgency:
GPU temp = 50+ °C
Wattage = 470 W
UPS fan kicks into high gear and things get noisy
Not normal. I don't remember having this problem a long time ago when I was playing Insurgency on Linux. I hadn't touched the game in months and months, decided to fire it up, and noticed all this weirdness.
It looks like the problem is the Linux Nvidia 375.39 driver pushing the GPU to its max performance state. (I'm fairly certain that's the driver version I installed on all three Linux distros.)
I'm going to boot back into Mint, switch to the open-source driver, and see if that fixes it. If so, maybe I'll manually install the 375.66 version or the 378.09 beta.
Anyone ever seen this before? Know of a fix?
UPDATE: Switching to the open-source driver fixed the "GPU out of control" problem, but my FPS dropped from 150 to 50.
I updated to 375.66 and got the same problem. I guess this explains why it used to work: I was on an earlier Nvidia driver back then. When I get some more time, it looks like I'll have to drop back to an older Nvidia driver. Why do GPU manufacturers write such shitty Linux drivers?
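(On the Mint/Ubuntu installs that probably means going back through the graphics-drivers PPA; Manjaro manages its drivers through mhwd, so it's a different procedure there. Something along these lines should do it, but the package names are from memory, so check what apt actually lists before installing.)

# Make sure the PPA carrying the packaged driver branches is enabled
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update

# See which driver branches are available
apt-cache search --names-only '^nvidia-[0-9]+$'

# Drop back to an older branch, e.g. the 367 series (pick whichever one last worked)
sudo apt-get install nvidia-367
sudo reboot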
On another note, is there a way to edit the .conf file to force the GPU not to go into "Max Performance" mode? I checked in the Nvidia settings applet and PowerMizer was set to Auto; I switched it to Adaptive and got the same result.
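(The only .conf approach I've seen is the unofficial RegistryDwords PowerMizer tweak that gets passed through xorg.conf to pin the card at a fixed performance level. It comes from forum posts rather than Nvidia's documentation and I haven't verified it against the 375.x drivers, so treat it as a sketch; pinning the lowest level will obviously cap the frame rate too.)

Section "Device"
    # Identifier is a placeholder; merge this into the existing Device section if there is one
    Identifier "Nvidia Card"
    Driver     "nvidia"
    # Unofficial, driver-version-dependent PowerMizer tweak:
    # PerfLevelSrc=0x2222 forces a fixed performance level,
    # PowerMizerLevel/PowerMizerDefaultAC=0x3 pin it to the lowest level on AC power
    Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerLevel=0x3; PowerMizerDefaultAC=0x3"
EndSection

This would go in /etc/X11/xorg.conf (or a snippet under /etc/X11/xorg.conf.d/), followed by a restart of X.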