Old 12-12-2015, 05:26 PM   #61
enorbet
Senior Member
 
Registered: Jun 2003
Location: Virginia
Distribution: Slackware = Main OpSys
Posts: 4,784

Rep: Reputation: 4434

@mattallmill - Primitive features? I hesitate to mention this because I don't want to insult you with something so basic but then again probably it is only so basic to me because I have used nVidia exclusively for over 20 years. I am referring to "nvidia-settings", initially run as root for system wide changes and the ability to write directly to "/etc/X11/xorg.conf". After it is set up you can look at "persistence", which has either a daemon, "nvidia-persistenced", or an ".rc" user file to maintain settings as permanent. The information included in nvidia-settings is extremely valuable (even GPU temps) and most configuration can be done right from it. I don't have ATi for a comparison, especially in Linux, but I'd be hard-pressed to imagine a more complete and useful utility. If you're already familiar with nvidia-settings and I'm "bringing coals to Newcastle", my sincere apologies, but I figure it is best not to leave stones unturned.
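
In case it helps, here is roughly how I set that up. The tools themselves (nvidia-xconfig, nvidia-settings --load-config-only, nvidia-persistenced) ship with the installer, but the ~/.xinitrc hook and the running order below are just my habit, so adjust to taste:
Code:
# generate a basic /etc/X11/xorg.conf for the nvidia driver (run as root)
nvidia-xconfig

# open the GUI as root, tweak, then use "Save to X Configuration File" for system-wide changes
nvidia-settings

# per-user tweaks are saved to ~/.nvidia-settings-rc; reapply them at X startup, e.g. from ~/.xinitrc
nvidia-settings --load-config-only &

# optional: keep driver state loaded between X sessions
nvidia-persistenced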

Also, while I know the Readme.txt is huge, around 0.5MB, it is also well organized and chock full of explanations and examples, so I can't imagine it wouldn't be helpful to you. If you've not seen it, you can place the "NVIDIA-foo.run" file in a convenient directory and run it with the "--extract-only" switch so that it simply unpacks itself into that directory.

Example:
Code:
sh ./NVIDIA-Linux-x86-352.63.run --extract-only
Hopefully, these tools will assist you in making your system exactly what you want it to be.
 
Old 12-12-2015, 05:55 PM   #62
1337_powerslacker
Member
 
Registered: Nov 2009
Location: Kansas, USA
Distribution: Slackware64-15.0
Posts: 862
Blog Entries: 9

Rep: Reputation: 592
Quote:
Originally Posted by enorbet
@mattallmill - Primitive features? I hesitate to mention this because I don't want to insult you with something so basic but then again probably it is only so basic to me because I have used nVidia exclusively for over 20 years. I am referring to "nvidia-settings", initially run as root for system wide changes and the ability to write directly to "/etc/X11/xorg.conf". After it is set up you can look at "persistence", which has either a daemon, "nvidia-persistenced", or an ".rc" user file to maintain settings as permanent. The information included in nvidia-settings is extremely valuable (even GPU temps) and most configuration can be done right from it. I don't have ATi for a comparison, especially in Linux, but I'd be hard-pressed to imagine a more complete and useful utility. If you're already familiar with nvidia-settings and I'm "bringing coals to Newcastle", my sincere apologies, but I figure it is best not to leave stones unturned.

Also, while I know the Readme.txt is huge, around 0.5MB, it is also well organized and chock full of explanations and examples, so I can't imagine it wouldn't be helpful to you. If you've not seen it, you can place the "NVIDIA-foo.run" file in a convenient directory and run it with the "--extract-only" switch so that it simply unpacks itself into that directory.

Example:
Code:
sh ./NVIDIA-Linux-x86-352.63.run --extract-only
Hopefully, these tools will assist you in making your system exactly what you want it to be.
Well, I ranted yesterday because I was so frustrated with getting Nvidia's driver to initialize the second card. However, I was able to get it to run, and after reflection I realize that the nvidia-settings program is Nvidia's equivalent to aticonfig, and has many of the same configuration options as well. I must confess I contemplated the unthinkable for the folks here: I briefly entertained thoughts of installing that other OS just so I could get my second monitor to run; I was that frustrated. But only briefly. In retrospect, the configuration utilities are not so bad, just different enough to really throw me off.
 
Old 12-12-2015, 06:04 PM   #63
genss
Member
 
Registered: Nov 2013
Posts: 741

Rep: Reputation: Disabled
ftp://download.nvidia.com/XFree86/Li...358.16/README/
see index.html
 
Old 12-12-2015, 06:36 PM   #64
enorbet
Senior Member
 
Registered: Jun 2003
Location: Virginia
Distribution: Slackware = Main OpSys
Posts: 4,784

Rep: Reputation: 4434
Quote:
Originally Posted by gezley
Just be careful with some of the high-end CPU fan coolers: if the fan(s) is/are side-mounted on a tower heatsink you might end up having little or no airflow over the capacitors surrounding the socket on the motherboard. The basic heatsink and fan supplied with many Intel processors, facing down, can actually be preferable if you don't have case fans to keep these caps (and the memory nearby) cool as well.
Somewhat true, which is why I mentioned at least one fan sucking air into the case at the lower front and at least one other blowing out at the upper back. Those capacitors are important because

Quote:
Originally Posted by eetimes.com
The rate at which the electrolytic evaporates is a strong function of the capacitor’s temperature. For every 10 degree Centigrade decrease in operating temperature, the capacitor life is extended by a factor of two
However, those capacitors have a rated lifespan spec'd at 105C (typically 1000-2000 hours), substantially higher than I hope anyone allows their equipment to reach. Those caps don't generate substantial heat themselves; it comes mostly from the nearby voltage regulator chips (commonly the ones with small, finned heatsink radiators bolted on), so it is important that there is good airflow over those chips.
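
As a back-of-the-envelope illustration of that doubling rule (the 105C/2000 hour rating and the 45C operating temperature below are just example figures, not the spec of any particular board):
Code:
# estimate electrolytic capacitor life from the "doubles every 10C cooler" rule of thumb
rated_temp=105    # C, datasheet rating
rated_life=2000   # hours at the rated temperature
op_temp=45        # C, assumed operating temperature

awk -v rt="$rated_temp" -v rl="$rated_life" -v ot="$op_temp" \
    'BEGIN { printf "estimated life: %.0f hours\n", rl * 2 ^ ((rt - ot) / 10) }'
# at 45C that works out to roughly 128000 hours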

If one desires a quiet as well as cool PC, it is advisable to use several larger, slower-rotating fans. IOW, get the airflow not with high rpms (a noisy way to increase CFM) but with larger-diameter, low-rpm fans, placed so that natural convection (heat rises) is assisted.

If you are a thermal freak like me, you can remove the side cover and use either a proper meter or even your fingers to find hot spots. It is not uncommon for manufacturers to skimp on fans and heatsinks. Northbridge and Southbridge chips used to be commonly neglected or under-cooled, and if you own such a mobo it is very cheap to fix, since high-performance copper heatsinks are available to improve cooling on those small but vastly important chips. Most come with double-sided thermal tape, which is not ideal but way better than no, or poor, heatsinks. Additionally, if you suspect poor airflow across the face of your mobo, that is easily solved, given moderate mechanical ability, by cutting a hole in the removable side panel (the one facing the plane of the mobo) and mounting a fan there, blowing directly across the entire face of the mobo.

I go to even further extremes, like lapping and polishing CPU and heatsink faces and employing high-performance thermal transfer compound, and the results speak for themselves. My main box, the one I'm currently typing on, is an Asrock Z77 Extreme with an i5-3550 CPU (w/ scaling disabled) and an nVidia GTX-760; right now I have 180 running processes and some 30+ tabs open in Firefox, yet my CPU is at 35C and even my GPU is at only 36C (basically a very warm Summer day). 60C is around the threshold of human pain, and I never want anything in my PC to get past 65C-70C, and it doesn't. 105C? No freakin' way! I WANT that 4000-16000 hour extension on MTBF, plus the data stability that low temps afford.
 
1 member found this post helpful.
Old 12-12-2015, 09:03 PM   #65
Gerard Lally
Senior Member
 
Registered: Sep 2009
Location: Leinster, IE
Distribution: Slackware, NetBSD
Posts: 2,177

Rep: Reputation: 1761
Quote:
Originally Posted by enorbet
I go to even further extremes, like lapping and polishing CPU and heatsink faces and employing high-performance thermal transfer compound, and the results speak for themselves.
(With apologies to the OP for taking this thread OT) enorbet: which thermal compound do you recommend? I'm building an AMD 8-core system in the near future (to handle multiple virtual machines). I've never been comfortable with liquid cooling, so I'm likely to go with one of the Noctua fans/heatsinks.
 
Old 12-12-2015, 09:41 PM   #66
1337_powerslacker
Member
 
Registered: Nov 2009
Location: Kansas, USA
Distribution: Slackware64-15.0
Posts: 862
Blog Entries: 9

Rep: Reputation: 592
Quote:
Thanks. I got the hardware in today, and am experiencing a multitude of unrelated issues; at first, I suspected that it was related to the new hardware, but recent events have led me to believe that it is just a result of having one of those days from hades. I'm taking a break right now, both to let my CPU and myself cool down; my CPU because I suspected it overheated because I left the case cover off too long, and myself because my patience is at an end, and I'm afraid I'll break something before too long if I continue on as I have been.

I'll update when I get some major progress made.
I found out what the cause of the freezing was, and it was not heat-related, as I first suspected. I have a case with 3 intake fans & 3 exhaust fans, and never before has my system exhibited any sort of instability. What I finally ended up doing was disabling Turbo mode in the motherboard's BIOS. I wanted to test both the CPU by itself and the CPU/GPU combination, so I a) compiled a new kernel (4.3.2; CPU-intensive), and b) played about 90 minutes of Darkplaces Quake w/ Epsilon with all settings turned up (CPU/GPU combo). Neither test showed my system to be the least bit unstable. So what I have concluded is that Turbo mode was causing the freezing. I can live without the extra speed, for the sake of stability.
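
For anyone who wants to repeat a similar check, those two tests roughly boil down to the following; the kernel source path, the -j count and the temperature queries are assumptions about my particular setup:
Code:
# CPU test: build a kernel with all cores loaded
cd /usr/src/linux && make -j8 bzImage modules

# watch temperatures while it runs
watch -n 5 sensors              # CPU/board sensors via lm_sensors
nvidia-settings -q gpucoretemp  # GPU temperature from the nvidia driver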

As a side note, I thought Quake looked beautiful with my old R9 270, albeit with stuttering/low frame rates in places. These graphics cards absolutely blow the old AMD card away in terms of graphical rendering and framerates. I turned on nightmare mode, and of course the graphical settings were maxed out. Even at the places where I specifically remember stuttering with my old Radeon card, there is absolutely none with the GTX 970s. At no time did my FPS dip below the mid-20s in the aforementioned scenes. If graphics cards had emotions, I would seriously think that the GTX 970 twins would be laughing at my pathetic attempt to stress them (as if that would happen).

Conclusion: Absolutely gorgeous rendering at buttery-smooth framerates. Yep, the graphics system upgrade was definitely a good purchase. I am grateful to have had the funds available.

@enorbet: You were right. It truly is worth it to upgrade the graphics system, as it really does balance out the load between CPU and GPUs.

Couldn't be a happier camper if I tried!

Regards,

Matt

Last edited by 1337_powerslacker; 12-12-2015 at 10:24 PM. Reason: Clarification
 
Old 12-13-2015, 02:28 AM   #67
kingbeowulf
Senior Member
 
Registered: Oct 2003
Location: WA
Distribution: Slackware
Posts: 1,266
Blog Entries: 11

Rep: Reputation: 744
Quote:
Originally Posted by TobiSGD
As a gamer, I never found SLI to be a solution I would try, simply because not every game runs faster with SLI, which means that investing the money you would need for the second card into getting a better single card is a more reliable solution. Not to mention that with modern games, in most cases the bottleneck isn't the GPU anymore but the CPU. For example, with the just-released Grid Autosport at 1920x1080 using the "Ultra" graphics preset, the GTX980 Ti is not even 5 FPS faster than my trusty old GTX760.
Indeed. My GTX660 wasn't even warming up and the fan stayed at 40%, in any game, on any engine, until I finally dumped the old wheezing Athlon64 X2 with 4GB RAM for an i7-5820K with 16GB RAM. A single-GPU solution is simplest, and can still run 2 monitors just fine.
 
Old 12-13-2015, 05:25 PM   #68
1337_powerslacker
Member
 
Registered: Nov 2009
Location: Kansas, USA
Distribution: Slackware64-15.0
Posts: 862
Blog Entries: 9

Rep: Reputation: 592
Quote:
Originally Posted by enorbet
@mattallmill - Primitive features? I hesitate to mention this because I don't want to insult you with something so basic but then again probably it is only so basic to me because I have used nVidia exclusively for over 20 years.
To be honest, my choice of graphics was influenced by a friend of a friend, who is a serious gamer. I mentioned my first choice of graphics card, an R9 380, and he immediately downplayed it, saying that Nvidia was a much better choice for gaming. As I have owned Nvidia cards in the past, I have nothing against them; I simply migrated over to Radeon over the years. What really threw me off, though, was having the past rush back to me. I remembered the awesome hardware & drivers, but had forgotten how much work Nvidia card configuration requires.

I'm not saying this because I am discouraged by the frustrations of configuration; I simply bring it up because it's been so many years since I last handled an Nvidia card. I must say I tremendously enjoyed the experience of hefting not one but two GTX 970s, and reflecting on how grateful I am to have this opportunity. Not for one minute will I let a relatively minor frustration like configuring dual monitors put me off the experience of pure graphical bliss. Did I mention before that Quake I looked much better than on my AMD card? Well, I was still getting 100+ FPS. I checked some more of the Darkplaces settings to see if I had truly maxed things out; I thought I had, because I had turned on realtime world lighting, but apparently there are a few more settings to adjust to enhance the experience. After fiddling with them, I could scarcely believe what my eyes were telling me. For a game released in 1996, there are truly some talented people out there dedicated to extending its life. It looks almost like a game designed recently, not in the mid-90s. You can tell it's an old game from its behavior, but the graphics are truly out of this world.

The Darkplaces engine can be configured to push high-end 2015 graphics cards and respect their capabilities; by that, I mean the game stresses the hardware the way the engineers designed it to be stressed. After all, they would hardly need to design new hardware constantly if games didn't constantly push the boundaries. It's a win-win-win situation: the hardware engineers win because they have a constant market for their new designs; the game developers win because they can use recently released hardware to make games that show it off to its fullest; and the customer wins because they can enjoy the efforts of both the engineer and the developer, and continue to raise their expectations for good game play.

As you can probably tell, I am crazy about this game, and consider the purchase of the graphics cards to be a sound one, now and for the long run. But getting familiar again with the configuration utilities is going to take some getting used to.
 
Old 12-13-2015, 06:20 PM   #69
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
I would seriously be surprised if a single GTX970 couldn't handle that game. Unless you are into 4K gaming, there is currently not a single game, even among AAA titles, that such a card would have any problems with.
 
Old 12-13-2015, 06:23 PM   #70
1337_powerslacker
Member
 
Registered: Nov 2009
Location: Kansas, USA
Distribution: Slackware64-15.0
Posts: 862
Blog Entries: 9

Rep: Reputation: 592
Quote:
Originally Posted by TobiSGD
I would seriously be surprised if a single GTX970 couldn't handle that game. Unless you are into 4K gaming, there is currently not a single game, even among AAA titles, that such a card would have any problems with.
Indeed you are correct, but again, I still enjoy the creature comfort of having 2 such cards in my system, even if they can't be used to their full potential just yet.
 
Old 12-13-2015, 06:32 PM   #71
Nille_kungen
Member
 
Registered: Jul 2005
Distribution: Slackware64-current
Posts: 587

Rep: Reputation: 201
Quote:
Originally Posted by kingbeowulf
Indeed. My GTX660 wasn't even warming up and the fan stayed at 40%, in any game, on any engine, until I finally dumped the old wheezing Athlon64 X2 with 4GB RAM for an i7-5820K with 16GB RAM. A single-GPU solution is simplest, and can still run 2 monitors just fine.
Well, it might not have been only the CPU then; it could also be the newer, faster memory and buses, since you changed the motherboard and memory as well.
If you go from a single memory module to several, things often get faster too, thanks to multi-channel operation (I don't know if that was the case for you).
I believe all of those changes together made the big difference.
No one should be surprised that changing from an old, pretty low-end CPU (by today's standards) to a new high-end CPU improves performance.
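
If you want to see how your modules are actually populated (and therefore whether you are getting multi-channel operation), something along these lines works on most boards; it needs root, and the exact field names vary by BIOS:
Code:
# list installed memory modules, their slots and speeds
dmidecode --type memory | grep -E 'Locator|Size|Speed|Channel'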
 
Old 12-13-2015, 06:34 PM   #72
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
Quote:
Originally Posted by mattallmill
Indeed you are correct, but again, I still enjoy the creature comfort of having 2 such cards in my system, even if they can't be used to their full potential just yet.
Of course it is up to you what to do with your money. I am just saying that I would have invested that extra $350 in something that gives you an actual advantage now, and not a "maybe" advantage in the future. Maybe a faster CPU, a couple of SSDs, or something like that.
 
Old 12-13-2015, 09:24 PM   #73
1337_powerslacker
Member
 
Registered: Nov 2009
Location: Kansas, USA
Distribution: Slackware64-15.0
Posts: 862
Blog Entries: 9

Rep: Reputation: 592
Quote:
Originally Posted by TobiSGD
Of course it is up to you what to do with your money. I am just saying that I would have invested that extra $350 in something that gives you an actual advantage now, and not a "maybe" advantage in the future. Maybe a faster CPU, a couple of SSDs, or something like that.
As I said before, SLI technology was something I'd always wanted to play with, and gaming is not the only application for these GPUs. To be honest, there isn't much use for a faster CPU (FX-8350 vs. FX-8320: a 0.5 GHz difference) or two SSDs (I am installing one tomorrow as my system disk). So yes, from a purely practical standpoint, it's a waste of money. However, I think my rig is now future-proof. I will admit that I did this partly out of desire, but name someone who, at one point or another, did not do something just because he/she wanted to rather than for practicality's sake, and I'll tell you that person is probably concealing something he/she does not want you to know.

Just my two cents.
 
Old 12-13-2015, 10:44 PM   #74
Richard Cranium
Senior Member
 
Registered: Apr 2009
Location: McKinney, Texas
Distribution: Slackware64 15.0
Posts: 3,858

Rep: Reputation: 2225
If everyone wanted the exact same things, the world would be a boring place.

Have a blast with your new system.
 
Old 12-14-2015, 12:00 AM   #75
kingbeowulf
Senior Member
 
Registered: Oct 2003
Location: WA
Distribution: Slackware
Posts: 1,266
Blog Entries: 11

Rep: Reputation: 744
FYI, I just posted updated Nvidia scripts to SBo for 14.1 (nvidia, legacy340, legacy304). These are updates for X.org server 1.18. If anyone uses them, please also test on -current. Let me know if nvidia-switch --remove still leaves floaters behind.
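
If anyone wants a quick way to look for floaters after switching back, something like this should do; the library and module paths are assumptions for a stock 64-bit install, so adjust for your system:
Code:
# switch back to the stock X.org/Mesa libraries, then look for stray nvidia files
nvidia-switch --remove
find /usr/lib64 /usr/lib64/xorg/modules -iname '*nvidia*' 2>/dev/null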
 
  

