LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Slackware (https://www.linuxquestions.org/questions/slackware-14/)
-   -   Time for a new motherboard (https://www.linuxquestions.org/questions/slackware-14/time-for-a-new-motherboard-4175610774/)

bassmadrigal 08-06-2017 12:06 PM

Quote:

Originally Posted by cwizardone (Post 5745190)
I could be wrong, wouldn't be the first time ;) , but I have had the impression Nvidia provides better Linux support than their competition.

I totally agree with you... with regard to proprietary drivers. But AMD does a lot of work on the open-source drivers (both the radeon and amdgpu drivers), and they are miles ahead of nouveau. AMD's proprietary drivers work great when the stars are aligned, but they seem to refuse to support the latest versions of X. This is why you can't use the fglrx driver in 14.2 (X is too new), and why you can't use amdgpu-pro in -current (again, X is too new). Nvidia tends to add support for the latest X fairly quickly and seems to have the widest range of hardware support. But, overall, AMD's open-source drivers are quite impressive and work great.
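If anyone wants to double-check which driver their GPU is actually using, something like this (plain pciutils; the flags are just an example) will show it:

Code:

# list the VGA controller and the kernel driver currently bound to it
lspci -k | grep -A 3 -i "vga"

The "Kernel driver in use" line will read i915, radeon, amdgpu, nouveau, or nvidia depending on what actually loaded.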

enorbet 08-06-2017 12:28 PM

Right here in the microcosm of this thread we have people touting Intel, AMD/ATi, and nVidia, and I think that's a good thing. What graphics system one prefers depends a great deal on how we use our computers, how much work we are willing to do to get what we want or need, and to some extent how much we are willing to spend to get it. Some don't seem to think graphics is very important, and I find that hard to understand since so much of how we relate and what we work with is vision-oriented, not to mention how much work is now shared between GPU and CPU in the SOHO/Desktop experience. I recall actually gasping out loud when I first booted up OS/2 with a then-new Matrox Millennium card because of how sharp and clear the fonts and colors were. Later I replaced that card with an nVidia card that didn't do as good a job in 2D, and I was only pleased with the tradeoff because 3D acceleration was hugely better on the nVidia card. Had I not also been a gamer, I would have gone back to the Matrox.

Now that so much more of the Web and even some WM/DEs employ advanced 3D work, the situation is not as simple unless you work in the CLI only. Then all you need is a good 2D card. If you use a WM/DE like Blackbox or its derivatives, 2D is still the better choice. If you like Gnome, KDE, or even Xfce or any of their derivatives, some 3D support is pretty much a requirement.
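A quick sanity check for whether hardware 3D acceleration is actually working (glxinfo ships with the Mesa utilities; the grep is just an example):

Code:

# "direct rendering: Yes" plus a renderer string that isn't llvmpipe
# means the GPU, not the CPU, is doing the 3D work
glxinfo | grep -i -E "direct rendering|opengl renderer"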

All I'm trying to say is that in threads like this we rarely know what others expect from their PCs or what experiences led to their particular brand loyalty. So the best advice is to know what you want to do, add a little guesswork about what you may want 5 years from now, and get what appears to deliver that, with an eye on the future. Buy at least a little bit of "headroom" or you will likely have hamstrung your work environment if not your wallet. If OP is sold on strictly onboard Intel, odds are that suits his purpose, assuming he is aware of what he may be giving up -- and that may be very little, considering we can buy cards today for $50 USD that outperform 10-year-old cards that cost 5 times that back then.

cwizardone 08-06-2017 12:56 PM

Well said!

upnort 08-06-2017 02:05 PM

Quote:

But, overall, AMD's open-source drivers are quite impressive and work great.
I was unaware of that, having used only Nvidia and Intel for the past many years. That does influence my thinking some. Thank you.

Quote:

Some don't seem to think graphics is very important, and I find that hard to understand since so much of how we relate and what we work with is vision-oriented
Yes, graphics are important, beginning with drivers that "just work." :D

For myself, as I don't do gaming, video rendering, 3D, CAD, or the like, I am likely looking for graphics that are "good enough." For the past many years all of my systems have used onboard graphics, and I have been content with that. Possibly, though, I don't know what I am missing by not using a discrete card. I am not against a discrete video card, especially if I want to tinker with GPU pass-through for VMs, which is not a high priority, but for the most part I think onboard graphics fits my use case well enough.

In the end, onboard or discrete, all I want is not to have to deal with quirks and glitches. I still remember the days of manually tweaking xorg.conf files and mode lines. The stuff of nightmares! :)
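For anyone who never had that "pleasure," a hand-written entry looked something like this (numbers purely illustrative -- standard VESA timings for 1024x768 at 75 Hz):

Code:

Section "Monitor"
    Identifier  "Monitor0"
    HorizSync   30.0 - 81.0
    VertRefresh 56.0 - 75.0
    # pixel clock and all eight sync/total values typed in by hand
    ModeLine "1024x768" 78.75 1024 1040 1136 1312 768 769 772 800
EndSection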

Quote:

All I'm trying to say is that in threads like this we rarely know what others expect from their PCs or what experiences led to their particular brand loyalty
There is no "loyalty" on my part, although I agree with the sentiment of what you said. For me the loyalty, if anything, is being able to use my computer. After 35 years of using them, I have grown weary of the continual tinkering required to get the damned things to "just work." ;)

Quote:

If OP is sold on strictly onboard Intel, odds are that suits his purpose, assuming he is aware of what he may be giving up -- and that may be very little, considering we can buy cards today for $50 USD that outperform 10-year-old cards that cost 5 times that back then.
I started this thread with a presumption of not wanting to deal, at all, with proprietary drivers. Embedded therein was the observation that, on the computers I own, the onboard Intel video has worked without issue, whereas my boards with Nvidia have been a PITA in one way or another. Here is an example.

I am aware that these types of problems differ across motherboards, GPUs, desktop environments, distros, and distro releases. The glitches people encounter are just maddening. And just in case somebody wants to play devil's advocate: yes, the same nonsense happens in Windows too -- just visit any Windows forum.

As this thread has progressed -- and I am grateful for all of the conversation, suggestions, and thought-provoking comments -- I have opened myself to not being as rigid as in my original post. At this point, in some ways, I am now more confused because there are so many variables, of which graphics is just one.

Of course, the budget is limited, which eliminates a hell-bent approach of trying dozens of different boards and GPU options. The challenge is finding a congenial sweet spot and potential candidates living therein.

In an oddball "Maslow's Hierarchy" kind of sense, I do not need a new office desktop. I would like one, though. My starting point seems to be simple: after using the same system for 10 years, just about anything I buy will be an order of magnitude or two faster. Possibly any discussion thereafter is little more than a classic separating of wheat from chaff. I don't know. :)

1337_powerslacker 08-06-2017 04:45 PM

Quote:

Originally Posted by enorbet (Post 5745197)
...the best advice is to know what you want to do, add a little guesswork about what you may want 5 years from now, and get what appears to deliver that, with an eye on the future. Buy at least a little bit of "headroom" or you will likely have hamstrung your work environment if not your wallet.

This may be a bit off-topic, but given that some of the focus of this thread is on longevity, I'd like to share my experiences with selecting components for the long haul.

I built most of what I have now in December of 2015 and have incrementally improved it since then, as funds have become available. I am looking at graduation from university in December of this year, and soon the government will want its money back. Given that finding a job in my chosen field is not a given, I find myself needing to hold on to the vintage FX-8370 processor I purchased some time ago. To that end, I recently purchased a higher-end air cooler, replacing the stock AMD Wraith cooler, to help keep load temps down as I compile software (kernel, etc.) or do some gaming.

I have seen the reviews of Ryzen, and while the Bulldozer/Piledriver architecture's glory days are long since past, for my workflow the 8-core 4.0GHz processor is more than able to meet my needs, both now and in the foreseeable future. This has the knock-on benefit of letting the new processor and motherboard landscape mature in terms of pricing/performance and Linux compatibility. Ryzen is still a brand-spanking-new platform, and not all the quirks have been worked out.

I see my FX processor doing useful work 5+ years into the future. By then, Ryzen should be well matured, and I sincerely hope AMD ends up giving Intel a serious run for its money with it. That would be major kick-ass!

Just my :twocents:

enorbet 08-06-2017 10:57 PM

Regarding AMD/ATi open-source drivers, I concur. I hadn't owned an ATi card since my ISA All-in-Wonder on a Tandy 8086, but recently I inherited one and was pleasantly shocked at how well it functioned, even for some YouTube videos.

While I am also a rather avid gamer, I do enjoy solid 3D acceleration for X compositor effects like animations, transparencies (and their control), etc. In short, it makes my desktop feel snappy and sets a tone for quick but deliberate working. I have never had problems with nVidia proprietary drivers that I can remember. In fact, their documentation and control set made it easy to fix a bad monitor (well, bad EDID) even back in the somewhat nightmarish days of modelines and dual-monitor difficulties. Somewhere between Slackware 12.2 and 13.0 it just got easy, with very little required in xorg.conf.

I do exercise considerable brand loyalty because, coming from OS/2 rather than Windows shortly after Warp 3, nVidia was the only company providing and updating graphics drivers once Matrox ceased after a while. When I migrated to Linux the same was true. So I am very loyal, and they've never let me down. On a philosophical level I think the GPL was intended to allow for a mix of open and proprietary, and I still think such a mix is important to the industry as a whole.

Regarding budget, that seems to me to be a non-issue for non-gamers, since the quality of even onboard graphics (despite a little extra noise) is very high these days. If one prefers the cooler, quieter environment possible with an add-in card, unless you consider 50 bucks a major expense it is absolutely phenomenal what level of performance can now be bought at that price. Even high-quality brand names like Asus, EVGA, MSI, Gigabyte, etc. offer selections in that 50-buck range that are either fanless or very quiet fan-cooled, sporting as much as 2GB of VRAM and Pascal cores. Most can outperform the venerable 8800 GTX that cost 300 bucks new back in the day. Such devices are even suitable for moderately taxing ultra-modern gaming, yet handle desktop effects and videos smooth as melted butter.

One word of caution - PCs of any quality experience more glitching if they run hot. Be cool! Whatever it takes. I know I'm a cooling nut, but I have a pro background in electronics and heat is The Enemy. My GTX 760 is currently running at 39C. Under severe strain it might top 53C at the very most, and that's with a 2016 game at medium-high settings. I strongly recommend setting up lm_sensors or, in the case of nVidia, nvidia-settings, to monitor temps. Know what you're asking of your PC.
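Getting those numbers is cheap. Assuming lm_sensors is installed and, for nVidia, the proprietary driver's tools are present, something along these lines does it (commands are examples; run as root where needed):

Code:

# one-time probe for the motherboard's sensor chips, then read them
sensors-detect
sensors

# GPU core temperature via the proprietary nVidia tools
nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader

The nvidia-settings GUI shows the same reading on its thermal settings page.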

upnort 08-07-2017 10:04 AM

Quote:

One word of caution - PCs of any quality experience more glitching if they run hot.
Good point. Heat means energy consumption, which is something I want to limit. Reducing energy consumption is one reason I began using onboard graphics many years ago.

Yesterday I noticed that on modern fanless discrete graphics cards the heat sinks are huge, as cwizardone mentioned. Looking a tad deeper, I noticed that larger PSUs are required/recommended even when the card is fanless. My guess is that even when the graphics card is fanless, a decent chassis fan is a must.

As I do not do anything that stresses graphics chips, I probably would not need to worry about the heat as critically as others. Nonetheless, a point to consider. :)

Regnad Kcin 08-07-2017 08:42 PM

I have some micro-mini PC devices that use heatsinks on the CPU and the power supply. They work OK with Windows 7 but get too hot running Slackware64. Such systems are OK for lab and office staff, but not OK for me.

My luggable machine has a 30cm x 40cm footprint and fits into a large laptop bag. It now has 5 fans. The 1000W power supply has a 12 cm fan, and there is a 13 cm fan blowing air down onto the m-ITX motherboard. The Asus nVidia 960 display card has its own 8 cm fan, and the Beijing DeepCool Captain 240 liquid CPU cooler has 2 x 12 cm fans.

The i7-7700k is running at 5 GHz and the memory at 3200 MHz. Since I added the liquid cooler it never goes into thermal throttling or becomes unstable.

Liquid cooling of the CPU is much more effective than I imagined it might be.

enorbet 08-08-2017 02:47 AM

@ upnort - Since we all know the cliche regarding "assume", I would very much like to know what temps your system reports under both average and high loads. Please do bear in mind that these points of heat are often literally microscopic and hundreds of times smaller than the sensors that monitor them (which are also a considerable distance away from those points). Just because a manufacturer advertises that their chip won't self-destruct until well over 100C doesn't mean it will last long or run with solid stability even many degrees below that level. There is just no good justification on a SOHO/Desktop machine for allowing temps to exceed ~60C... ever. If it's at 60C at the sensor, you can bet it is MUCH higher at the sources.

Regnad Kcin 08-08-2017 07:14 AM

@enorbet

I used to run the simple desktop monitor frequently until it quit working a few issues of -current ago.

I used my i7-4770k's 8 threads to run clustalo, aligning DNA sequences day after day; it would frequently get up to 100C when crunching those sequences, and it never failed. I gave it to my daughter when her older PC motherboard died recently and got this i7-7700k, which I have water-cooled. I put water cooling on the older i7-4770k for her and it goes great. No problems of any kind in more than 3 years, but I did burn up several power supplies until I started using 1000-watt power supplies. I use m-ITX motherboards in my personal machine because I lug it around Asia with me.

What you say seems to make sense but I ran my i7 in the red zone for 3 to 4 years and it just kept on truckin'.

My justification for running 'er hot is that we had work to do, and if we burned it up, well...

the3dfxdude 08-08-2017 10:38 AM

Quote:

Originally Posted by upnort (Post 5745481)
Good point. Heat means energy consumption, which is something I want to limit. Reducing energy consumption is one reason I began using onboard graphics many years ago.

Yesterday I noticed modern discrete fanless graphic cards that the heat sinks are huge, as cwizardone mentioned. Looking a tad bit deeper I notice that larger PSUs are required/recommended even when the card is fanless. My guess is even when the graphics card is fanless a decent chassis fan is a must.

As I do not do anything that stresses graphics chips, I probably would not need to worry about the heat as critically as others. Nonetheless, a consideration point. :)

Power and heat go hand-in-hand. When looking at a particular model number, find the power dissipation and you'll know roughly how much cooling you'll need. With a heat sink there is going to be some airflow required, but probably nothing more than a typical case can provide. So if you do get the higher-end video card, you'll need an appropriate power supply and case fans. I find case fans more reliable, quieter, and easier to replace. But I have to admit, I've run fanless cards (both Nvidia and ATI) since 2000. So you'd have to spend a little more to get a good GPU, and of course, don't try to shortcut and cheap out. But naturally, the passively cooled options are not going to be as expensive as the highest-end stuff, which requires localized heat removal.

I wish I could give a recommendation on what to try, but I've not bought a discrete video card in 7 years now. I've only purchased an Intel J1900 board in the last couple of years, and it works fine. It is a completely fanless system.

Regnad Kcin 08-08-2017 10:51 AM

I built two J1900 boxes. One drives an ELISA plate reader and the other is on the desk of a colleague. The colleague's unit is an ASRock board and the one in the lab is a Biostar. Both work fine, but the ASRock is by far the nicer unit.

enorbet 08-08-2017 02:10 PM

Of course it is a given that "power and heat go hand-in-hand," and it is also true that it is rather silly to own a 1000 hp muscle car for a simple urban commute. However, if you commonly need to do long hauls with lots of cargo, it's ridiculous to buy a Geo Metro for that. Apologies for the car analogies, but they do apply and we can all relate. Similarly, if you replace a 6-cylinder 200hp engine with a 400hp V8, you're flirting with disaster if you don't also replace/upgrade the radiator.

Back to the world of computers as engines: it is possible to get by for a time "running in the red zone," but know that you are depending mostly on luck. 60C = 140F, which is the human threshold of pain. Most will pull away quickly by 130F, and just about everyone except severely disturbed masochists will immediately draw back from anything over 60C/140F. Certainly silicon doesn't feel pain and can happily operate above that level, but consider that there is a reason Crays are liquid-cooled and large server farms invest heavily in cooling. In Ham Radio there is an old cliche that "a dollar's worth of antenna is worth 10 dollars of amplifier," and likewise mission-critical computing recognizes that a dollar in cooling is worth much more in hardware, not to mention the cost of inaccuracy or downtime. This is especially true if any mechanical devices still exist in your PC, such as hard drives and optical drives.

The very best fans one can buy generally cost under $40 USD and very good smaller ones can cost as low as 10 bucks. They draw very little current since electric motors are one of the most efficient engines in existence. The cost/benefit ratio is extremely favorable. Professional Gamblers aren't really gamblers. They're just really good at risk assessment. True Gamblers are more often than not Losers. You might be lucky but just know that if you have some agenda that doesn't include risk reduction you're playing against The House.

Knowing the risks and planning accordingly is never unwise.

the3dfxdude 08-08-2017 06:05 PM

Quote:

Originally Posted by enorbet (Post 5746119)
Back to the world of computers as engines: it is possible to get by for a time "running in the red zone," but know that you are depending mostly on luck. 60C = 140F, which is the human threshold of pain.

Actually, I'm quite aware of the limitations of ICs, having worked with thermal analysis firsthand. At the consumer level, knowing the required power dissipation (it can usually be found for performance parts like video cards) is a good idea when building your own. However, as a consumer you don't manufacture the card (at least I don't), so some amount of trust needs to be developed. For somebody who doesn't know what that means in terms of cooling requirements, do a little comparison: pick out a device, find some card manufacturers, and compare their heat sinks and fans. You'll get a relative sense of who's short-changing you with an ill-equipped part. I've personally never had an issue picking a good one.

Now, on the other hand, I did once see an Nvidia device (15+ years ago) where someone had issues with their machine freezing in games. Passive cooling was probably more common on the really cheap stuff back then. Well, I took a look at the card and found about the oddest heat sink I'd ever seen. Because that seemed so suspect, I put a fan on it, ran the game, and then no more problem...

Richard Cranium 08-08-2017 06:11 PM

Quote:

Originally Posted by the3dfxdude (Post 5746188)
Now, on the other hand, I did once see an Nvidia device (15+ years ago) where someone had issues with their machine freezing in games. Passive cooling was probably more common on the really cheap stuff back then. Well, I took a look at the card and found about the oddest heat sink I'd ever seen. Because that seemed so suspect, I put a fan on it, ran the game, and then no more problem...

I had an SLI setup a couple of years ago. While playing a game that was fairly heavy in the 3D department, the game would slow down drastically after about 5 minutes of play. It turns out one of the two cards had a sticky fan. Normal use wouldn't overheat the card, but that game would. One card would shut down due to overheating, and the other one really couldn't handle the game's requirements by itself.

the3dfxdude 08-08-2017 06:38 PM

Quote:

Originally Posted by Richard Cranium (Post 5746190)
I had an SLI setup a couple of years ago. While playing a game that was fairly heavy in the 3D department, the game would slow down drastically after about 5 minutes of play. It turns out one of the two cards had a sticky fan. Normal use wouldn't overheat the card, but that game would. One card would shut down due to overheating, and the other one really couldn't handle the game's requirements by itself.

Yeah, that's why I go the passive route. Your card was spec'd such that it relied on active cooling at the higher clock speeds. And then you have to keep them clean. I go for the really big heat sink, plus case fans if needed. Easier to clean, easier to replace. Good ones stay pretty quiet. And usually the heat sink is spec'd well, since that's how they tested it. I'd probably have to go water cooling to keep myself happy if I ever build a "muscle" computer. But I'm not really interested in builds like that.

upnort 08-08-2017 09:54 PM

Everybody here has been a huge help. An enjoyable thread! I learned a bit too, which is always nice. :)

I think I decided on a board:

ASUS Z170-K LGA 1151.

Has 2 PS/2 ports, 6 SATA III ports, 2 standard PCI slots, and DVI-D, HDMI, VGA. Yeah, I forgot in my original post that I have a PCI capture card. That PCI slot requirement eliminated many boards.

For another $20 I could go with an i5-6500, but I am going with an i5-6400 Skylake 2.7 GHz with Intel HD Graphics 530. It might seem somewhat "low end" spec-wise, but it will be oodles faster than the current rig and will be nicer on the electric bill.

I am increasing RAM to 16 GB in case I start running more than two VMs concurrently. The quad core will help with that too.

The budget AMD APUs were tempting. After more research I got the feeling I would prefer more CPU muscle for VMs and compiling.

Curiously, the WD hard drive in my current system is SATA III but the mobo is SATA II. I will see a faster hard drive response too. :D (SSDs are a topic for another thread!)

Richard Cranium 08-08-2017 10:40 PM

Quote:

Originally Posted by upnort (Post 5746227)
Curiously, the WD hard drive in my current system is SATA III but the mobo is SATA II. I will see a faster hard drive response too. :D

Well, the head seek time won't change. You'll see faster data transfer rates for sure, but initial reads shouldn't be that different.

Hey, if I'm lying, you'll be happy. :)
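If you want to see how much of that is real on your own hardware, a rough before/after comparison is easy (the device name is just an example; run as root):

Code:

# -T: cached reads (memory/bus), -t: buffered sequential reads from the disk
hdparm -tT /dev/sda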

upnort 08-08-2017 11:22 PM

Quote:

Hey, if I'm lying, you'll be happy.
I remember when I had SATA II drives on SATA I ports and updated to a SATA II motherboard. Dramatic increase, no, but still noticeable, especially with file transfers between drives. :)

cwizardone 08-09-2017 03:33 AM

Quote:

Originally Posted by upnort (Post 5746227)
....I think I decided on a board:

ASUS Z170-K LGA 1151.......

Very good choice!
:)
Anything to be gained (or lost) by going with the newer Intel Z270 chipset?

upnort 08-09-2017 10:03 AM

Quote:

Anything to be gained (or lost) by going with the newer Intel Z270 chipset?
About $80 higher price and too new (cutting edge)? :)

upnort 09-26-2017 11:23 PM

I finally got my new ASUS Z170-K motherboard installed. :D

The previous ASUS M3N78-EM motherboard is now in the LAN server. For some oddball reason, network traffic to client systems now seems snappier from the server. Same drives, same 1 Gbps. Another 4 GB of RAM, so perhaps more disk caching is the difference.

The SATA III drives now finally run at full SATA III speeds. I'm running the EFI in legacy mode. I have no motivation to reformat the hard drives with an EFI partition. :)

The new on-board HD Graphics 530 is much faster than the previous on-board NVidia 8300. Having two more cores in the CPU is nice. Should make a noticeable change with VMs.

A few things I have not resolved.

1) I seem unable to get any fan or temperature sensor info. The UEFI seems to already control fan speed, thus I don't know that I am missing anything other than not having a nice conky display. Any ideas how to get lm-sensors and pwmconfig working on this motherboard? (After surfing the web for info I tried loading the nct6775 module. No change in the sensors-detect output.) Is this a problem with 14.2 and the 4.4 kernel? I tried a recent LiveSlak ISO with a 4.9 kernel and fared no better.

sensors command output:
Code:

asus-isa-0000
Adapter: ISA adapter
cpu_fan:        0 RPM

coretemp-isa-0000
Adapter: ISA adapter
Physical id 0:  +26.0 C  (high = +84.0 C, crit = +100.0 C)
Core 0:        +24.0 C  (high = +84.0 C, crit = +100.0 C)
Core 1:        +24.0 C  (high = +84.0 C, crit = +100.0 C)
Core 2:        +21.0 C  (high = +84.0 C, crit = +100.0 C)
Core 3:        +21.0 C  (high = +84.0 C, crit = +100.0 C)

The coretemp module is loaded automatically on boot.

2) Not a bug or nuisance, but I do not see any Tux logos when I boot. Same hard drives. Same Slackware 14.2 64-bit. Would have been nice to see 4 Tuxes. :)

3) Is there a way to save and restore the UEFI config? flashrom? I haven't thought about this kind of thing in years, but the CMOS battery that came with the motherboard failed me twice. I replaced it, but the loss of configs meant doing everything from scratch. I did not notice anything in the UEFI interface to save/restore, but I can't see the end of my nose either.

Otherwise I am quite happy with the new board. Thanks to all for the help!

Didier Spaier 09-27-2017 02:55 AM

Quote:

Originally Posted by upnort (Post 5763395)
2) Not a bug or nuisance, but I do not see any Tux logos when I boot. Same hard drives. Same Slackware 14.2 64-bit. Would have been nice to see 4 Tuxes. :)

A framebuffer is needed to draw the small animals. If you have "vga = normal" in /etc/lilo.conf that would explain why you don't see them.
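A minimal sketch of what that looks like (the mode number is only an example; 791 is 1024x768 at 16 bpp):

Code:

# in /etc/lilo.conf, replace "vga = normal" with a framebuffer mode
vga = 791        # or "vga = ask" to choose a mode at boot

# then, as root, re-run lilo so the change is written out
lilo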

Oh, and to be picky: s/UEFI/firmware/. Your firmware is UEFI-capable, but you are using it in Legacy (aka BIOS) mode through its Compatibility Support Module (CSM).

upnort 09-27-2017 10:03 AM

Regarding the sensors, I discovered the nct6775 module does work, but the acpi_enforce_resources=lax boot option is needed. I haven't decided whether tinkering with fan speeds is worth the bother -- the system is silent and the Asus Q-Fan feature seems to handle fan speeds almost exactly like pwmconfig.
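For the archives, that amounted to roughly the following (assuming a stock GRUB setup; the GRUB_CMDLINE_LINUX_DEFAULT line may already carry other options):

Code:

# in /etc/default/grub, add the parameter to the kernel command line
GRUB_CMDLINE_LINUX_DEFAULT="acpi_enforce_resources=lax"

# regenerate grub.cfg, then load the Super I/O sensor driver
grub-mkconfig -o /boot/grub/grub.cfg
modprobe nct6775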

Quote:

A framebuffer is needed to draw the small animals. If you have "vga = normal" in /etc/lilo.conf that would explain why you don't see them.
No such option in my GRUB config. That I don't see the critters is just a curiosity and not a show stopper. As I shared, it's the same drive, just moved to a new motherboard. I run with logo.nologo, but with this new system I wanted to see four Tuxes at least once and had temporarily disabled that boot option. I wanted to say, with a British accent, "There are FOUR Tuxes!" :D

upnort 10-21-2017 12:12 AM

For the three people who might be interested, or less, a summary of installing and configuring the new motherboard.

