LinuxQuestions.org
Linux - Hardware: This forum is for Hardware issues.
Old 12-14-2008, 07:57 PM   #16
salasi
Senior Member
 
Registered: Jul 2007
Location: Directly above centre of the earth, UK
Distribution: SuSE, plus some hopping
Posts: 4,070

Rep: Reputation: 897

This isn't the most difficult system in the world to cool, so you can probably do a few things "wrong" and still get a good result.

However, you should note that some of the video cards (looking at a review, specifically the Radeon HD 2400 Pro, although it sometimes varies from manufacturer to manufacturer) are passively cooled, whereas the 'higher' cards in that series have a fan. With a passively cooled card it is important to have some airflow locally; that isn't such a big problem with cards that have their own fan, where you just need to make sure that the fan sucks air from somewhere cool.

So, while there is a lot of stuff in this thread that is contentious (let's say well meant, but either only partially understood or badly expressed), if you get even a moderate amount of air flowing through this case, you'll be all right.
 
Old 12-14-2008, 08:25 PM   #17
Quakeboy02
Senior Member
 
Registered: Nov 2006
Distribution: Debian Linux 11 (Bullseye)
Posts: 3,407

Rep: Reputation: 141
Were you having problems before you added this new stuff? If not, then it's unlikely you'll have problems going forward, because you're not really adding that much. Don't over-think this stuff. You don't have to have a machine that sounds like a jet engine on afterburners in order to have one that is reliable.
 
Old 12-14-2008, 09:23 PM   #18
Quakeboy02
Senior Member
 
Registered: Nov 2006
Distribution: Debian Linux 11 (Bullseye)
Posts: 3,407

Rep: Reputation: 141
I thought I'd try to quantify this, and did some looking around on the internet. I found a site that seems to give the answer: http://www.comairrotron.com/engineering_notes_02.asp

The formula that I think applies is this one:

CFM = 3.16 x Watts / ΔT (°F)

If we say that we have 350 Watts of heat and we only want to allow the temperature in the case to rise by 10 degrees (i.e. from a room temperature of 72 °F to a case ambient temperature of 82 °F), then we would use the following in the formula:

3.16 * 350 / 10 gives us a need of about 110 CFM of airflow.

If we can stand a 20 degree temperature rise (we probably can), the requirement drops by half:

3.16 * 350 / 20 gives us a need of about 55 CFM of airflow.


Looking at the various fans, we see that noise is pretty much proportional to fan speed and inversely proportional to fan size. So, if I were in this situation and worried about getting the "best" fan, I'd first find out how much air the fan in my PSU moved. If I still needed more airflow, I'd add a case fan rated at between 55 CFM and 110 CFM minus what the PSU fan moves. IOW, if the PSU fan moves 40 CFM, then I'd need a case fan of between 15 CFM and 70 CFM.

This is rather simplistic, though, because if I put an 80 CFM fan in, it's likely to overwhelm the PSU fan to some extent. In that case, 80 CFM of total airflow would be suitable, so I'd get a 40 CFM case fan to match the PSU fan.
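The arithmetic above is easy to wrap in a small helper. This is just a sketch of the Comair Rotron rule of thumb quoted above; the function name and the example figures are mine, not from the thread:

```python
def required_cfm(watts, delta_t_f):
    """Airflow (CFM) needed to hold the case temperature rise to
    delta_t_f degrees Fahrenheit, per the rule CFM = 3.16 * W / dT."""
    return 3.16 * watts / delta_t_f

# 350 W of heat, 10 F allowable rise
print(round(required_cfm(350, 10), 1))  # about 110 CFM
# Relaxing to a 20 F rise halves the requirement
print(round(required_cfm(350, 20), 1))  # about 55 CFM
```

Note the rule is linear in heat load and inversely linear in allowable rise, which is why doubling the acceptable temperature rise exactly halves the required airflow.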
 
Old 12-14-2008, 10:19 PM   #19
lazlow
Senior Member
 
Registered: Jan 2006
Posts: 4,363

Rep: Reputation: 172
So are you saying 40CFM in or 40CFM out?
 
Old 12-15-2008, 02:21 AM   #20
pixellany
LQ Veteran
 
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Mint
Posts: 17,809

Rep: Reputation: 743
Quote:
Originally Posted by lazlow View Post
So are you saying 40CFM in or 40CFM out?
The airflow into the case has to be the same as the airflow out, but maybe that's not what you're asking.

The issue I see with multiple fan calculations is that the rating of a fan applies only at a particular pressure drop. Suppose I have an enclosure that is quite restricted: If I put a "40CFM" fan at the input, I might get only 30CFM of flow. But if I put another identical fan at the output, the flow would be higher.

By similar logic, adding more output fans does not automatically increase the airflow in proportion. With a restricted input, adding more output fans would--in the limit--do little except make more noise.
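The point that a fan's rating only holds at one pressure drop can be illustrated with a toy model (entirely my own assumption, not taken from any fan datasheet): take a linear fan curve, P = Pmax(1 - Q/Qrated), and a quadratic system resistance curve, P = kQ²; the fan operates where the two curves intersect.

```python
import math

def actual_flow(rated_cfm, max_static_pressure, k):
    """Flow at the intersection of a linear fan curve
    P = Pmax * (1 - Q / Qrated) and a system curve P = k * Q**2.
    Solves k*Q^2 + (Pmax/Qrated)*Q - Pmax = 0 for the positive root."""
    a = k
    b = max_static_pressure / rated_cfm
    c = -max_static_pressure
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

# A "40 CFM" fan on a restrictive enclosure moves noticeably less...
one = actual_flow(40, 3.0, 0.00083)
# ...and a second identical fan (roughly doubling the available
# pressure) raises the flow, but nowhere near doubles it.
two = actual_flow(40, 6.0, 0.00083)
print(f"one fan: {one:.1f} CFM, push-pull pair: {two:.1f} CFM")
```

With these made-up constants the single fan delivers about 30 CFM and the pair roughly 34 CFM, matching the qualitative point above: ratings are optimistic once the case restricts flow, and extra fans give diminishing returns.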
 
Old 12-15-2008, 02:27 AM   #21
lazlow
Senior Member
 
Registered: Jan 2006
Posts: 4,363

Rep: Reputation: 172
Yep, that is what I was checking; I wanted to make sure what Quake meant. Too many times I see people setting fans as exhaust only and wondering why they have a hot case. I will stick to my guns about wanting slightly more input than output, for the reasons stated earlier in the thread.
 
Old 12-15-2008, 07:40 AM   #22
salasi
Senior Member
 
Registered: Jul 2007
Location: Directly above centre of the earth, UK
Distribution: SuSE, plus some hopping
Posts: 4,070

Rep: Reputation: 897
Quote:
Originally Posted by lazlow View Post
Generally you want more air (CFM) of fresh air coming in than hot air being exhausted.
OK, I'm going to bite (a little):

I don't think that, in saying you want more air coming in than exhausted, you mean that you want the case to somehow absorb air; given that air will tend to expand as it heats up, it seems that you meant that the air should be exhausted under pressure. Could you please explain that more, because it makes only a limited amount of sense to me: increasing the pressure in the case could, theoretically, be constructive, but the accompanying decrease in airflow probably loses more than the increase in pressure gains.

Or were you making the mistake that pixellany described, and assuming that fans actually move their rated number of CFM? Even ignoring the optimism of fan manufacturers (which isn't negligible), that rating applies at one level of back pressure, and that's almost certainly not the level of back pressure that you have.

pixellany wrote:
Quote:
First, good cooling means maximum (volume) airflow.
It would have been better to have written 'mass of air' rather than 'volume'; had you done that, you could probably have ignored, to a first order, the issues of the pressure of the air. Of course, it is also an advantage to use cool air, rather than hot, for cooling things...

H_TeXMeX_H wrote:
Quote:
Another example of why a large vent is bad: If you take the side panel of the case off, your computer will likely overheat. This suggests that by doing this you destroy the difference in pressure as well as the airflow.
and got the reply (jiml8)
Quote:
Disruption of optimal airflow is by itself probably sufficient to explain the difference.
The H_TeXMeX_H comment about the side panel is clearly a bit puzzling; if you take the side panel off a 'pressurised' case, the air spills out of the side (bad, unless the sources of heat are unusually early in the heat path), and with a 'partial vacuum' case air gets sucked inwards instead. It is unclear to me whether this is what jim meant, but it does mean that there is no simple 'this means that' picture being demonstrated here.

Additionally, it is worth noting at this point that you can easily blow a jet of air; sucking a jet of air is trickier. In this circumstance, you could be blowing a jet of air at a part that particularly needed it.

Quote:
Originally Posted by lazlow View Post
...more air coming in than hot air being exhausted. The reason for this is rather simple. If you draw more air out through the exhaust than the intake is providing you can generate (with modern PSUs) a situation where you are drawing hot air in through the PSU. This situation will cause problems across multiple fronts. First it will disrupt your air flow, which lowers the cases ability to remove heat from the components that need cooling. Second you are effectively heating up the case.
Are you suggesting that you can actually reverse the direction of airflow through the power supply? Apart from the passively cooled power supplies (which can be a big problem in other ways if applied in the wrong situation), I haven't seen the airflow actually reverse, even if it may well make the PSU fan work a little harder than necessary (which, while not ideal, doesn't seem to cause a real problem, provided it still keeps a reasonable amount of air flowing in the right direction).

To try to get away from 'theory' and back to stuff that might conceivably be useful to the OP, let me try to give some useful pragmatic advice.
  • Build it and see. If you have any heat problems or hot spots, deal with it.
  • It is easier, of course, if you have a way of measuring temperatures; optical pyrometers are really quite handy, but, failing that, you are reduced to reading temperatures in, e.g., the BIOS, which doesn't give you coverage of very many points.
  • The video card(s) that you seem to have selected is/are an interesting choice; they were not the highest-performance cards when introduced, and that was ~18 months ago, so it is unclear what they would do for you apart from allowing you the option of playing old games (fine, if that is exactly what you want). Assuming that you have some very cheap source for the card, at least they don't use as much power or dissipate as much heat as some of the really high-performance ones.
  • I'm assuming that you have the standard AMD CPU fan; this isn't a very good thing. It's not the best-performing fan thermally (it's OK-ish, and suitable for undemanding applications, but not the best), and the noise it makes is a bit raw and irritating. If either the noise or the temperatures are a bit irritating, a third-party option could be better. It depends a bit on how concerned you are about noise and how successful the case is at suppressing it.
  • You seem to have a large collection of small drives in the case; some of these are PATA (at least one, and probably the optical drive too) and PATA cables are much more obstructive to airflow than SATA ones. There is, therefore, a good argument to be made for taking some care to ensure that these are dressed in such a way that they do not obstruct the airflow too much.
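On the 'way of measuring temperatures' point: on a modern Linux box you don't need to reboot into the BIOS, since the kernel exposes most motherboard, CPU and drive sensors through the hwmon sysfs interface (readable with lm-sensors, or directly as below). A minimal sketch; the paths follow the standard hwmon layout and the helper names are mine:

```python
from pathlib import Path

def millic_to_c(raw):
    """hwmon tempN_input files hold millidegrees Celsius as text."""
    return int(raw.strip()) / 1000.0

def read_hwmon_temps(root="/sys/class/hwmon"):
    """Return {(chip_name, sensor_file): degrees_C} for every sensor."""
    temps = {}
    for chip in Path(root).glob("hwmon*"):
        name_file = chip / "name"
        name = name_file.read_text().strip() if name_file.exists() else chip.name
        for sensor in chip.glob("temp*_input"):
            temps[(name, sensor.name)] = millic_to_c(sensor.read_text())
    return temps

if __name__ == "__main__":
    for (chip, sensor), deg in sorted(read_hwmon_temps().items()):
        print(f"{chip}/{sensor}: {deg:.1f} C")
```

Polling this while the machine is under load gives far more coverage than a one-off reading in the BIOS, and lets you see how individual components respond to fan changes.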
 
Old 12-15-2008, 11:40 AM   #23
jiml8
Senior Member
 
Registered: Sep 2003
Posts: 3,171

Rep: Reputation: 116
Proper thermal design of a computer system is not a trivial undertaking though it is far simpler in the home computer environment than it is in, for instance, an avionics box.

Prebuilt computers from reputable manufacturers such as Dell will have undergone significant thermal design and if you take the covers off and look at it, you will see this. Airflow is matched to component placement in such a manner that minimal fans are needed and the airflow will cross the necessary parts of the internal system, and the system will be very quiet.

A "build it yourself" system has essentially NO thermal design incorporated, and everything is ad-hoc and brute force. This is inevitable because the build it yourself system will have components placed in it willy-nilly, and the layout of those components can't be well known in advance. The manufacturers of the better cases used for build it yourself will definitely do their best to optimize airflow, but they can only do so much.

My workstation is housed in an Antec P182 case. This case is well done from a thermal standpoint. The power supply exists in its own duct isolated from the rest of the electronics and air is pushed past it by a (possibly redundant) fan that drives the duct and pulled past it by its own fan. I have no problems keeping the PS cool in this setup and have turned the PS fan way down as a result.

The inlet air area is all fan driven on this case (the product as purchased didn't include fans, but I added them as an option because I was putting a LOT of stuff in the box). I have a 120mm fan driving one inlet, and a hard drive cooler (with three little fans on it) occupying a 5.25" drive bay with my 3.5" system drive installed in it. I am running the 120mm inlet fan wide open, and even so the system is running at a slightly negative pressure.

The exhaust area is driven by 2 120mm fans, one on the top at the rear of the case, the other on the back at the top of the case. This seems to work well. I have one of these fans set to "medium" speed and one to "low" speed (3 speed fans).

I also have no ribbon cables in the box; I replaced all IDE cables with round cables, and my SCSI cables are all round cables.

I have internal fans on the processor, the video card, and the northbridge chip. I also installed a vantec "fan on a stalk" to reach a dead air space on the motherboard and provide air motion around it.

This system has an Asus motherboard with an overclocked Athlon XP2700 processor, 2 Gigs RAM, 1 DVD player, 1 DVD burner, 1 Zip drive, 5 SCSI hard drives, an Adaptec SCSI controller, 2 NICs, a modem, an nVidia 7800GS AGP video card, and a TV card in it. The Athlon wants to run very hot, and overclocking it just makes it want to be hotter.

However, with this setup, in a 72F (23C) room, the Athlon when fully busy runs at 47C, the video card at 44C and the northbridge at 35C.

I can raise all those temps by about 2-3C by turning off the fan on a stalk. I can reduce them all by about 2-3C by turning the exhaust fans on full, but only at the price of noise.

My system drive, a Fujitsu 147 Gig SCSI drive which has the three little fans blowing across it, consistently reports its temperature as 25-26C. The other drives in the box all report themselves as about 33-38C.

This system makes a moderate shhhhhh noise of moving air, but the major noise out of it comes from one very loud hard drive that is still in it (a Seagate Barracuda 50 Gig SCSI drive).

The point is that I have gone here with a combination of decent engineering (done by Antec) and brute force (done by me) to keep the system cool. I have succeeded, but I have a lot of fans.

If you want the system to be cool without water cooling, you have to move air across the components that get hot. This means adequate ventilation area on both inlets and outlets, and an internal design that either minimizes obstructions or provides supplementary means to get airflow to obstructed areas.
 
Old 12-15-2008, 01:02 PM   #24
Quakeboy02
Senior Member
 
Registered: Nov 2006
Distribution: Debian Linux 11 (Bullseye)
Posts: 3,407

Rep: Reputation: 141
I have to tell you guys that reading this thread would make a person think twice about building their own system. There is simply too much FUD being peddled in this thread. Put the motherboard in, put a fan on the case, and go. It really isn't any harder than that. But, to those who say it is, I challenge you to show me articles about computers that have died due to poor case ventilation. There should be loads of them. There aren't. It's really not that complicated, guys!
 
Old 12-15-2008, 02:04 PM   #25
jiml8
Senior Member
 
Registered: Sep 2003
Posts: 3,171

Rep: Reputation: 116
Quote:
Originally Posted by Quakeboy02 View Post
I have to tell you guys that reading this thread would make a person think twice about building their own system. There is simply too much FUD being peddled in this thread. Put the motherboard in, put a fan on the case, and go. It really isn't any harder than that. But, to those who say it is, I challenge you to show me articles about computers that have died due to poor case ventilation. There should be loads of them. There aren't. It's really not that complicated, guys!
I have had exactly that happen. I diagnosed the cause of failure when I removed the processor chip from the mobo and saw burn marks on the back of it. The problem wasn't a bad fan, it was insufficient ventilation.

After that, I started paying a LOT more attention to thermal characteristics. I had known my mid-tower was crowded but had not realized that the problem was so severe.

From that point on, until I repackaged, I ran the box with the side cover off and a fan sitting a few feet away (I mean a square room fan, about 24" on a side) to blow air into the box.

And I agree. It isn't that complicated. But it isn't trivial either, particularly if the system becomes large. I entered the thread to counter statements implying that significant adiabatic expansion would occur if air was pulled into the box through a small hole, because this just won't happen. The rule is to make sure there is enough ventilation and enough air moving. Do that and you'll be fine.
 
Old 12-15-2008, 02:36 PM   #26
Quakeboy02
Senior Member
 
Registered: Nov 2006
Distribution: Debian Linux 11 (Bullseye)
Posts: 3,407

Rep: Reputation: 141
Quote:
Originally Posted by jiml8 View Post
I have had exactly that happen. I diagnosed the cause of failure when I removed the processor chip from the mobo and saw burn marks on the back of it. The problem wasn't a bad fan, it was insufficient ventilation.
I understand, and might have made the same diagnosis. But I will note that there are probably going to be burn marks on any CPU that fails thermally, because the things just put out so much heat these days. It would be interesting to know whether you used the thermal tape that came on the cooler, or something similar to Arctic Silver.

Athlons run hot and have a physically tiny footprint on the cooler. Once during routine messing about I noticed that the cooling fins (narrow OEM type) were completely clogged on my XP2500+. It never skipped a beat.

Quote:
I entered the thread to counter statements implying that significant adiabatic expansion would occur if air was pulled into the box through a small hole, because this just won't happen. The rule is to make sure there is enough ventilation and enough air moving. Do that and you'll be fine.
This was essentially my reason for entering this thread, too, and the reason I attempted to quantify the problem. For the average user, I still don't see a problem. I have one "pretty" aluminum case where the drives are too close together (about 3/16"), and I've suffered one drive failure there. Then again, the drive was old, so who knows? With a dual-disk, DVD, video-board, dual- or even quad-core system, it's hard to imagine getting into problems unless you disconnect all the fans just to see it happen.
 
Old 12-15-2008, 02:55 PM   #27
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Rep: Reputation: 1301
@ Quakeboy02
I actually agree with you; there's plenty of FUD in this thread. All you really need (besides the CPU fan) is a single exhaust fan in the rear; if you have that, you're almost guaranteed not to have any heat issues. This, of course, depends on the size of the case, the mobo, the CPU, and the GPU. I can imagine some "uber" combinations that would cause overheating. Take a tiny case with no vents, put in 2 of the latest SLI-linked nvidia cards, put in a Phenom X4 9950 (or even better an Itanium 9152M), add a few SCSI HDDs, and have the PSU pump heat down into the case (like mine does)...
 
Old 12-15-2008, 03:12 PM   #28
pixellany
LQ Veteran
 
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Mint
Posts: 17,809

Rep: Reputation: 743
Quote:
Originally Posted by Quakeboy02 View Post
I have to tell you guys that reading this thread would make a person think twice about building their own system. There is simply too much FUD being peddled in this thread. Put the motherboard in, put a fan on the case, and go. It really isn't any harder than that. But, to those who say it is, I challenge you to show me articles about computers that have died due to poor case ventilation. There should be loads of them. There aren't. It's really not that complicated, guys!
I think "FUD" might be a bit strong... How about "picking fly specks out of pepper"? The OP started with a pretty detailed question, certainly a lot more than "Install MB, install fan, hit power", so we obliged with some detailed answers.

"died due to poor ventilation"? Well, not specifically, but I've certainly had the following:
  • Gradual degradation in performance + higher incidence of "gremlins".
    -----Solution: clean everything, check and oil fans
  • Shutdown (1)
    -----Solution: clean and oil the small fan on one of the motherboard chips
  • Shutdown (2)
    -----Solution: Remove and reinstall CPU with new thermal compound.
  • CPU temps too high
    -----Solution: Adjust mounting of CPU cooler fan

Any one of these would eventually lead to failure of some component.

Last edited by pixellany; 12-15-2008 at 03:14 PM.
 
Old 12-15-2008, 03:18 PM   #29
lazlow
Senior Member
 
Registered: Jan 2006
Posts: 4,363

Rep: Reputation: 172
salasi

Your first question is answered by the second quote of mine that you posted. Note that I said modern PSUs. My Seasonics' cooling fans (some Antecs that I am aware of, too) will slow or stop as the PSU cools off. If there is negative pressure in the case, this will draw air in through the PSU (through its slowed or stopped fan). As fresh air is constantly being drawn in (by the negative pressure in the case), the PSU fans will essentially never kick in again. As the heat that would normally be exhausted is being drawn into the case, it heats up the case, and the incoming airflow (from the PSU) disrupts the "normal" airflow of the case. This can be (and has been) demonstrated by hanging a tissue by the internal openings of the PSU and replacing one side panel of the case with a sheet of Plexiglas. If the tissue is sucked up against the PSU, this indicates airflow in the exhaust direction; if the tissue billows out (think of a sail), it indicates airflow in the intake direction. The same method can be used in various ways to see what is going on with the airflow in the case (you can also use smoke).

Quake

I suspect that the reason a lot of us are concerned with heat is that we have been running machines for a long time (since the 70s, for me). We know that heat can kill in two ways. The first is the one that you always hear about: my X got hot and burned up. The second people do not always recognize: heat (not even over-spec heat) slowly damages electronics over time. The cooler (within reason) you can keep your equipment, the longer it will last. There is a reason that a lot of us have equipment around that is ancient but still functional. I have a lot of friends who have bought the exact same motherboards and CPUs as I have. There are two main differences. First, I watch my temperatures and make certain I get them as low as possible. Second, years after their equipment has failed, mine is still running. I generally give up equipment because it has become too slow to be of use, not because it has burned up. An example of this would be my 386sx16, which I gave up when I built my 3800x2. I moved my 486/33 into the 386's position and my PIII into the 486's position. I still have four 40 MB (not GB) fully functional drives that were bought (and installed) when 40 MB drives were in the sweet spot (price/size). When you see equipment regularly last longer than its sisters (powered from the same power company), there is a reason for it.
 
Old 12-15-2008, 06:29 PM   #30
onebuck
Moderator
 
Registered: Jan 2005
Location: Central Florida 20 minutes from Disney World
Distribution: Slackware®
Posts: 13,925
Blog Entries: 44

Rep: Reputation: 3159
Hi,
Quote:
Originally Posted by jiml8 View Post
Proper thermal design of a computer system is not a trivial undertaking though it is far simpler in the home computer environment than it is in, for instance, an avionics box.

Prebuilt computers from reputable manufacturers such as Dell will have undergone significant thermal design and if you take the covers off and look at it, you will see this. Airflow is matched to component placement in such a manner that minimal fans are needed and the airflow will cross the necessary parts of the internal system, and the system will be very quiet.

A "build it yourself" system has essentially NO thermal design incorporated, and everything is ad-hoc and brute force. This is inevitable because the build it yourself system will have components placed in it willy-nilly, and the layout of those components can't be well known in advance. The manufacturers of the better cases used for build it yourself will definitely do their best to optimize airflow, but they can only do so much.

My workstation is housed in an Antec P182 case. This case is well done from a thermal standpoint. The power supply exists in its own duct isolated from the rest of the electronics and air is pushed past it by a (possibly redundant) fan that drives the duct and pulled past it by its own fan. I have no problems keeping the PS cool in this setup and have turned the PS fan way down as a result.<SNIP>

The point is that I have gone here with a combination of decent engineering (done by Antec) and brute force (done by me) to keep the system cool. I have succeeded, but I have a lot of fans.

If you want the system to be cool without water cooling, you have to move air across the components that get hot. This means adequate ventilation area on both inlets and outlets, and an internal design the either minimizes obstructions or provides supplementary means to get airflow to obstructed areas.
Not to get into a 'pissing war', but there are differences between consumer-grade equipment and lab/industrial-grade equipment. Dell is considered consumer-grade, except for its enterprise equipment. A system designed for the general consumer is a fair design with minimal hardware, so thermal margining is limited. Hardware placement and mounting are done to pare the subsystems down to the bare minimum; look inside and you will see that the case is fairly open, with front vents along with the rear grills. Dell has been known to add vent covers to close off the case, along with hooding to remove processor heat. Plastic is cheaper than a case redesign.

Enterprise equipment design is totally different from consumer-grade equipment, for obvious reasons. Cable dressing within the cases is different: Dell uses a lot of clips along with mounts to keep the interior of the case clean. If you look at the rack-style equipment, you will appreciate the design even more. Redundant PSUs and processor racking, along with raceways for interconnection, really clean up the design.

As for the '"build it yourself" system has essentially NO thermal design incorporated, and everything is ad-hoc and brute force': generally yes. Not everyone is an engineer, so the design will be weak in some areas but possibly strong in others. Not everyone is versed in CFD or thermal design, even in the simplest form, so your brute-force statement is broad but applicable.
 
  

