LinuxQuestions.org
Old 01-11-2010, 03:09 PM   #1
nbn792
Member
 
Registered: Dec 2003
Distribution: Debian
Posts: 109

Rep: Reputation: 15
debian HPC hardware nvidia gpu


I am trying to increase my parallel computational abilities for some dynamic-systems mathematical research. I am deciding whether to purchase one of Nvidia's GPU cards for my Intel Core 2 Quad, or to enlarge my 5-node cluster of mixed Debian Linux boxes with some headless nodes.

My main concern is how the GPU will appear to the system. Will the system be able to fork processes natively to the GPU? Has anyone successfully run an Nvidia GPU as a computational aid with Debian?

I have used Debian for years and am reluctant to switch to any other distro, especially Red Hat, which seems to be the one Nvidia supports, blah. But with my research taking the path it is, my computing system needs the maximum number of simultaneous processors available, and the GPU represents a good alternative if it can work under Debian.

Thanks.
 
Old 01-11-2010, 10:47 PM   #2
neonsignal
Senior Member
 
Registered: Jan 2005
Location: Melbourne, Australia
Distribution: Debian Bookworm (Fluxbox WM)
Posts: 1,391
Blog Entries: 54

Rep: Reputation: 360
Currently the Nvidia GPUs are not directly supported by any operating system (in the sense of using them as general purpose processing units), and it would be difficult to do so.

One uses the CUDA development toolkit to compile code that targets the graphics processor array. It is a cheap way of getting a lot of processing power (for certain classes of application), but applications have to be explicitly developed to use it.

Last edited by neonsignal; 01-12-2010 at 08:08 AM.
 
Old 01-12-2010, 07:39 AM   #3
nbn792
Member
 
Registered: Dec 2003
Distribution: Debian
Posts: 109

Original Poster
Rep: Reputation: 15
Thanks for the reply. I was aware of the CUDA C extensions, and this is acceptable to me, as my programs are simple and easily portable at the moment... that is why it is important for me to get my hardware situated before I commit to a language and start developing more advanced programs. I see CUDA has libraries for both FFT and BLAS (CUFFT and CUBLAS), which is awesome. My main concern is: if I buy a CUDA-capable GeForce card and slap it in my Debian quad-core box, how involved is the setup? I have seen only one instance of an attempt at this on Debian in my googling, and it seemed non-trivial.

I guess what I am asking is: are there any HPC Debian users successfully using a GPU for their calculations? How is it working, and would you do it again? Does the performance exceed what the same money put towards mobo+CPU nodes would give for extremely parallel computations? An example of one of my problems is calculating the norm between all possible pairs of points in a LONG list. Simple enough, but it takes FOREVER.

Also, if I get an Nvidia GeForce card with CUDA support, can I use it for graphics as well, or will it need to be dedicated to computation?

Sorry for all the questions, but I have limited funding and don't know anyone in the HPC field.

thanks again.
 
Old 01-12-2010, 08:46 AM   #4
neonsignal
Senior Member
 
Registered: Jan 2005
Location: Melbourne, Australia
Distribution: Debian Bookworm (Fluxbox WM)
Posts: 1,391
Blog Entries: 54

Rep: Reputation: 360
There is a howto for building CUDA under Debian Lenny.

The use of general-purpose GPUs is still fairly new, so it is going to involve significantly more work fitting your application to them (making the code non-portable). The benefit will depend very much on the application; the shader pipelines mean that you can have significant parallelism, but only where the data sets can be kept small and processed independently. For example, the Stanford 'Folding at Home' project is achieving around 70 times the performance from a typical graphics card compared to a typical CPU. But they have already been through two iterations of software to achieve that performance.

If your constraint is money rather than time, you could consider getting an older GeForce 8 series card to evaluate the use of CUDA before committing to one of the expensive cards.
 
Old 01-12-2010, 09:39 AM   #5
nbn792
Member
 
Registered: Dec 2003
Distribution: Debian
Posts: 109

Original Poster
Rep: Reputation: 15
Yes, I agree. I am currently looking at a few sub-$100 cards that *could* represent a significant improvement in parallelism (if they work), to test the water. Maybe after my university requirements are over I will have the time to utilize one of Nvidia's Tesla devices.

I am concerned that the current CUDA systems do not seem to be able to run disjoint kernels. This will be a significant coding consideration for me. My current setup typically forks sub-kernels strapped with my "tools" to available CPUs on my network, sends an object, and combines the resulting objects. This has been the simplest way to utilize all available processing power for homomorphic problem sets. However, if my objects are of varying complexity, the card would often spend much of its time waiting, and I would have to construct a loop to batch jobs of similar estimated time together: wait, batch, wait, batch. Hmmmm.

Secondary projects of mine involve mapping the same function over many initial conditions of the same complexity, without any recombination, and these seem easily portable to a GPU environment.

Thanks much for the wonderful responses, neon. May I ask how employment in the HPC community is? I am a mathematician trying to decide which fields to enter...
 
Old 01-12-2010, 12:52 PM   #6
nbn792
Member
 
Registered: Dec 2003
Distribution: Debian
Posts: 109

Original Poster
Rep: Reputation: 15
Just bought a GeForce GT 220... installing...
 
Old 01-12-2010, 01:35 PM   #7
Quakeboy02
Senior Member
 
Registered: Nov 2006
Distribution: Debian Linux 11 (Bullseye)
Posts: 3,407

Rep: Reputation: 141
Have you considered looking into the BOINC system? I believe that multiple BOINC projects are able to run at the same time, but I could be wrong.
 
Old 01-13-2010, 08:59 AM   #8
nbn792
Member
 
Registered: Dec 2003
Distribution: Debian
Posts: 109

Original Poster
Rep: Reputation: 15
I have seen projects like SETI@home and Folding@home. I don't think that is the direction I am heading.

Well, the install is not going that well. I am getting the dreaded X freeze in combination with the Nvidia proprietary drivers on a previously rock-solid Debian Lenny box (Intel Q6600, Foxconn G33M03 mobo). It freezes any time I try to play a video. OpenGL works fast as lightning, though! Still working on it... apparently this is common, so I just have to research/woodshed.

I will report back any solutions, to help document this. I hear this is a problem with other distros too, so hopefully I can fix it on Debian!

Thanks
 
Old 01-14-2010, 07:59 AM   #9
nbn792
Member
 
Registered: Dec 2003
Distribution: Debian
Posts: 109

Original Poster
Rep: Reputation: 15
Some notes for others that are interested in this sort of thing on debian.

1. It seems that the regular Nvidia graphics driver for my hardware (GeForce GT 220) supports CUDA and is a newer version than the driver on the CUDA page. The install went well with no errors, the system detects the presence of the CUDA device, and I can currently run some of the demo code.

2. **PROBLEM** Random freezes, and a consistent freeze when playing video. These freezes leave the mouse pointer active, but the computer does not recognize any keyboard or mouse-button input; even Ctrl-Alt-_______ sequences do not work. The computer must be hard powered off and rebooted.

I tried every software fix possible under Debian, and later under Windows: tried a newer BIOS, updated all hardware drivers, rolled back drivers, etc. So I dual-booted into Vista on the same PC, and guess what? After I installed the Nvidia drivers, the screen went blank and my fans quieted: frozen again, now under Windows. So I opened up the case, cleaned it up a bit, and reseated everything. Same problem. Searching, I found someone at Nvidia gave a hint to a disgruntled customer: remove memory down to one stick, install the card, then replace the memory. Well, this worked with one stick of RAM, but did not work once I replaced all the RAM... so the best solution I have found so far:

**SOLUTION** I had four sticks of memory occupying four slots: 2x 1 GB and 2x 512 MB. I removed one 512 MB stick. No problems since. I do not understand what caused this and am disappointed about the situation, but for those experiencing lockups with Nvidia hardware, try fiddling with your memory. My memory is stock and not overclocked, so I would expect the Nvidia card to work out of the box... this also fixed the problem under Windows, so it is definitely a hardware issue and not a reflection on Debian.

So far as I can tell, even though Debian doesn't fully support the CUDA products, I had no software issues, and can recommend this setup to others. I was very impressed by the computational prowess of some of the example codes, particularly the ocean surface simulation.

Thanks for the responses!
 
  

