LinuxQuestions.org
Old 09-21-2013, 06:09 AM   #1
zatak
LQ Newbie
 
Registered: Sep 2013
Posts: 5

Rep: Reputation: Disabled
laptop, SLI, xorg.conf, intel & nvidia graphics chips


Hey - I'm not exactly a newbie but this may well be regarded as a pretty basic question.

I'm thinking about buying an XMG laptop. It has 2 NVIDIA GPUs. They are connected with SLI.

I want to configure xorg so that SLI is IGNORED. That is, I don't want to use it.

I want to configure it so that the x server only uses one of the GPUs. The other one I want to leave as a device for doing work with CUDA.

So one GPU is for graphics, the other for computation. So the computations don't disrupt/corrupt the display.

This configuration is normally trivial. I do it all the time with workstations I build. But I'm wondering if the fact that the 2 GPUs are physically joined with SLI will cause me problems?

On a related note: the motherboard probably has an integrated intel chip. On Windows or Mac OS the OS will dynamically switch to the intel graphics when there is no app that needs OpenGL.

But typically with an X server, you configure the device for the display and that is that.

Is there a way to configure an X server to switch from the NVIDIA driver and device to the intel one?

That way I don't even need a laptop with 2 GPUs. I can just switch to intel graphics when I need to run CUDA code.

The code I run is very specific. I'd prefer the intel/1 NVIDIA GPU setup to the dual GPU setup if it can be done. Maybe there is a way these days.
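For what it's worth, the usual way to pin X to one GPU is to list only that card in xorg.conf, identified by its PCI BusID; X then ignores the other GPU entirely, and SLI never comes into play unless you explicitly add an SLI option. A minimal sketch with a hypothetical BusID (check yours with lspci):

```
# Minimal single-GPU xorg.conf sketch. The BusID below is hypothetical;
# find the real one with:  lspci | grep -i vga
Section "Device"
    Identifier "NvidiaDisplay"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"   # only this card is used by X; no "SLI" option set
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "NvidiaDisplay"
EndSection
```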
 
Old 09-21-2013, 08:16 AM   #2
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 934
Why not find a laptop with only 1 nVidia GPU over what will be a more expensive 'gamers' laptop? Or a desktop for CUDA and a laptop without an nVidia GPU for real mobile use. You'd end up with a 'better' laptop for portable use and a more powerful desktop for about the same amount of money.
 
Old 09-21-2013, 08:52 AM   #3
zatak
LQ Newbie
 
Registered: Sep 2013
Posts: 5

Original Poster
Rep: Reputation: Disabled
One GPU is absolutely an option.

But I'm confused by your response. It doesn't really answer either of my questions. Maybe you didn't read to the end. I'll be more succinct:

1) will SLI affect things if I try to use one GPU for X and the other for computation?
2) if I get a laptop with 1 GPU and an intel onboard graphics chip, can you configure X to dynamically switch between them?

I think the answer to 1) is "no, SLI does nothing unless activated in software"
I don't know the answer to 2)

My hardware requirements are different to a gamer's, but in many ways higher. I have high RAM, storage and throughput requirements for the proposed device. It needs to be portable but it won't be frequently moved. I have a different laptop for casual use.
 
Old 09-24-2013, 07:34 AM   #4
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 934
Oh, I read it to the end, I just don't get why you would want the extra complexity and expense of SLI when I can't see any need for it.

AFAIK if you try to use CUDA on an SLI setup, only 1 GPU is used; the other just sits idle. Will it affect things? I don't know, I don't use SLI or CUDA myself.

You can dynamically switch between an nVidia GPU and Intel video with 'bumblebee'.

http://bumblebee-project.org/

You should probably check out how bumblebee works.
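For reference, basic bumblebee usage after install is just to wrap the programs that need the nVidia GPU with optirun. A sketch; the package names below are Debian-style and are an assumption, so check your distro's repos:

```shell
# Debian-style install (package names vary by distro; check your repos)
sudo apt-get install bumblebee bumblebee-nvidia
sudo usermod -aG bumblebee "$USER"     # log out and back in afterwards

# Everything runs on the intel chip by default;
# only wrapped programs use the nVidia GPU
optirun glxgears                        # run one OpenGL program on the discrete card
optirun glxinfo | grep "OpenGL vendor"  # confirm which driver is rendering
```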
 
Old 09-24-2013, 09:21 AM   #5
sniff
Member
 
Registered: Jan 2003
Location: Durham UK
Distribution: openSUSE/Debian
Posts: 328

Rep: Reputation: 42
Quote:
Originally Posted by zatak View Post
1) will SLI affect things if I try to use one GPU for X and the other for computation?
My experience of SLI on Linux is that your instinct is correct. In my desktop, even with the SLI cable in place, I have to enable SLI in the xorg.conf file for it to switch on. So the normal situation is that I have one GPU for graphics and the other for CUDA, and it works well. Then when I want to play a game I just boot into Windows, which has SLI enabled, and off I go...

Unless you are doing intensive CUDA work on your laptop, you can do a lot on one GPU without too much trouble. So if it's just for development etc. then perhaps you are better off with a single-GPU laptop.

I don't have experience of using on-board vs add-on GPU in a laptop so I can't help you there. It would be really cool if you could run graphics on the on-board kit and then run CUDA on the add-on GPU. I'm guessing that you can't do that.
 
Old 09-25-2013, 03:32 AM   #6
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 934
Quote:
Originally Posted by sniff View Post
It would be really cool if you could run graphics on the on-board kit and then run CUDA on the add-on GPU. I'm guessing that you can't do that.
Yes, you can, and in some ways it's actually easier than using the nVidia GPU for display.

Quote:
CUDA without Bumblebee

This is not well documented, but you do not need Bumblebee to use CUDA and it may work even on machines where optirun fails. For a guide on how to get it working with the Lenovo IdeaPad Y580 (which uses the GeForce 660M), see: https://wiki.archlinux.org/index.php...80#NVIDIA_Card. Those instructions are very likely to work with other machines.
https://wiki.archlinux.org/index.php...hout_Bumblebee

Quote:
NVIDIA Card

The Y580 uses NVIDIA's Optimus technology, which is not officially supported on Linux yet. A possible solution is to install Bumblebee (https://wiki.archlinux.org/index.php/Bumblebee) and to access the card with optirun. However, you can still use CUDA, which is good if you use apps like Blender or if you develop CUDA C programs.
https://wiki.archlinux.org/index.php...80#NVIDIA_Card
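The approach on that wiki page boils down to loading the nvidia kernel module yourself and creating the device nodes that X would normally create, so CUDA can see the card while X stays on the intel chip. A rough sketch, assuming the proprietary driver is already installed (195 is the nvidia driver's standard character-device major number):

```shell
# Load the nvidia kernel module by hand (X is not using this card)
sudo modprobe nvidia

# Create the device nodes CUDA needs if the driver hasn't made them
sudo mknod -m 666 /dev/nvidia0 c 195 0
sudo mknod -m 666 /dev/nvidiactl c 195 255
```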
 
Old 09-25-2013, 06:49 AM   #7
sniff
Member
 
Registered: Jan 2003
Location: Durham UK
Distribution: openSUSE/Debian
Posts: 328

Rep: Reputation: 42
Quote:
Originally Posted by cascade9 View Post
Yes, you can, and in some ways it's actually easier than using the nVidia GPU for display.
That would be a really neat solution I think.
 
Old 09-25-2013, 07:35 AM   #8
zatak
LQ Newbie
 
Registered: Sep 2013
Posts: 5

Original Poster
Rep: Reputation: Disabled
Well, here's the thing:

the CUDA code is code we've developed, yes. For quick stuff I'd probably use Matlab's CUDA support. So I might use CUDA a lot.

But at the same time I'd be running GPU-intensive visuals (not games, but very graphically demanding programs, for visualising the brain).

I don't think the intel chip can handle those. And I might well run some CUDA code that takes hours or days.

So I think I'm going with the dual NVIDIA option.

SLI does need to be turned on in software. I'm pretty sure the xorg.conf from one of my boxes that has a GPU for display and then computation GPUs will be fine. I don't see that it needs to be different. And I asked the XMG reseller and he seemed confident of that.

Still, I may want to use BOTH NVIDIA GPUs for CUDA at some point, in which case bumblebee is a good option. I don't think the machine I'm planning on has Optimus, but it'll have an intel graphics chip.
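One thing that helps with the display-GPU/compute-GPU split: the CUDA runtime honours the CUDA_VISIBLE_DEVICES environment variable, so a long-running job can be pinned to the non-display card. A sketch, assuming the display GPU enumerates as device 0 (an assumption; the binary name is hypothetical):

```shell
# Check the enumeration order first with the deviceQuery sample from the CUDA SDK,
# then hide the display GPU (device 0 here, an assumption) from the compute job:
CUDA_VISIBLE_DEVICES=1 ./my_cuda_app
```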

thanks,

 
Old 09-25-2013, 07:56 AM   #9
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 934
Quote:
Originally Posted by zatak View Post
But at the same time I'd be running GPU intensive visuals (not games, but very graphically demanding programs, for visualising the brain).

I don't think the intel chip can handle those. And I might well run some CUDA code that takes hours or days.

So I think I'm gong with the dual NVIDIA option.
I think you are either overstating the graphical requirements, or understating the capabilities of the intel video chip.

They aren't as fast as the nVidia GPUs, but if you don't need a framerate of XX for it to be smooth, the lower-powered intel video chip should still do the job.
 
Old 09-25-2013, 08:12 AM   #10
zatak
LQ Newbie
 
Registered: Sep 2013
Posts: 5

Original Poster
Rep: Reputation: Disabled
Well, the graphical requirements are actually very substantial. The framerate would drop to less than 1 (sometimes it does anyway if I accidentally render too much data). I'm actually pretty sure I'm making the right move. It's less the nature of the software and more the data it's being used for which is unusual. Otherwise I'd agree with you, yes.

I'd rather pay extra and guarantee smooth sailing, because it's for work, not play. The reason I'm getting this portable device is because it will be moving between labs in the US and UK now and then. It's a specific reason.
 
Old 09-25-2013, 09:50 AM   #11
sniff
Member
 
Registered: Jan 2003
Location: Durham UK
Distribution: openSUSE/Debian
Posts: 328

Rep: Reputation: 42
Quote:
Originally Posted by zatak View Post
Well, the graphical requirements are actually very substantial. The framerate would drop to less than 1 (sometimes it does anyway if I accidentally render too much data). I'm actually pretty sure I'm making the right move. It's less the nature of the software and more the data it's being used for which is unusual. Otherwise I'd agree with you, yes.

I'd rather pay extra and guarantee smooth sailing, because it's for work, not play. The reason I'm getting this portable device is because it will be moving between labs in the US and UK now and then. It's a specific reason.
Ah, simulations..? That is what I use CUDA for: agent-based simulations of cellular differentiation and vascular development. If you need it to actually process data then yeah, go for the two-GPU option. That laptop might also be better (in terms of cooling) at running under heavy load for long periods of time than your average laptop. I have cooked a laptop or two in my time.

For me, I don't use a laptop for actual crunching (I don't have issues with moving between labs), so I am quite interested in the potential of using on-board for the GUI and nVidia for CUDA for those odd times when I am developing/demoing on the go. Might look into that, as it's laptop upgrade time soon.
 
Old 09-25-2013, 10:10 AM   #12
zatak
LQ Newbie
 
Registered: Sep 2013
Posts: 5

Original Poster
Rep: Reputation: Disabled
Well, the CUDA use at the moment is primarily for very rapid non-linear registration of MRI images. But I also use it, e.g., for analysing big (10GB) microscope images in Matlab. That can take a bit of time.

Most of my actual data processing is done on the cluster; then I work on a machine with a gamer GPU and also a C2075 GPU. I also use a bunch of RAM and storage.

So I want a laptop that can take this load, if it has to. And I think this one actually can. 3 years ago they didn't exist.
 