Linux - Hardware
This forum is for Hardware issues. Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?
I have a PC with 2 x 80GB and 2 x 320GB SATA2 drives. I need to configure them as two RAID1 arrays via the onboard Adaptec controller, preferably the 2 x 80GB drives for the system software and the 2 x 320GB drives for the shared network/user folders.
I configured the RAID through the Adaptec BIOS utility, which then showed 2 logical drives. When I let SuSE 10.2 decide what to do with the partition setup, it only uses the first 320GB drive. If I try to manually create the partitions it does not work: "No OS found".
Can anyone please advise me on how to configure the RAID partitions during setup?
To my understanding the current Linux kernel does not support motherboard RAID setups.
This is a major shortcoming as far as I am concerned.
Although motherboard RAID is most often driver-based software RAID, it is core hardware functionality used by lots of people.
It also provides more reliability than what is achieved through Linux OS-based software RAID. If the primary drive fails in a software RAID setup, your machine will not boot or run. If it fails in a driver-based software RAID setup, you can continue to boot and run. IMHO that is the whole point of RAID: to keep the machine running if a hard drive fails.
Does anyone know if any of the Linux flavours support driver-based motherboard RAID?
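For what it's worth, most current distributions can see these motherboard ("fakeraid") sets through the dmraid tool, which reads the controller's on-disk metadata and assembles the array via device-mapper. A rough command sketch of checking whether your set is recognised (this requires root and the actual hardware; nothing here is from the original post):

```shell
# List RAID sets that dmraid recognises from the BIOS/on-disk metadata
# (supports several vendor formats: NVIDIA, Promise, Highpoint, etc.)
dmraid -r

# Activate all detected sets; each array then appears as a single
# block device under /dev/mapper/ instead of the bare physical drives
dmraid -ay

# Check what the assembled array was named (name is vendor-specific)
ls /dev/mapper/
```

If `dmraid -r` reports no supported metadata for your controller, the usual fallback is to disable the BIOS RAID and use the kernel's own md software RAID instead.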
Distribution: Slackware / Debian / *Ubuntu / Opensuse / Solaris uname: Brian Cooney
Quote:
If the primary drive fails in a software raid setup your machine will not boot or run
Not true. The trick is that you need to have LILO set up on BOTH drives, so if the primary fails, LILO on the secondary will boot the secondary drive.
Linux kernel RAID works quite well if you take the time to set it up correctly. If you don't want to spend time reading the online documentation, however, and want it to "just work", your mileage might not be quite as good, but it's not really that hard to set up.
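As a sketch of what that setup looks like with kernel RAID (device names and the array layout are illustrative assumptions, not taken from this thread): create the mirror with mdadm, then tell LILO to write its boot record to every member disk so either drive can boot the machine.

```shell
# Build a RAID1 mirror from two partitions (hypothetical device names)
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1

# Watch the initial resync finish
cat /proc/mdstat

# In /etc/lilo.conf, point LILO at the array and have it install a
# boot record on the MBR of every disk in the mirror:
#   boot=/dev/md0
#   raid-extra-boot=mbr-only

# Re-run lilo to write the boot records to both drives
lilo
```

With `raid-extra-boot=mbr-only`, LILO handles the "install on BOTH drives" step itself; on older LILO versions you can instead run lilo twice with a different `boot=` line each time.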
Interesting, I didn't think you could set up LILO on both drives. Is this done at installation time or later? Can you point me to a link that walks through an installation like this?
As a side note, kernel 2.4 supported some motherboard RAID setups; kernel 2.6 does not. Version 10.2 of SuSE has the 2.6 kernel.
Does anyone know why 2.6 dropped support for motherboard RAID?
I have an NVIDIA RAID controller on my motherboard. The RAID1 drivers load in Fedora Core 7 automatically. I also have a Highpoint RocketRAID 1640 controller and use RAID5 with it on FreeBSD 6.2. It is supported on SuSE 10.1, and they also have an open source driver you can compile, though I haven't tried it. SuSE is seeing both of your drives separately because you are missing a driver that presents the array transparently as a single device.
You have to make sure the vendor supports your controller; Highpoint sells controllers with SuSE drivers, although Red Hat seems to be better supported. You can also go to 3ware and purchase a card for about $300 that provides full RAID transparency to the OS and doesn't need special drivers. It supports RAID 0, 1, 5 and others.
FreeBSD 6.2 will support RAID5 very well with the RocketRAID 1640 card (about $90). FreeBSD in my experience is very reliable, fast, and easy to support. The downside is getting Flash and VMware to work.
There is a big difference between pure software RAID and controllers that still require drivers. Using Windows 2003 x64 and RAID5 with my NVIDIA onboard controller, when I transfer large amounts of data the CPU isn't using more than 1%. Try doing the same with pure software RAID in the OS. I can also pull out one of the 3 drives and the system keeps functioning normally.
If Fedora Core does see the RAID array correctly, it will present both physical drives plus a third device named something like hptxxxxx, if I remember correctly. You need to set up the partitions on that third device, not on the two physical drives.
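A rough sketch of that last step, assuming the set was activated by dmraid; the device name below is made up for illustration, since the post only gives the truncated "hptxxxxx":

```shell
# The assembled array appears as one device under /dev/mapper
# (the exact name depends on the controller and metadata format)
ls /dev/mapper/

# Partition the array device, NOT the underlying physical drives
fdisk /dev/mapper/hpt37x_exampleset

# If the new partition nodes don't appear automatically, map them
# with kpartx; they show up as numbered devices under /dev/mapper
kpartx -a /dev/mapper/hpt37x_exampleset
```

Writing a partition table to one of the raw physical drives instead would bypass the mirror, which is likely why the installer ends up with an unbootable "No OS found" layout.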