LinuxQuestions.org
09-19-2018, 03:32 AM   #1
dr0pz
LQ Newbie

Registered: Sep 2018
Posts: 2
RAID 0 becomes inactive in RAID 1+0 when one drive is removed


Hi guys, I need help with a RAID 0 becoming inactive in a RAID 1+0 setup when one drive is removed.
We use CentOS 6.5 booting from a USB stick, and the RAID 1+0 array is mounted as /data, which holds VirtualBox images and files.
We formatted the drives with GParted and created the arrays with mdadm.
This is software RAID.
So the setup is:
/dev/sda and /dev/sdb = /dev/md1 (RAID 1)
/dev/sdc and /dev/sdd = /dev/md2 (RAID 1)
then we set them up as RAID 0:
/dev/md1 and /dev/md2 = /dev/md0 (RAID 0)
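
For reference, the arrays were put together with mdadm roughly along these lines (a sketch only; the exact chunk size, metadata version and filesystem options we used are not shown here):

Code:
# two RAID 1 mirrors (options here are illustrative, not the exact ones used)
mdadm --create /dev/md1 --level=1 --raid-devices=2 /dev/sda /dev/sdb
mdadm --create /dev/md2 --level=1 --raid-devices=2 /dev/sdc /dev/sdd
# stripe the two mirrors together as RAID 0
mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/md1 /dev/md2
# example filesystem and mount point for the VM images
mkfs.ext4 /dev/md0
mount /dev/md0 /data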

I remember testing this a few years back: when /dev/sda was removed, the RAID 0 stayed active and simply reported that a drive had failed.
The same was true for sdb, sdc and sdd.
But when we tested it again recently, we found that the RAID 0 becomes inactive when any one of the four drives is removed.
So this is basically a change in behavior. I tested this with the old kernel version, the one we first tested the RAID 1+0 on, and with the new version from yum update, and the result is the same.
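
For what it's worth, a drive failure can also be simulated in software instead of physically pulling a drive (standard mdadm manage-mode commands; just a sketch, not necessarily how we tested):

Code:
# mark one member of the first mirror as failed, then remove it
mdadm /dev/md1 --fail /dev/sda
mdadm /dev/md1 --remove /dev/sda
# watch how the arrays react
cat /proc/mdstat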

Does anyone know why this happens?

Thanks!
 
09-20-2018, 12:33 PM   #2
smallpond
Senior Member

Registered: Feb 2011
Location: Massachusetts, USA
Distribution: Fedora
Posts: 4,140
After failing sda please post the outputs of:

Code:
cat /proc/mdstat
mdadm --detail /dev/md1
mdadm --detail /dev/md0
Anyway, why do this instead of a single RAID 10 array?
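
For comparison, a single RAID 10 array over the same four disks would be created with something along these lines (device names taken from your layout; the options are just illustrative):

Code:
mdadm --create /dev/md0 --level=10 --raid-devices=4 \
      /dev/sda /dev/sdb /dev/sdc /dev/sdd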
 
09-23-2018, 08:19 PM   #3
dr0pz
LQ Newbie

Registered: Sep 2018
Posts: 2

Original Poster
Hi, that is what we are simulating. To simulate a failed drive we shut the server down, pull one drive out, then boot the system.
With all drives present, cat /proc/mdstat produces something like:
Code:
Personalities : [raid1] [raid0]
md125 : active raid0 md127[0] md126[1]
      1953262592 blocks super 1.2 512k chunks

md126 : active raid1 sdc[2] sdd[1]
      976631360 blocks super 1.2 [2/2] [UU]

md127 : active raid1 sdb[1] sda[2]
      976631360 blocks super 1.2 [2/2] [UU]

But when we take one drive out, it becomes:

Code:
Personalities : [raid1] [raid0]
md125 : inactive raid0
      1953262592 blocks super 1.2 512k chunks

md126 : active raid1 sdd[1]
      976631360 blocks super 1.2 [2/2] [U_]

md127 : active raid1 sdb[1] sda[2]
      976631360 blocks super 1.2 [2/2] [UU]

In theory the RAID 0 should still work, since the RAID 1 array only lost one drive and is therefore still functional.
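
When md125 shows up as inactive we can at least try to bring it back by hand, roughly like this (standard mdadm commands, just a sketch; we have not confirmed that this recovers it):

Code:
mdadm --detail /dev/md125      # inspect the inactive RAID 0
mdadm --stop /dev/md125        # stop it so it can be re-assembled
mdadm --assemble /dev/md125 /dev/md126 /dev/md127
# or try to force-start the array as it is:
# mdadm --run /dev/md125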

We are now trying a single RAID 10 array and found that it gives the same result.

Thanks!
 
09-23-2018, 08:55 PM   #4
syg00
LQ Veteran

Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 21,126
If you feel you have sufficient documentation of your efforts, raise a bug against CentOS. I'm not a CentOS user, but you might not get a sympathetic response given the release you are running.
 
  

