Old 03-21-2017, 08:55 AM   #1
Jethro_UK
LQ Newbie
 
Registered: Mar 2017
Posts: 1

MDADM RAID always seems to cause boot problems when partitions change?


I'm running Debian Wheezy 64-bit. It was running perfectly with a 2-disk RAID1 array. Booting up, rebooting, shutting down, booting up. No problem.

I decided I wanted to add some disks to create a second RAID array. Oh my...

For some reason, my setup REALLY seems to dislike partitions being changed. I think I know what the issue centres on, and I'm after some advice on two possible fixes...

When the boot fails, there's a message about a job running... it seems to be waiting for a disk to become available, and after 90 seconds it drops to emergency mode.

Running journalctl -xb in emergency mode, I saw entries stating that the job for dev-md0 had failed. That in turn left the RAID array md0 unavailable, along with a whole list of dependency failures.
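
journalctl -xb is what I actually ran; the systemctl checks below are the obvious follow-ups from the emergency shell (dev-md0.device is just systemd's unit name for /dev/md0 on my box):

Code:
# Replay this boot's journal with extra explanatory detail
journalctl -xb

# List the jobs systemd is still waiting on or has failed
systemctl list-jobs

# Inspect the device unit the failed job was tied to
# (systemd maps /dev/md0 to the unit name dev-md0.device)
systemctl status dev-md0.device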

The same failure then appeared for the RAID1 array.

At this point, the RAID0 array had been created with a missing drive, and the RAID1 array was fully assembled and clean.


HOWEVER, blkid shows the disks as being available.
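
blkid is what I actually ran; the /proc/mdstat and mdadm --detail checks below are the extra cross-checks I'd expect to use (array names assume my md0/md1):

Code:
# Filesystem / RAID-member signatures as udev sees them
blkid

# Kernel's view of which arrays are currently assembled
cat /proc/mdstat

# Per-array state and member disks
mdadm --detail /dev/md0
mdadm --detail /dev/md1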

I tried an update-initramfs, but the problem was still there.
I tried CONFIG_FHANDLE=y on boot. Still no joy.
I tried renaming mdadm.conf to mdadm.old. Still no joy.
Finally, I commented out the /dev/md0 and /dev/md1 lines in /etc/fstab and bingo! Perfect boot.
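
Roughly what I tried, as best I remember it (the /etc/mdadm/ path is the Debian default, and the exact flags are from memory):

Code:
# Rebuild the initramfs for the running kernel
update-initramfs -u

# Move the existing mdadm config out of the way
mv /etc/mdadm/mdadm.conf /etc/mdadm/mdadm.old

# Then comment out the two array mounts, i.e. put a '#' in front
# of the /dev/md0 and /dev/md1 lines in /etc/fstab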

I then used mdadm --assemble --scan, which immediately found and built the RAID arrays.
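
That assemble step, plus what I believe would make it persist across reboots (recording the arrays in the Debian-default config and rebuilding the initramfs) - though I haven't confirmed this is the whole fix:

Code:
# Assemble every array mdadm can find from the on-disk superblocks
mdadm --assemble --scan

# Presumably needed so early boot knows about the arrays too:
# append the ARRAY definitions to the Debian-default config file
mdadm --detail --scan >> /etc/mdadm/mdadm.conf

# and rebuild the initramfs for all installed kernels
update-initramfs -u -k all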

I then used Webmin to (re)create the mounts that were in /etc/fstab.
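
For question 2 below, one thing I'm considering is re-adding the mounts with nofail and a shorter device timeout, so a late or missing array can't drag the boot into emergency mode. The mount points and filesystem types here are placeholders, not my real entries:

Code:
# /etc/fstab (sketch - mount points and fs types are made up)
# nofail                  : don't fail the boot if the device isn't there
# x-systemd.device-timeout: don't wait the default 90s for it either
/dev/md0  /mnt/array0  ext4  defaults,nofail,x-systemd.device-timeout=10s  0  2
/dev/md1  /mnt/array1  ext4  defaults,nofail,x-systemd.device-timeout=10s  0  2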

So my questions stem from this thought:

Clearly there is something in my RAID config that refuses to accept that the disks making up the arrays are present. I believe this has NOTHING TO DO WITH THE RAID BEING ASSEMBLED WITH A MISSING DISK. The reason I say that is that the boot log showed errors about my RAID1 array, which had both disks present and clean.

1) What is it in the config, and how can I change it to stop this happening in the first place?

2) If I can't stop it, what can I do to tell the system to get past the error and boot anyway?

cheers guys
 
  

