Help with degraded RAID1
First, my setup:
-- CentOS 6.4 with latest updates
-- Two 3GB SATA drives, sda and sdb
-- Created 3 RAID1 partitions:
---- md0: /root, with GRUB set up on both drives per http://wiki.centos.org/HowTos/SoftwareRAIDonCentOS5
---- md1: /, swap, and /tmp partitions using LVM
---- md2: encrypted LVM volume using LUKS

After the setup was done, I tested the RAID by pulling one drive and booting with the other; it worked. I then followed the steps in http://forums.fedoraforum.org/showpo...10&postcount=6 to prevent the OS from asking for the passphrase during boot, and wrote two scripts to open and close the LUKS volume; the scripts work too.

Now the problem: yesterday I proceeded to set up Samba on the server and shared the encrypted mount, which also worked. I also ran hdparm, vgdisplay, etc. to collect some information on the final system, and I think I forgot to run the close-LUKS script (which unmounts and runs cryptsetup luksClose) before powering off. Today when I started the machine, all hell broke loose: the RAID1 will only start in degraded mode, and it randomly picks one of the RAID partitions from sda or sdb. Here's an example result of "cat /proc/mdstat":
Code:
Personalities : [raid1]

1. What is the reason for this issue? Is it caused by not unmounting and not running cryptsetup luksClose on the encrypted LVM volume?
2. How do I fix it?

Thanks |
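For reference, a "close LUKS" script along the lines the poster describes (unmount, deactivate LVM, then cryptsetup luksClose) might look like the sketch below. The mount point, volume group, and LUKS mapping names are placeholders, not taken from the post; substitute your own.

```shell
#!/bin/bash
# Hypothetical sketch of a close-LUKS script: unmount the encrypted
# filesystem, deactivate its LVM volume group, then close the LUKS mapping.
# MOUNT_POINT, VG_NAME, and LUKS_NAME are assumed example names.

MOUNT_POINT=/mnt/secure     # where the encrypted volume is mounted
VG_NAME=vg_secure           # LVM volume group on top of the LUKS device
LUKS_NAME=luks-md2          # name given at `cryptsetup luksOpen` time

close_luks() {
    # Unmount the filesystem first so nothing holds the device open.
    umount "$MOUNT_POINT"
    # Deactivate the volume group sitting on the LUKS mapping.
    vgchange -an "$VG_NAME"
    # Finally, close the LUKS mapping itself.
    cryptsetup luksClose "$LUKS_NAME"
}

# On the real system (as root), run: close_luks
```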
This does not seem to be an encryption issue. Does the problem occur only with RAID1, or with anything else as well?
Thanks |
Code:
mdadm --add /dev/md0 /dev/sdb1 |
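To elaborate on the one-liner above: first confirm which half of each mirror is missing (cat /proc/mdstat, mdadm --detail), then re-add the dropped partitions one array at a time and let md resync. A sketch, assuming the sdb partitions are the ones that dropped out; the sdb1/sdb2/sdb3 → md0/md1/md2 mapping is an assumption based on the setup described, so verify it against your own mdstat output first.

```shell
#!/bin/bash
# Sketch: re-add the missing half of a degraded RAID1 array.
# Device names below are assumptions; check `cat /proc/mdstat` and
# `mdadm --detail /dev/mdX` before running anything.

readd_mirror() {
    local array=$1 partition=$2
    # Show the current state of the array before touching it.
    mdadm --detail "$array"
    # Re-add the dropped partition; the kernel then starts a resync.
    mdadm --add "$array" "$partition"
}

# On the real system (as root), after verifying which devices are missing:
# readd_mirror /dev/md0 /dev/sdb1
# readd_mirror /dev/md1 /dev/sdb2
# readd_mirror /dev/md2 /dev/sdb3
# watch cat /proc/mdstat   # monitor the rebuild progress
```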