Software RAID setup OK - reboot fails on disk failure test
Hi,
I have set up a software RAID-1 with two IDE disks successfully. Everything seemed to work well until I tried to simulate a disk failure: I powered the box off and unplugged one hard disk. After that, the machine would no longer boot (the bootloader never even appeared). Booting from a floppy didn't work either. Now that I have plugged both drives back in, everything works again. Why did this failure test fail?
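In case it helps anyone diagnosing this: one common cause of this symptom (I'm not certain it applies to my setup) is that the bootloader was only installed in the MBR of the first disk, so pulling a drive leaves the BIOS with nothing to boot from. A sketch of what I believe the fix and a safer failure test would look like, assuming GRUB legacy and my device names (/dev/hde and /dev/hdg, with /boot on hde1/hdg1); the exact (hdN) numbers depend on the BIOS drive order and are a guess here:

```shell
# Install GRUB into the MBR of BOTH disks so either one can boot alone.
# The (hdN) mapping below is an assumption -- check /boot/grub/device.map.
grub --batch <<EOF
device (hd0) /dev/hde
root (hd0,0)
setup (hd0)
device (hd0) /dev/hdg
root (hd0,0)
setup (hd0)
quit
EOF

# Simulate a disk failure in software instead of pulling the cable,
# so the array degrades cleanly and can be watched in /proc/mdstat.
mdadm --manage /dev/md0 --fail /dev/hdg1
cat /proc/mdstat              # should now show [2/1] [U_] for md0

# Re-add the "failed" partition and let it resync afterwards.
mdadm --manage /dev/md0 --remove /dev/hdg1
mdadm --manage /dev/md0 --add /dev/hdg1
```

(On older raidtools-only systems the equivalent of the --fail step would be raidsetfaulty /dev/md0 /dev/hdg1, and with LILO one would point boot= at each disk in turn and rerun lilo.)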
Any help is appreciated.
Below is some info about my setup:
----- df -----------------
Filesystem 1K-blocks Used Available Use% Mounted on
/dev/md2 7052308 2554648 4139420 39% /
/dev/md0 100890 14198 81483 15% /boot
/dev/md3 68754336 325860 64935952 1% /home
none 256916 0 256916 0% /dev/shm
/dev/cdrom1 551580 551580 0 100% /mnt/cdrom1
----- cat /proc/mdstat -----------------
Personalities : [raid1]
read_ahead 1024 sectors
md3 : active raid1 hde5[0] hdg5[1]
69850496 blocks [2/2] [UU]
md1 : active raid1 hde3[0] hdg3[1]
1052160 blocks [2/2] [UU]
md2 : active raid1 hde2[0] hdg2[1]
7164800 blocks [2/2] [UU]
md0 : active raid1 hde1[0] hdg1[1]
104192 blocks [2/2] [UU]
unused devices: <none>