I have a RAID 1 BusyBox-based NAS that is not booting. Disk one reports no SMART issues, but disk two reports many, which is why I assume the RAID failed.
I can boot to GRUB with both disks installed, or with either one plugged in by itself into the SATA port the BIOS is set to boot from.
There are two RAID arrays on the disks: one for the operating system (/dev/sda1) with an ext filesystem, and one for data (/dev/sda3) holding an LVM container.
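For context, this is the read-only recovery path I'm aiming for once a valid superblock is visible on the data array. The volume-group and logical-volume names ("vg0", "lv_data") are placeholders since I don't know the real ones yet, and the commands are only printed here rather than executed, so nothing touches the disks by accident:

```shell
# Print (not run) the planned read-only assembly/activation sequence.
# "vg0" and "lv_data" are placeholder names; the real ones would come
# from vgscan/lvs output on the assembled array.
plan() {
    echo "mdadm --assemble --readonly /dev/md1 /dev/sda3"
    echo "vgscan"
    echo "vgchange -ay vg0"
    echo "mount -o ro /dev/vg0/lv_data /mnt/recovery"
}
plan
```

Assembling with --readonly and mounting with -o ro should keep the original data untouched while I check whether the logical volume is intact.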
The system seems to fail to mount the folders it needs to boot properly. My main goal is simply recovering the data in the LVM container. I've made multiple clones, but every sector-to-sector clone reports no superblock for /dev/sda3; only the original shows the proper info. So while I am trying to be extremely cautious, it appears I cannot work with the clones.
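Since a true sector-to-sector clone should carry the superblock with it, the first thing I can think to check is whether the clone really is byte-identical and exactly the same size. Here is a minimal sketch of that check, demonstrated on scratch files (on the real hardware the two inputs would be the original partition and its clone):

```shell
# Scratch files standing in for the original partition and its clone
# (hypothetical names); on the real disks these would be block devices.
dd if=/dev/urandom of=orig.img bs=1K count=64 2>/dev/null
cp orig.img clone.img

# 1) Sizes must match exactly -- a clone even one sector short truncates
#    end-of-device metadata (0.90/1.0 md superblocks live near the end).
orig_size=$(wc -c < orig.img)
clone_size=$(wc -c < clone.img)
[ "$orig_size" -eq "$clone_size" ] && echo "sizes match"

# 2) Byte-for-byte comparison; slow on whole partitions but definitive.
cmp -s orig.img clone.img && echo "contents identical"
rm -f orig.img clone.img
```

If the sizes differ, that alone would explain a missing superblock, because some metadata versions are stored at the tail of the device.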
Options to edit GRUB:
Output of mdadm --examine on the FAILED drive, taken when it was the only drive installed. Oddly, it does not report anything faulty or missing.
Output of mdadm --examine on the WORKING drive, also taken when it was the only drive installed. It shows /dev/sdb1 and /dev/sdb3 (not in the photo) as "faulty removed".
If I boot a clone, mdadm --examine /dev/sda1 works properly, but /dev/sda3 reports no md superblock found. That is odd, because the clones are sector-to-sector copies.
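One way I could double-check whether the superblock is truly gone from the clone, rather than just unrecognized, is to scan for the raw md magic bytes. The magic is 0xa92b4efc, which 1.x metadata stores little-endian on disk (bytes fc 4e 2b a9); v1.2 puts the superblock 4 KiB from the start of the partition. A sketch, demonstrated on a scratch file standing in for the cloned /dev/sda3:

```shell
# Scratch file standing in for the cloned partition (hypothetical name):
# plant the md superblock magic 0xa92b4efc (on-disk little-endian:
# fc 4e 2b a9) at the v1.2 offset of 4096 bytes, then scan for it.
IMG=clone-scan.img
dd if=/dev/zero of="$IMG" bs=1K count=64 2>/dev/null
printf '\xfc\x4e\x2b\xa9' | dd of="$IMG" bs=1 seek=4096 conv=notrunc 2>/dev/null

# Scan the first 8 KiB for the magic; on the real clone I'd also scan
# the tail of the partition, where 0.90 and 1.0 superblocks live.
if dd if="$IMG" bs=1K count=8 2>/dev/null | hexdump -C | grep -q 'fc 4e 2b a9'; then
    FOUND=yes; echo "md magic found"
else
    FOUND=no; echo "md magic NOT found"
fi
rm -f "$IMG"
```

If the magic shows up at the right offset on the clone but mdadm still refuses it, that would point at something else (device size, partition table) rather than missing metadata.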
Here is what I get when I let it boot normally. You can see it does not seem to mount anything, and it states that the container holding all the data needs to be cleaned.
Any assistance would be VERY appreciated. I promise to reply to any questions quickly if anyone is willing to help me resolve this. Thank you!