How do I solve this issue?
Will a rescue disk solve it?
Do I need to reinstall the OS from scratch?
Just for your info, I had configured RAID 0+1 from the BIOS.
I also installed LVM, so I guess incorrect LVM partitioning may be the culprit.
Last edited by your_shadow03; 02-18-2009 at 11:31 PM.
I went into rescue mode.
I chrooted into /mnt/sysconfig.
Now I am at a root (#) shell over SSH.
I checked /boot/grub/grub.conf and found an entry saying:
Sorry, I forgot to tell you something: I already had two SAN switches, with the devices /dev/sda, /dev/sdb, and /dev/sdc. I installed a new OS on the local partition /dev/ccissd0p0.
During installation I chose the /dev/ccissd0p0 partition and formatted it, leaving the rest connected to the MSA.
What I think is that the #boot /dev/sda entry may be clashing with the MSA's /dev/sda.
Correct me if I am wrong.
I need your help, and I am sure you are going to help me in this regard.
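Here is roughly what I mean, sketched against a stand-in copy of the file, since I can only paste fragments here (the contents below are an assumption for illustration; the real file on the machine is /boot/grub/grub.conf):

```shell
# Stand-in copy of grub.conf written to a temp file; the contents
# are assumed for illustration, not the real file from my machine.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
#boot=/dev/sda
default=0
timeout=5
title Red Hat Enterprise Linux AS (2.6.9-67.0.22.EL)
	kernel /boot/vmlinuz-2.6.9-67.0.22.EL ro root=LABEL=/ idle=poll
EOF
# Pull out the boot device that the installer recorded.
grep '^#boot=' "$tmp"
rm -f "$tmp"
```

On the real machine, `grep '^#boot=' /boot/grub/grub.conf` would show which disk the installer recorded at install time; if it says /dev/sda while the OS actually lives on the local /dev/ccissd0p0, that is the kind of name collision with the MSA LUNs I am worried about.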
I looked at another machine that is running without any issues and is also connected to the MSA.
The grub.conf entries on the running machine include:
title Red Hat Enterprise Linux AS (2.6.9-67.0.22.EL)
kernel /boot/vmlinuz-2.6.9-67.0.22.EL ro root=LABEL=/ idle=poll
title Red Hat Enterprise Linux AS (2.6.9-67.0.22.ELsmp)
kernel /boot/vmlinuz-2.6.9-67.0.22.ELsmp ro root=LABEL=/ idle=poll
title Red Hat Enterprise Linux AS (2.6.9-67.ELsmp)
kernel /boot/vmlinuz-2.6.9-67.ELsmp ro root=LABEL=/ idle=poll
title Red Hat Enterprise Linux AS-up (2.6.9-67.EL)
kernel /boot/vmlinuz-2.6.9-67.EL ro root=LABEL=/ idle=poll
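To spot what differs between the two machines, I diffed stand-in copies of the two grub.conf files (the fragments below, including the extra flag, are assumptions for illustration; the real files live at /boot/grub/grub.conf on each host):

```shell
# Stand-in grub.conf fragments: one from the working machine, one
# from mine. Contents are assumed; the real files differ elsewhere.
working=$(mktemp)
mine=$(mktemp)
cat > "$working" <<'EOF'
title Red Hat Enterprise Linux AS (2.6.9-67.0.22.EL)
	kernel /boot/vmlinuz-2.6.9-67.0.22.EL ro root=LABEL=/ idle=poll
EOF
cat > "$mine" <<'EOF'
title Red Hat Enterprise Linux AS (2.6.9-67.0.22.EL)
	kernel /boot/vmlinuz-2.6.9-67.0.22.EL ro root=LABEL=/ idle=poll rhgb
EOF
# diff exits non-zero when the files differ, so mask the exit status.
diff "$working" "$mine" || true
rm -f "$working" "$mine"
```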
My machine's entries were the same, except that on my machine it says:
Why is a # being put in front on machines connected to the MSA?
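From what I understand, GRUB treats lines starting with # in grub.conf as comments and skips them, so commenting an entry out hides it from the boot menu. A quick sketch of that filtering, using a stand-in file (the commented "Old entry" below is an assumption for illustration):

```shell
# Stand-in grub.conf where one boot entry's lines are commented out;
# contents are assumed for illustration.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
#boot=/dev/sda
default=0
title Red Hat Enterprise Linux AS (2.6.9-67.0.22.EL)
	kernel /boot/vmlinuz-2.6.9-67.0.22.EL ro root=LABEL=/ idle=poll
#title Old entry
#	kernel /boot/vmlinuz-old ro root=LABEL=/
EOF
# Lines beginning with # are comments, so this is roughly what GRUB
# actually acts on:
grep -v '^#' "$tmp"
rm -f "$tmp"
```

So the # in front of #boot=/dev/sda only records the install-time boot device and does not affect booting; what I do not understand is why whole entries would be commented out on the MSA-connected machines, which is what I am asking about.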
Last edited by your_shadow03; 02-19-2009 at 04:46 AM.