Troubles with software RAID and 11.0

Hello all,

This machine previously ran CentOS 5.1, but now that openSUSE 11.0 is out, I decided to try it.

The machine has four 250 GB SATA HDDs that I had set up as software RAID (which worked fine with CentOS). The RAID layout was:

The md0 array contained four Linux RAID primary partitions (the first n cylinders of sda through sdd), each 128 MB, set up as RAID1; I formatted it Ext3 and mounted /boot on it.

The md1 array also contained four Linux RAID primary partitions (the remaining cylinders of sda through sdd), each ~235 GB, set up as RAID5. I made that into an LVM volume group and created three logical volumes: one I left untouched, one of about 200 GB held the rest of the filesystem (mounted at “/”), and a small one of about 4 GB became swap.
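For clarity, here is roughly how that layout would be recreated from a shell (a sketch only: the partition numbering and the volume-group name “system” are assumptions, and the sizes are from memory):

    # RAID1 across the four small partitions, for /boot
    mdadm --create /dev/md0 --level=1 --raid-devices=4 /dev/sda1 /dev/sdb1 /dev/sdc1 /dev/sdd1
    # RAID5 across the four large partitions
    mdadm --create /dev/md1 --level=5 --raid-devices=4 /dev/sda2 /dev/sdb2 /dev/sdc2 /dev/sdd2
    # LVM on top of md1
    pvcreate /dev/md1
    vgcreate system /dev/md1
    lvcreate -L 200G -n root system
    lvcreate -L 4G -n swap system
    # filesystems and swap (the third, untouched volume needs no command)
    mkfs.ext3 /dev/md0
    mkfs.ext3 /dev/system/root
    mkswap /dev/system/swap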

This worked perfectly in CentOS 5.1; however, openSUSE 11.0 came out, and I couldn’t resist it.

I tried reinstalling over my existing partitioning (manually editing the existing partitions and formatting only the relevant bits: “/boot”, “/” and swap). It installed without problems, I logged on and so forth, but on the first reboot all I got was a blank screen saying “GRUB”.

After running the “rescue” option on the install DVD, it complained that there were no Linux partitions on my drives.
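For anyone who wants to check what the rescue system actually sees, this is roughly what I would run from its shell (just a sketch; the device names are assumptions):

    # what partitions does the kernel see on each disk?
    fdisk -l /dev/sda /dev/sdb /dev/sdc /dev/sdd
    # are any md arrays assembled?
    cat /proc/mdstat
    # try assembling them from the on-disk superblocks
    mdadm --assemble --scan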

OK, I figured the CentOS partitioning bothered it. Ah well, I’ll reinstall.

And so I did. When I got to partitioning I DELETED the partition tables from all disks (the Expert button in the partitioner) and all the LVM stuff. Once I had clean disks, I recreated the same RAID/LVM layout and installed SUSE.
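In hindsight, deleting the partition tables alone may not be enough: the mdadm superblocks live inside the member partitions (near the end, for 0.90 metadata), so recreating the same layout can leave the installer staring at stale RAID metadata. A sketch of how one might wipe everything properly (destructive, of course; device names are assumptions):

    # remove old md metadata from each former member partition
    mdadm --zero-superblock /dev/sda1 /dev/sda2
    # ...and likewise for sdb, sdc and sdd...
    # then zero the MBR and partition table of each disk
    dd if=/dev/zero of=/dev/sda bs=512 count=1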

Again the same scenario: the installation was a success, so was the first log-in, but after a reboot there was nothing.

OK, I figured I’d keep the boot partition off the RAID, and since I just wanted to test things at this point, I made a normal Ext3 partition at the beginning of sda and made that my boot partition. I partitioned the rest of the disks like sda, but didn’t use the first n sectors that hold /boot on sda; the rest was turned into a software RAID array, put into an LVM group and partitioned as above (sans /boot).
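In command form, that fallback layout would look roughly like this (same caveats and made-up names as the sketch above):

    # plain Ext3 /boot directly on sda1, no RAID underneath
    mkfs.ext3 /dev/sda1
    # RAID5 plus LVM across the remaining large partitions, as before
    mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sda2 /dev/sdb2 /dev/sdc2 /dev/sdd2
    pvcreate /dev/md0
    vgcreate system /dev/md0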

Again the same thing, except this time I managed to reboot a few times (yippee!) successfully, until one time it just couldn’t locate the LVM volume group during boot (dang!).
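When that happened, this is what I would try from a rescue shell to see whether the volume group is really gone or just not activated (a sketch; “system” is my made-up VG name):

    # assemble the RAID first; LVM can’t see PVs on an unassembled array
    mdadm --assemble --scan
    # then look for the volume group and activate it
    vgscan
    vgchange -ay system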

Finally, I said to myself: I’ll just install SUSE, no LVM, no RAID, and see whether it’s my disks or what

but

Now it just pops up an error in the phase where it checks the system (screen 2 of the installation) and drops out of graphics mode. To make things that extra bit of fun, it covers whatever text output it spat out (I catch something for less than a second) with an ncurses screen containing the very informative and helpful “There was an error during install (but we sure as hell won’t tell you what it was)” message in a nice red ncurses dialog box.
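For anyone hitting the same wall: as far as I know the installer keeps a log, so one way to dig out the hidden error would be to switch to a text console during the install (Ctrl+Alt+F2, assuming the installer exposes one) and read the YaST log:

    # on the installer’s shell console
    less /var/log/YaST2/y2log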

All repeated attempts resulted in the same thing. From what I can see, it goes FUBAR at some point whilst checking for existing Linux partitions.

“A-ha”, I think to myself, and enable the onboard fake RAID, which (once I add the disks into an array) should effectively wipe out the partition tables on all disks, or so I think, and try again.

But nope, no success. Same spot, same installation error.
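It occurs to me that the fake RAID may have left its own metadata on the disks, which could be exactly what confuses the installer’s partition check. If I remember the dmraid flags right, one could inspect and erase it from a rescue shell like this (destructive to the fakeRAID set; a sketch only):

    # list any BIOS/fakeRAID metadata dmraid can find
    dmraid -r
    # erase that metadata from the disks (careful!)
    dmraid -rE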

After I released the pressure, together with a ton of expletives, I decided to ask you, good folk of the openSUSE forums:

Where did I go wrong and what should I do now?

I’m afraid there is more than just one problem with your config, but since I ran into problems installing 11.0 on a RAID 1 system, I can tell you that the installation process left me with a non-functioning GRUB setup. Instead of writing the boot loader to the MBR, it wrote it to /dev/sdb1, which is totally insane IMO, since that partition is one of the two forming my md0.

So after installation I had to run YaST -> System -> Configure GRUB and make it write to the MBR rather than to any partition.
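If you would rather do it by hand, the same fix should be possible from a root shell with the GRUB (legacy) shell; a sketch, assuming /boot lives on the first partition of the first BIOS disk, so “root” points at that partition and “setup” writes stage1 to the MBR:

    grub
    grub> root (hd0,0)
    grub> setup (hd0)
    grub> quit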

Hope this solves one of your problems.
Regards
Alfred

Hello bmarkovic,

I do not believe there is anything wrong with your configuration. I have been trying to find a solution to this problem myself. I am using a Mecer PC with two 250 GB SATA hard drives. I had Fedora 9 installed in a RAID 1 configuration and it was working alright. I was watching the trend in the Linux market and noticed that openSUSE was number 2 on the preferred Linux list.

I arranged to get a copy of the installation DVD for version 11.0 and am now experiencing exactly the same problem you have.

I cannot believe no one has really responded to your cry for help. Did you ever find a solution to your problem, or did you just revert to CentOS 5.2?

I am considering reverting to either Fedora 9 or CentOS 5.2, since I do not have the time to manage problems with this distro.

It would be really great to hear from the other forum gurus.