openSUSE 13.2 on ProLiant DL360 with B140i SW RAID

Dear Sir/Madam,

I am trying to set up openSUSE 13.2 on a ProLiant DL360 Gen9 server with a B140i Smart Array controller.

In order for the array controller to work in RAID mode, the server has to be in UEFI boot mode.

I can set up the array on the controller fine, and it says I have one logical disk.

However, in the openSUSE setup GUI I can only see my two disks as separate SATA disks; it does not list my logical drive.

Any help with this would be very appreciated.

Using Linux RAID is not an option, as the /boot partition cannot be on a Linux RAID partition.

Best Regards
Patric

Patric

The B140i is only supported on SLES, as Hewlett-Packard made this driver proprietary, so you’re out of luck.

In theory you could compile it on your own and integrate that driver into your openSUSE install, but that requires a fair bit of knowledge and making your own driver disk. It would simply be easier to use SLES if you want to use a SUSE-based product.
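Roughly, the build-and-pack steps would look like the sketch below. This is only an outline, assuming HP’s package is ordinary out-of-tree kernel module source and that you use mkdud (the openSUSE driver update disk tool) to pack it; the source directory and module file names here are placeholders, not the actual HP names.

Code:
  # Build the module out of tree against the kernel headers
  # (the kernel version must match the installer's kernel, not just whatever is running)
  cd hp-b140i-driver-src/                  # placeholder directory name
  make -C /lib/modules/$(uname -r)/build M=$(pwd) modules

  # Pack the resulting .ko into a driver update disk for the 13.2 installer
  # (the module file name is a placeholder; check what the build actually produces)
  mkdud --create b140i.dud --dist 13.2 ./hpdsa.ko

  # Then hand it to the installer at the boot prompt with
  #   dud=<URL or path to b140i.dud>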

It looks like this RAID hardware requires drivers. I don’t know if the SUSE driver would work with openSUSE.

Why? It can. Booting with two independent disks should be fairly simple. Booting in fake-RAID mode may be possible, but I’m afraid it won’t be supported by the installer.
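For what it’s worth, /boot on MD RAID1 does work if the RAID metadata sits at the end of the partition, so the bootloader can read each member like a plain filesystem. A minimal sketch, assuming /dev/sda1 and /dev/sdb1 are the partitions intended for /boot (YaST’s partitioner can do the same; these are just the equivalent commands):

Code:
  # RAID1 with 1.0 metadata (superblock at the end of the partition)
  mdadm --create /dev/md0 --level=1 --raid-devices=2 \
        --metadata=1.0 /dev/sda1 /dev/sdb1
  # Filesystem on the mirror; mount it as /boot during installation
  mkfs.ext4 /dev/md0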

Unless you have good reasons to use fake RAID I’d simply go for two independent disks.

Not an option for my customer; they need full redundancy of all files across both disks.

But thank you for the reply!

BR
Patric

Many thanks for your reply.

I tried all the SLES driver disks, but like you say, this did not work.

Might get some help from a knowledgeable guy at my customer’s site to try to create a driver disk from the driver source we found on the net.

I guess HP wants you to buy SLES :frowning: . Any help with details of how I could create a driver disk would be very appreciated.

Many thanks!

BR
Patric

Will this be a system that is connected to the internet serving something, such as a web server?

If not, you could download SLES12 and use that, as it will not “stop working” after the evaluation period; it only stops receiving updates after the initial 60 days.

What’s wrong with Linux MD?

If I were to make a guess: automatic degradation of the RAID array and booting from the second disk.

I haven’t used Linux SW RAID in years. Does it set up GRUB for both drives automatically now, and does it manage to boot from the other drive if one is degraded and the system is rebooted?

YaST detects if the installation is on RAID1 and offers to install the bootloader on both disks. It needs to be explicitly enabled; it is one check box. The BIOS will usually try each disk in order, so it should fall back to the next available one. A redundant EFI setup is not automated currently.
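For EFI you can add the redundancy by hand after installation. A rough sketch, assuming the second disk carries its own ESP on /dev/sdb1 and you simply keep a copy of the EFI files there (the device names and the entry label are examples, not anything YaST creates for you):

Code:
  # Copy the EFI system partition contents to the ESP on the second disk
  mkdir -p /mnt/esp2
  mount /dev/sdb1 /mnt/esp2
  cp -a /boot/efi/EFI /mnt/esp2/
  umount /mnt/esp2

  # Register a fallback boot entry pointing at that copy
  efibootmgr --create --disk /dev/sdb --part 1 \
             --label "opensuse-fallback" --loader '\EFI\opensuse\grubx64.efi'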