Dear community,
I have been searching for days for how to correctly set up openSUSE Tumbleweed with 2 SSDs in a Btrfs RAID 1 on a UEFI system in VirtualBox.
I have read plenty of threads here in the forum, but they didn't help me.
Three other sources I stumbled over in this order were:
https://www.thomas-krenn.com/de/wiki/Ubuntu_Software_RAID_mit_redundanten_UEFI_Boot_Einträgen
https://seravo.fi/2016/perfect-btrfs-setup-for-a-server
https://www.complang.tuwien.ac.at/anton/btrfs-raid1.html
The interesting thing is that the third source already references the two former sources.
What I did:
[ol]
[li]Boot from the openSUSE Tumbleweed DVD ISO file.[/li]
[li]Create a GPT partition table on /dev/sda with the installer GUI:
[LIST=1]
[li]/dev/sda1, 500 MB, formatted with vfat, mounted at /boot/efi, as suggested by the installer[/li]
[li]/dev/sda2, 40 GB, formatted with btrfs, mounted at /, as suggested by the installer (including all the suggested subvolumes)[/li]
[/LIST]
[/li]
[li]Let the installer do its job.[/li]
[li]Add a second hard drive (/dev/sdb) of the same size while the machine is switched off.[/li]
[li]Copy the GPT partition table from /dev/sda to /dev/sdb.[/li]
[li]Run
```
btrfs device add /dev/sdb2 /
btrfs fi balance start -mconvert=raid1,soft -dconvert=raid1,soft /
```
to create the Btrfs RAID 1.[/li]
[/ol]
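For completeness, the partition-table copy in step 5 can be done with sgdisk (an assumption on my part — this requires the gdisk package; any equivalent tool works):

```shell
# Replicate the GPT from /dev/sda onto /dev/sdb, then randomize the new
# disk's GUIDs so the two disks do not share disk/partition GUIDs.
sgdisk /dev/sda -R /dev/sdb
sgdisk -G /dev/sdb
```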
Up to this point everything is clear to me, and I think this is the correct way to go. But now I have the problem that if I remove Disk1 (/dev/sda) from the virtual machine, I can't boot any more, because the bootloader (GRUB 2) is located only on this disk.
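Before pulling the disk, I did sanity-check that the conversion had finished — after the balance, both the Data and Metadata lines should report RAID1:

```shell
# Show the block group profiles of the root filesystem; Data and
# Metadata should both say "RAID1" after the balance completes.
btrfs filesystem df /
```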
So I tried the proposals from the sources above, but unfortunately none of them worked for me.
First I tried the approach from https://www.thomas-krenn.com/de/wiki/Ubuntu_Software_RAID_mit_redundanten_UEFI_Boot_Eintr%C3%A4gen (the article targets Ubuntu; on openSUSE the command is grub2-install rather than grub-install):
```
sudo umount /boot/efi
sudo mkfs.vfat /dev/sdb1
sudo parted /dev/sdb set 1 boot on
sudo mount /dev/sdb1 /boot/efi
sudo grub2-install --bootloader-id RedundancyBootloader /dev/sdb
sudo umount /boot/efi
sudo mount /boot/efi
```
Then I shut down the virtual machine, removed Disk1, and tried to start it again.
No luck with this one. From the EFI menu I can get to the bootloader of Disk2 and select openSUSE Tumbleweed, but it gets stuck afterwards; I just see three little green squares.
I did a rollback of the virtual machine.
Then I combined https://seravo.fi/2016/perfect-btrfs-setup-for-a-server with https://www.complang.tuwien.ac.at/anton/btrfs-raid1.html, because I wanted EFI and the first of those two sources is apparently for BIOS.
I copied the bootloader partition with
```
dd if=/dev/sda1 of=/dev/sdb1
```
I also changed the root entry in my /etc/fstab from
```
UUID=3d0ce18b-dc2c-4943-b765-b8a79f842b88 / btrfs defaults 0 0
```
to
```
UUID=3d0ce18b-dc2c-4943-b765-b8a79f842b88 / btrfs degraded,strictatime 0 0
```
(with the UUID adapted to my system). I also commented out the /etc/fstab entry that mounts the vfat boot partition, because the UUID is not the same on /dev/sda1 and /dev/sdb1; otherwise the system would not boot either.
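As an alternative to commenting the entry out, one could perhaps give both ESPs the same filesystem label and mount by label instead of by UUID, so whichever disk survives still matches the fstab entry. A sketch, assuming fatlabel from dosfstools and the label name EFIBOOT (my own choice):

```shell
# Label both EFI system partitions identically...
fatlabel /dev/sda1 EFIBOOT
fatlabel /dev/sdb1 EFIBOOT
# ...then mount by label in /etc/fstab instead of by UUID:
# LABEL=EFIBOOT  /boot/efi  vfat  utf8  0  2
```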
I added the option
```
rootflags=degraded
```
to the two lines in the file /etc/grub.d/10_linux and ran
```
grub2-mkconfig -o /boot/grub2/grub.cfg
```
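A possibly more robust variant, since /etc/grub.d/10_linux can be overwritten by grub2 package updates, would be to put the flag into GRUB_CMDLINE_LINUX in /etc/default/grub (assuming that file already contains such a line, as on a default install):

```shell
# Append rootflags=degraded to the kernel command line via
# /etc/default/grub instead of patching /etc/grub.d/10_linux directly.
sudo sed -i 's/^GRUB_CMDLINE_LINUX="\(.*\)"$/GRUB_CMDLINE_LINUX="\1 rootflags=degraded"/' /etc/default/grub
sudo grub2-mkconfig -o /boot/grub2/grub.cfg
```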
Then I did a shutdown and removed Disk1 again.
No luck with this one either. From the EFI menu I can get to the bootloader and select openSUSE Tumbleweed, but it gets stuck afterwards; I just see three little green squares.
Is there a better tutorial for how this should be done? Can someone enlighten me or give me some hints?
Every help is appreciated!
Thank you very much!