Tumbleweed cannot write UEFI loader

Hi! I have a laptop with UEFI. Tumbleweed was already installed on it (by me) and everything was fine.
Recently I bought an SSD and installed it in this laptop, so I decided to install the OS on the SSD to take advantage of its speed.
I did a fresh Tumbleweed installation on the SSD from a USB stick, but I want to keep the previous OS (also TW) on the HDD for some time.

Now, when the laptop boots, I see a GRUB menu where the default boot option is the previous TW, and to boot the new one I have to select it manually from the list.

I was able to boot the new TW and go to YaST->Loader; on the “Loader settings” tab I see several boot options:


- openSUSE Tumbleweed                                               // new TW, I suppose
- openSUSE Tumbleweed, with Linux <kernel version>
- openSUSE Tumbleweed (on /dev/sda3)                                // previous TW
- openSUSE Tumbleweed, with Linux <kernel version> (on /dev/sda3)   // previous TW
…
- openSUSE Tumbleweed, with Linux <kernel version> (on /dev/sda3)   // previous TW

Then I press OK, and after a moment the YaST->Loader window silently closes. But after a reboot the boot options look like this:


- openSUSE Tumbleweed                                               // this one boots the old TW from the HDD
- openSUSE Tumbleweed, with Linux <kernel version>                  // kernel from the old TW
- openSUSE Tumbleweed (on /dev/sdb2)                                // this one boots the new TW from the SSD
- openSUSE Tumbleweed, with Linux <kernel version> (on /dev/sdb2)   // kernel from the new TW
…

It looks like the loader from the old TW is taking control. In the BIOS setup, on the “Boot Sequence” page, there is only one record: “opensuse-secureboot”.

In my system, /dev/sda3 is the root partition of the previous TW on the HDD, and /dev/sdb2 is the root partition of the new TW on the SSD.
sda1 and sdb1 are /boot/efi (vfat), for the old and new systems respectively.
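
For reference, the layout can be checked with something like this (on GPT disks the ESP shows partition type GUID c12a7328-f81f-11d2-ba4b-00a0c93ec93b):

  # lsblk -o NAME,FSTYPE,PARTTYPE,MOUNTPOINT /dev/sda /dev/sdb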

English isn’t my native language, so the menu and window titles might not be 100% accurate; sorry about that.

So what should I try so that the new TW is able to write its own loader into EFI? Could enabled Secure Boot be preventing this?

Probably not.

Check if there are BIOS settings for boot order.

I have two UEFI computers – call them A and B.

With A, when I install opensuse, it next boots to the freshly installed system.

On B, when I install opensuse, it still boots the way it previously did. I can use “efibootmgr” to change the boot order to what I want, and I can then use “efibootmgr -v” to see the boot order, and it looks good. But when I reboot, the BIOS changes it all back to what it previously was. Going into the BIOS settings and changing the boot order there is the only thing that works.
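
For example, something like this (the entry numbers here are placeholders; take them from your own “efibootmgr -v” output):

  # efibootmgr -v              # list entries and the current BootOrder
  # efibootmgr -o 0007,0006    # try entry 0007 first, then 0006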

Thanks for your response!

Something like this happened to me.
I created a new boot record with:


# efibootmgr -c -d /dev/sdb -l '\EFI\opensuse\shim.efi' -L tumbleweed

After that,


  # efibootmgr -v

shows me:


BootOrder: 0007,0006 
…
Boot0006* opensuse-secureboot   HD(1,GPT,<some symbols>)/File(\EFI\opensuse\shim.efi)
Boot0007* tumbleweed    HD(1,GPT,<OTHER symbols>)/File(\EFI\opensuse\shim.efi)
…

Then I rebooted the laptop and entered the boot menu. There were two EFI records shown:

  • tumbleweed
  • opensuse-secureboot

I chose the first one (tumbleweed), but again the old loader had control.
I selected the system from /dev/sdb2, booted into the new TW, and executed # efibootmgr -v again. Now the output was:


BootOrder: 0007,0006 
…
Boot0006* opensuse-secureboot   HD(1,GPT,<some symbols>)/File(\EFI\opensuse\shim.efi)
Boot0007* tumbleweed    HD(1,GPT,<SAME symbols>)/File(\EFI\opensuse\shim.efi)
…

So Boot0006 and Boot0007 now differ only in their labels.

I have not found anything like “drive order” in the BIOS setup. There is a page called “Boot Sequence” there, where I can reorder the boot records (opensuse-secureboot, tumbleweed), but if the “tumbleweed” record changed and became a duplicate of “opensuse-secureboot” after reboot, I think reordering makes no sense.

My laptop is a Dell Vostro 5468. I’ve already updated the firmware to the latest 1.3.0 from the Dell website.

I would suppose this laptop requires that the ESP (EFI system partition) be the first partition of the first drive (the HDD, in my case), but when I connect a bootable USB stick, the boot record from the stick appears in the boot menu normally… Maybe it works differently with USB devices, I don’t know.

Or does plugging/unplugging a device force a rescan of the ESP on it? Before creating this topic I tried reinstalling the system to the SSD three times with different bootloader settings. One of them (the second) was with EFI and Secure Boot support disabled; after that there was an “opensuse” boot record (which didn’t work either), which I manually deleted from the BIOS setup later.

I’ll give a bit more detail on my two UEFI computers.

One of them is a Lenovo. And that’s the one that I previously described. I have to change the boot order in the BIOS for it to stick.

The other is a Dell Inspiron 660. I’m not sure if the firmware is similar to yours. It gives me a different kind of problem from the Lenovo. I was lucky, in that my initial setup worked.

Here’s what some folk experience with the Dell Inspiron 660:

  1. Install linux.
  2. Boot to linux (works fine)
  3. Choose Windows from the grub menu (works fine)
  4. Boot to linux – oops, what happened to that grub menu? Only Windows is recognized.

People tend to blame Microsoft for this. But it is really the Dell UEFI firmware. On reboot, the firmware removes all EFI boot entries except the most recently used ones. It removes them from NVRAM; it doesn’t touch what’s on disk. So the Windows boot entry was removed by the firmware upon booting to linux.

This isn’t a problem, as grub still knows how to boot Windows (chaining to the efi bootloader for windows still on disk).

Windows then notices that there isn’t an NVRAM boot entry for Windows. So it re-installs that entry.

Now, reboot to get to linux. And, during reboot, the firmware removes the linux NVRAM entry, because the Windows entry was used more recently. So no way to boot to linux. There are various ways around this that google will turn up.
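
For example, one of those workarounds is to simply re-create the lost entry from linux with efibootmgr. A sketch only; the disk, partition number, and loader path here are assumptions, so adjust them to your own layout:

  # efibootmgr -c -d /dev/sda -p 1 -l '\EFI\opensuse\shim.efi' -L opensuse-secureboot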

I got lucky. When I purchased that Dell, I also purchased an additional hard drive. So, instead of using the EFI partition that Windows uses, I created a new EFI partition on the new drive. And I installed opensuse on that new drive. And that arrangement worked. It seems that the Dell firmware is willing to keep one NVRAM entry per drive (or, really, per EFI partition). So because my opensuse entry is on a different EFI partition, booting Windows does not remove it. And booting opensuse does not remove the Windows entry. But I have to be careful to not have more than one entry per EFI partition.

It seems that you might be having a similar issue. I can’t be sure, because I think Dell uses more than one firmware vendor. From your description, it may be that the system is keeping one NVRAM entry for your internal drive and one for the removable external drive. And it might only be allowing one for each.

A footnote: I recently started using KVM (for virtualization). So, with KVM, I installed “OVMF” to provide EFI firmware for virtual machines. The OVMF firmware seems to be close to the reference firmware from Intel. And that OVMF firmware behaves the same way as that Dell firmware. During reboot, it removes all but the most recently used NVRAM entries.

For now, I’ve ended up with the assumption that there is some bug or incompatibility somewhere between efibootmgr and Dell’s implementation of the EFI firmware, which results in efibootmgr -v showing one thing, while after reboot the system actually boots another way, ignoring edits from efibootmgr in some cases.

I disabled both SATA drives in the BIOS setup, and then enabled them again. After this the opensuse-secureboot boot record disappeared.
But I got another boot record, <HDD vendor>-bla-bla-bla; it seems that when the drive was re-enabled, the firmware found the EFI partition on it and created this record.

I installed TW from USB to the SSD again, making sure the first partition on the SSD has the EFI partition id and is mounted as /boot/efi. The bootloader settings were: GRUB with EFI, Secure Boot enabled.
After reboot, I got a new boot record, <SSD vendor>-bla-bla-bla, at the end of the records list; the first record was opensuse-secureboot.
I proceeded with the default first option and again got the GRUB menu created by the old TW on the HDD. This time the item “openSUSE Tumbleweed (on /dev/sdb2)” there didn’t work.

Reboot. I selected <SSD vendor>-bla-bla-bla; it didn’t work: something like "EFI\boot\grubx64.efi not found".

I was able to boot the fresh TW (from the SSD) using the “Boot Linux system” option from the USB stick. I looked at efibootmgr -v, created a new boot record with efibootmgr -c …, and made sure that the GUID in the new record (inside HD()) differs from the opensuse-secureboot and <HDD vendor>-bla-bla-bla records and matches the GUID from the <SSD vendor>-bla-bla-bla record.
Note: <SSD vendor>-bla-bla-bla refers to EFI/boot/bootx64.efi as the loader, while the records created by openSUSE refer to \EFI\opensuse\shim.efi.
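
For anyone who wants to repeat this check: the GUID inside HD(1,GPT,…) is the partition’s PARTUUID, so it can be compared with something like:

  # efibootmgr -v | grep tumbleweed
  # blkid -s PARTUUID -o value /dev/sdb1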

Reboot. I selected this new boot record. The old GRUB menu again. efibootmgr -v again showed the same disk GUIDs for the new record, opensuse-secureboot, and <HDD vendor>-bla-bla-bla.

After that I mounted /dev/sda1 as /boot/efi in the fresh system (instead of /dev/sdb1), went to YaST->Bootloader, and pressed [OK]. And after reboot there was finally a new GRUB menu with the new TW as the default option.
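
For the record, the manual equivalent of what I did was roughly this (a sketch; it assumes /dev/sda1 is the ESP the firmware actually reads, and /etc/fstab should be updated to match):

  # umount /boot/efi
  # mount /dev/sda1 /boot/efi
  # yast2 bootloader    # press [OK] so GRUB is written into this ESP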

P.S. Sorry for my grammar; I hope you can understand what I wrote :slight_smile:

Some machines have been reported to do this, in some cases always setting UEFI back to booting Windows. Maybe check and see if there is a UEFI firmware update.

This is standard. You will get this if the firmware cannot find anything else on that disk, but does find an EFI partition.

It will actually use “\EFI\Boot\bootx64.efi”.

On a Windows box, that file is usually a copy of the Windows efi boot file. But you can copy the opensuse boot file there if you wish. If you are not using secure-boot, simply copy “grubx64.efi” there (renamed as “bootx64.efi”). If you are using secure-boot, copy “shim.efi” there (and rename it as “bootx64.efi”). But “shim.efi” probably also needs “grub.cfg” and “grub.efi” from the “opensuse” directory, so copy those there too. Or copy “fallback.efi” from “/usr/lib64/efi/” to that same “\EFI\Boot” directory.
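
For example, with the EFI partition mounted at /boot/efi (a standard openSUSE layout; double-check the paths on your own system), the secure-boot case would be something like:

  # mkdir -p /boot/efi/EFI/boot
  # cp /boot/efi/EFI/opensuse/shim.efi /boot/efi/EFI/boot/bootx64.efi
  # cp /boot/efi/EFI/opensuse/grub.efi /boot/efi/EFI/boot/
  # cp /boot/efi/EFI/opensuse/grub.cfg /boot/efi/EFI/boot/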

I had exactly the same problem with an Asus computer, and I must say that the explanation provided by nrickert is the clearest that I have seen.


I used the method of copying and renaming the grub file to \EFI\Boot\, and this has worked mostly fine for me for the last two years. The only issue I have had is when the grub file changed, which just involved replacing the original version with the new one in \EFI\Boot.
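
In case it helps anyone: refreshing the copy after grub changes is just the same copy again (assuming the non-secure-boot case from nrickert’s post, with the ESP mounted at /boot/efi):

  # cp /boot/efi/EFI/opensuse/grubx64.efi /boot/efi/EFI/boot/bootx64.efi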