However, no such luck. I also had to make a correction to the wiki I linked (Leap section, 2nd set of symlink commands). There are also a few things wrong with the order of the steps, at least for Skylake… You can't remove drm-kmp-default and reboot, or the system won't boot at all.
Has anyone gotten it to work right? What changes did you make to the procedure?
Bumblebee IS running with no errors, but trying to run optirun gives me the following:
dorian@linux-amu6:~> optirun -vvv glxspheres
[ 2876.801993] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
[ 2876.802513] [INFO]Configured driver: nvidia
[ 2876.802912] [DEBUG]optirun version 3.2.1 starting...
[ 2876.802945] [DEBUG]Active configuration:
[ 2876.802958] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[ 2876.802972] [DEBUG] X display: :8
[ 2876.802991] [DEBUG] LD_LIBRARY_PATH: /usr/X11R6/lib64:/usr/X11R6/lib
[ 2876.803018] [DEBUG] Socket path: /var/run/bumblebee.socket
[ 2876.803042] [DEBUG] Accel/display bridge: auto
[ 2876.803064] [DEBUG] VGL Compression: proxy
[ 2876.803088] [DEBUG] VGLrun extra options:
[ 2876.803111] [DEBUG] Primus LD Path: /usr/lib64/primus:/usr/lib/primus
[ 2876.803213] [DEBUG]Using auto-detected bridge virtualgl
[ 2876.982409] [INFO]Response: No - error: Could not load GPU driver
[ 2876.982424] [ERROR]Cannot access secondary GPU - error: Could not load GPU driver
[ 2876.982430] [DEBUG]Socket closed.
[ 2876.982448] [ERROR]Aborting because fallback start is disabled.
[ 2876.982458] [DEBUG]Killing all remaining processes.
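When optirun fails like this, more detail is usually available outside its own output; a few places to look (my suggestion, not from the original posts):

journalctl -u bumblebeed                 # messages from the bumblebeed daemon
cat /var/log/Xorg.8.log                  # log of the secondary X server Bumblebee starts on display :8
dmesg | grep -i -e nvidia -e bbswitch    # kernel messages from the driver and bbswitch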
I don't have Optimus hardware, so unfortunately I can't assist directly. Hopefully someone more knowledgeable than I am on this subject can offer some meaningful guidance. However, if you don't get Bumblebee working, then nvidia-xrun might be a viable option to pursue. Anyway, there is some interesting information on the subject in the following links:
My hardware (Intel Core i7-7820HK Kaby Lake CPU, Intel HD Graphics 630, NVIDIA GeForce GTX 1060 graphics card) is different from yours. I did a fresh installation (on formatted partitions) of openSUSE Leap 15.0 and there is no drm-kmp-default package. So probably not everything I did will fit your setup.
However, here is where I deviated from the guide you mentioned:
I had to install “xf86-video-intel-2.99.917+git781.c8990575-lp150.1.11.x86_64.rpm” from the OSS repository; otherwise the system would boot into a black screen.
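It should not be necessary to hunt for that exact RPM file name; something like the following pulls the package (whatever version is current) from the standard OSS repository (as root):

zypper in xf86-video-intel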
Thanks susejunky. I did indeed use the Leap 15 repo the first time I tried. Just by fluke I tried changing the version in the repo URL: “15” didn’t work, but “15.0” did.
I did not however try installing the Intel RPM you posted. I’m not entirely sure it would work on Skylake but I’ll reinstall openSUSE and give it a shot.
I would really like to get it going because I like openSUSE, but not having switchable graphics is a dealbreaker for me. The Leap 15 instructions in the SDB:Bumblebee page don’t work and are kind of a mess to use. I would like to update the page with working instructions if I can get it going. I feel that people aren’t maintaining it. I also think that perhaps a separate page for Leap 15 would make it much cleaner.
I’ll report back my findings after I try it out. Thanks again.
I tried a couple more times with the information provided by susejunky. I also followed the other thread susejunky started, with updated information on getting Optimus to work with Leap 15. Every time it seemed to be working, but never really did. Sometimes the GPU would turn off on boot as it should; it would turn on, run glxspheres perfectly and then turn off again, but it would not run any Steam games, or Blender, or Kdenlive. Creating all the symlinks did not help. The only thing I could get to work was the game RimWorld, which is 32-bit.
I installed Manjaro alongside my other distros in a new partition, and I was able to install the Nvidia drivers and get optirun working perfectly with this one command:
sudo mhwd -a pci nonfree 0300
After that and rebooting, everything works perfectly in Manjaro: Steam games, Blender, Kdenlive, etc… And the GPU properly powers off when done as confirmed with a Gnome extension and tlp-stat. There’s also a GUI option to do the same thing.
After that I deleted the openSUSE partitions. If it isn’t supported at all and takes that much work to get it going, I don’t think it’s worth it.
Thanks for your assistance, but I no longer need it as I’ve deleted everything openSUSE from my systems.
I apologize for the delay in posting this, but after being told by someone at openSUSE that they don’t care about Optimus support, I had no desire to try a 7th time or think of updating this thread… For someone in a high position to say such a thing makes me question and dislike the entire brand.
I tried the official guide and failed. During several attempts I got results ranging from not working at all to working with some kind of software rendering (instead of accelerated GPU rendering with the nvidia driver).
Then I followed the old guide and succeeded. Obviously someone is still maintaining the unsupported X11:Bumblebee repository, but why the official SDB database is pushing the “new method” (which doesn’t work anyway) is beyond me. The command sequence that worked for me was (as root):
zypper in patterns-devel-base-devel_kernel                  # kernel development pattern, needed so DKMS can build the module
zypper in nvidia-bumblebee nvidia-bumblebee-32bit           # driver packages from the X11:Bumblebee repository
echo "blacklist nvidia" >> /etc/modprobe.d/99-local.conf    # keep the nvidia module out of the normal boot; bumblebeed loads it on demand
systemctl enable dkms
systemctl start dkms        # DKMS builds the nvidia kernel module for the running kernel
mkinitrd                    # regenerate the initrd so the blacklist takes effect
reboot
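After the reboot it is worth checking that DKMS actually built and installed the module before continuing (this check is my addition, not part of the original sequence):

dkms status    # should list the nvidia module as installed for the running kernel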
After reboot, edit /etc/bumblebee/bumblebee.conf as root and set the following (lines that are not mentioned here should be left unchanged, not deleted):
[bumblebee]
TurnCardOffAtExit=true
Driver=nvidia
Finally, restart the bumblebeed daemon, as root:
systemctl restart bumblebeed
And that should do it. Test as usual (as normal user):
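For example (the same invocation used earlier in this thread; the bbswitch check is my addition):

optirun glxspheres           # should report the NVIDIA renderer while it runs
cat /proc/acpi/bbswitch      # should report OFF again once optirun has exited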
I confirm that this is working on my Dell Inspiron 7559. The official guide did not work for me either.
I would also like to mention here that anyone who uses Skylake CPU will have to append either
acpi_osi="!Windows 2015"
or
acpi_osi=! acpi_osi="Windows 2009"
to the list of kernel parameters at boot. This now seems to be common knowledge and is mentioned in the official guide, but I still want to mention it because this is an extremely annoying bug that kept me from installing Bumblebee for more than a year.
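One way to make the parameter permanent on a stock openSUSE install (a sketch, assuming GRUB2; the YaST boot loader module offers the same setting through its Kernel Parameters tab):

# In /etc/default/grub, append the parameter to GRUB_CMDLINE_LINUX_DEFAULT, e.g.:
#   GRUB_CMDLINE_LINUX_DEFAULT="... acpi_osi=! acpi_osi=\"Windows 2009\""
# then regenerate the GRUB configuration as root:
grub2-mkconfig -o /boot/grub2/grub.cfg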
Yes, this is basically correct. But if you want to always run an application on discrete graphics you can edit the corresponding <application>.desktop file in /usr/share/applications/.
For instance, to always run Firefox on discrete graphics edit file /usr/share/applications/firefox.desktop so that the
Exec=firefox %u
line reads instead:
Exec=optirun firefox %u
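A variant (my suggestion, not from the original post): rather than editing the system-wide file, which a package update will overwrite, copy it into the user's own applications directory and edit the copy there; per-user .desktop files take precedence over the system-wide ones:

cp /usr/share/applications/firefox.desktop ~/.local/share/applications/
# then change the Exec= line in ~/.local/share/applications/firefox.desktop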
How come the same laptop/graphics card gets going with Ubuntu 18.04 and not openSUSE?
On Ubuntu 16.04 the nvidia-smi command shows the processes.
Ubuntu 18.04 uses a different method to drive discrete graphics, apparently a variant of the so-called Nvidia-PRIME; that method is not readily available for openSUSE currently.
Cannot comment on 16.04, not sure that it already met the prerequisites for Nvidia-PRIME.
Not common knowledge for me.
I have a ThinkPad T460p, and the next thing I want to do is activate the Nvidia card.
It worked with 42.x, but after seeing the new guide I got scared, also because of all the manual symlinks that have to be created…
Glad I found this thread now.
However, before I start to try: how can I figure out which acpi_osi variation I have to use as a boot parameter?
Maybe your laptop has an i7-6700HQ or i5-6300HQ, both Skylake processors, so the acpi_osi options just mentioned should work; those are essentially equivalent and tell the firmware something like “Hey! This is NOT Win10”.
Write again here if either doesn’t work.
By the way, your Nvidia card should be active already with the default install and the Nouveau driver; try to issue in a terminal:
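For instance, something along these lines (my example, just one possible check; it shows which GPU does the rendering):

glxinfo | grep "OpenGL renderer"                 # default GPU (Intel)
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"     # discrete Nvidia GPU via Nouveau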
Thanks for the info, so I can either write
acpi_osi="!Windows 2015"
or
acpi_osi=! acpi_osi="Windows 2009"
since they are the same?
And I do not need to find the right variant for my CPU (which is Skylake)?
Yes! (Does this also work with the Nvidia card? Why would I then need Bumblebee?)
glxspheres
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
No protocol specified
Visual ID of window: 0x17c
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)
60.562097 frames/sec - 67.587301 Mpixels/sec
59.935491 frames/sec - 66.888008 Mpixels/sec
60.006988 frames/sec - 66.967799 Mpixels/sec
....
DRI_PRIME=1 glxspheres
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
No protocol specified
Visual ID of window: 0x17c
Context is Direct
OpenGL Renderer: NV118
60.853203 frames/sec - 67.912174 Mpixels/sec
60.009812 frames/sec - 66.970951 Mpixels/sec
59.985999 frames/sec - 66.944374 Mpixels/sec
60.016086 frames/sec - 66.977952 Mpixels/sec
....
My primary use case for the video card will be to run DaVinci Resolve, since I started using it to render out tech talks from UG meetups,
but on my Mac, which has only an Intel card, rendering video takes very long.
Since DaVinci Resolve also has a Linux version, I would like to give it a try.
Also, the ThinkPad has an i7-6700HQ CPU @ 2.60GHz while the Mac has an i7-4870HQ CPU @ 2.50GHz,
so my hope is that this will go faster on the T460p with the Nvidia card activated than on the Mac.