Should I be using Nvidia Bumblebee or not?

I restarted and tried again:


machine:/ # optirun --status
Bumblebee status: Ready (3.2.1). X inactive. Discrete video card is off.

machine:/ # optirun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: Gallium 0.4 on NVE7
139.639158 frames/sec - 155.837301 Mpixels/sec
140.885627 frames/sec - 157.228360 Mpixels/sec
141.061661 frames/sec - 157.424814 Mpixels/sec
140.643337 frames/sec - 156.957964 Mpixels/sec

After running glxspheres I get the previous error status again:


machine:/ # optirun --status
Bumblebee status: Error (3.2.1): [XORG] (EE) Server terminated successfully (0). Closing log file.

I am using the latest stable Linux kernel, 3.18.5, and I have installed Bumblebee built for this same kernel, from these repositories:
http://download.opensuse.org/repositories/Kernel:/stable/standard/
http://download.opensuse.org/repositories/X11:/Bumblebee/Kernel_stable_standard/
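For reference, adding those repositories with zypper looks roughly like this (the repo aliases are just placeholder names, not what I actually called them):

zypper ar http://download.opensuse.org/repositories/Kernel:/stable/standard/ kernel-stable
zypper ar http://download.opensuse.org/repositories/X11:/Bumblebee/Kernel_stable_standard/ bumblebee
zypper ref
zypper in bumblebee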

I do see that some symlinks in /boot are still pointing to the old, deleted 3.16.7 kernel. Not sure if this is a problem for Bumblebee:
initrd -> initrd-3.16.7-7-desktop
vmlinuz -> vmlinuz-3.16.7-7-desktop

Edit: Reinstalling the kernel fixed the symlinks. Will try optirun again after a restart.
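In case anyone needs it, a forced reinstall with zypper should be enough to recreate those symlinks (the package name here is assumed from the -desktop flavor shown above):

zypper in -f kernel-desktop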

Well, optirun still showed an error status after running glxspheres. It had nothing to do with the broken kernel symlinks.

Error messages in Bumblebee aren’t always to be taken seriously; however, the OpenGL renderer should be Nvidia.
What does “primusrun glxinfo | grep OpenGL” say?

I did some digging and found out a little more about this error:


machine:/ # service bumblebeed status
bumblebeed.service - Bumblebee C Daemon
   Loaded: loaded (/usr/lib/systemd/system/bumblebeed.service; enabled)
   Active: active (running) since Wed 2015-02-04 09:25:49 CET; 1h 34min ago
 Main PID: 856 (bumblebeed)
   CGroup: /system.slice/bumblebeed.service
           `-856 /usr/sbin/bumblebeed


Feb 04 09:25:49 machine bumblebeed[856]:     7.501619] [INFO]/usr/sbin/bumblebeed 3.2.1 started
Feb 04 09:26:33 machine bumblebeed[856]:    51.020949] [ERROR][XORG] (EE) /dev/dri/card0: failed to set DRM interface version 1.4: Permission denied
Feb 04 09:26:33 machine bumblebeed[856]:    51.020990] [ERROR][XORG] (EE) /dev/dri/card0: failed to set DRM interface version 1.4: Permission denied
Feb 04 09:26:38 machine bumblebeed[856]:    56.751474] [ERROR][XORG] (EE) Server terminated successfully (0). Closing log file.

I chose to first try running with only the integrated Intel graphics and have blacklisted nouveau. So I didn’t install bumblebee-nvidia, just regular bumblebee.
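For completeness, blacklisting nouveau is just a small modprobe snippet, more or less like this (the file name is arbitrary):

# /etc/modprobe.d/50-blacklist-nouveau.conf
blacklist nouveau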


machine:/ # primusrun glxinfo | grep OpenGL
primus: fatal: failed to load any of the libraries: /usr/$LIB/nvidia/libGL.so.1
/usr/$LIB/nvidia/libGL.so.1: cannot open shared object file: No such file or directory

Now using bumblebee-nvidia:


machine:/ # primusrun glxinfo | grep OpenGL
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: Quadro K1100M/PCIe/SSE2
OpenGL core profile version string: 4.4.0 NVIDIA 346.35
OpenGL core profile shading language version string: 4.40 NVIDIA via Cg compiler
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.5.0 NVIDIA 346.35
OpenGL shading language version string: 4.50 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL extensions:

Looks perfect to me, what performance do you get with “primusrun glxspheres”?


machine:/ # primusrun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: Quadro K1100M/PCIe/SSE2
65.241642 frames/sec - 72.809673 Mpixels/sec
63.472791 frames/sec - 70.835634 Mpixels/sec
62.815372 frames/sec - 70.101955 Mpixels/sec
64.460585 frames/sec - 71.938013 Mpixels/sec
62.897398 frames/sec - 70.193496 Mpixels/sec

I didn’t notice it before now, but it looks like the Intel graphics gets better performance:

Intel: 140 frames/sec - 156 Mpixels/sec
Nvidia: 62 frames/sec - 70 Mpixels/sec

Or am I reading the results wrong?

It’s a consequence of primusrun running synced to the display’s vertical refresh rate. You should get a different frame rate running it this way:

vblank_mode=0 primusrun glxspheres

But this doesn’t mean it renders faster on the display; the display can’t show more than about 60 frames/sec anyway.
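vblank_mode is a Mesa environment variable, by the way, so it also applies to the Intel driver if you ever want to benchmark that side without vsync, e.g.:

vblank_mode=0 glxspheres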

That was something else:
714.691413 frames/sec - 797.595617 Mpixels/sec

How can I control the switch between Intel and Nvidia manually? For instance, if I want to disable Nvidia when I’m running only on battery power.
Or will it only use the Nvidia card when I use primusrun?

Which graphics card is X using? I reckon that X cannot switch between them without restarting X.
Looking into /var/log/Xorg.0.log, I would guess that X is running on the Intel graphics card.
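If the log is in the usual place, grepping it for the loaded driver should tell you, something like:

grep -iE 'intel|nouveau|nvidia' /var/log/Xorg.0.log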

Bumblebee, or more correctly bbswitch, will keep the Nvidia card powered off unless you’re running a program with primusrun or optirun. You can check the status of bbswitch with the following command:

cat /proc/acpi/bbswitch

I hope I got that right; I’m not on an Optimus system right now.
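If you want to force it by hand, bbswitch also accepts writes to that same file as root; as far as I remember it refuses to power the card off while the nvidia or nouveau module is still loaded:

echo OFF > /proc/acpi/bbswitch
echo ON > /proc/acpi/bbswitch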

Your desktop will always run on the Intel chip; primusrun launches a second instance of Xorg for the Nvidia card, and the output of that instance is piped over to the primary Xorg instance.
The log for the Nvidia instance of Xorg is in /var/log/Xorg.8.log.
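If you want to watch that second server while something is running under primusrun, tailing its log in another terminal works, e.g.:

tail -f /var/log/Xorg.8.log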

Normally the system runs on the Intel chip, and the Nvidia chip is turned off to save power.

To run applications on the Nvidia chip, you have to use optirun or primusrun, yes.

It is possible to run the X session on Nvidia as well, by starting the desktop via optirun/primusrun. But then you cannot power down the Nvidia card, of course.
If you want to do that, see here, e.g.:
https://forums.opensuse.org/showthread.php/490813-Bumblebee-Run-entire-KDE-session-on-NVIDIA