Bumblebee problems... works, but not well, and not for everything

Hi everyone,
So I’m having fun trying to set up bumblebee/primus on my MSI GE70-0ND with a GTX 660M.

I’ve been tinkering with this for a day now and have made little to no progress.
The problem is that primusrun doesn’t seem to work properly, and while optirun does work, various games (Steam and otherwise) either give errors or perform as though they’re running on the integrated graphics.

To get a clean slate for troubleshooting, I’ve just purged ALL traces of nvidia/primus/bumblebee from my system and done a fresh install from Overman79’s repository: the NVIDIA drivers, primus, VirtualGL, and bumblebee/bbswitch.
Note that I installed both the 64- and 32-bit versions of everything, as I’ll be running both 64- and 32-bit apps on the dedicated card.

The ONLY changes I made were editing /etc/bumblebee.conf to use the yuv transport instead of proxy, and uncommenting the libnvidia-tls line in /usr/bin/primusrun to try to fix the ELF class error.
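For reference, the transport edit looks roughly like this (section and key names as in the stock bumblebee.conf shipped with the package; yuv compresses frames going over the bridge, trading some image quality for speed):

```ini
# /etc/bumblebee.conf (excerpt)
[optirun]
# Image transport from the secondary X server to the visible display.
# proxy is the default; yuv is usually faster at some quality cost.
VGLTransport=yuv
```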

bbswitch is behaving properly: the status light on my notebook toggles on when I use optirun to run something and switches off when the app closes.
All seems fine when running glxspheres with optirun:


~/Minecraft> optirun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: GeForce GTX 660M/PCIe/SSE2
272.869073 frames/sec - 304.521885 Mpixels/sec

However, if I try primusrun, I get the following:


>primusrun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x21
Xlib:  extension "NV-GLX" missing on display ":0".
Context is Indirect
OpenGL Renderer: Mesa DRI Intel(R) Ivybridge Mobile 
60.956771 frames/sec - 68.027756 Mpixels/sec
59.437251 frames/sec - 66.331972 Mpixels/sec
57.946127 frames/sec - 64.667877 Mpixels/sec

optirun -b primus does seem to behave, but the FPS is on par with the integrated graphics:

~/Minecraft> optirun -b primus glxspheres
Polygons in scene: 62464
Visual ID of window: 0xaf
Context is Direct
OpenGL Renderer: GeForce GTX 660M/PCIe/SSE2
51.554694 frames/sec - 57.535038 Mpixels/sec

However, when I try to run a “real” program, I get errors. For example:

~/Minecraft> primusrun steam
Couldn't find dpkg, please update steamdeps for your distribution.
Running Steam on opensuse 12.3 64-bit
STEAM_RUNTIME is enabled automatically
Installing breakpad exception handler for appid(steam)/version(1367621987_client)
primus: fatal: failed to load any of the libraries: /usr/$LIB/nvidia/libGL.so.1
libnvidia-tls.so.319.17: wrong ELF class: ELFCLASS64
[2013-05-25 19:35:51] Startup - updater built Apr 30 2013 10:04:24
Installing bootstrap ~/.local/share/Steam/bootstrap.tar.xz
Couldn't find dpkg, please update steamdeps for your distribution.
Running Steam on opensuse 12.3 64-bit
STEAM_RUNTIME has been set by the user to: ~/.local/share/Steam/ubuntu12_32/steam-runtime
Installing breakpad exception handler for appid(steam)/version(1367621987_client)
primus: fatal: failed to load any of the libraries: /usr/$LIB/nvidia/libGL.so.1
libnvidia-tls.so.319.17: wrong ELF class: ELFCLASS64
[2013-05-25 19:35:53] Startup - updater built Apr 30 2013 10:04:24

This isn’t optimal anyway, as Steam itself doesn’t need advanced GL to run, so I’d rather specify the optirun wrapper on a per-game basis.
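For the per-game route in Steam, the usual place is each game’s launch options field (right-click the game → Properties → Set Launch Options), where Steam substitutes the game’s own command line for %command%. A sketch, assuming the VirtualGL bridge:

```shell
# Per-game "Set Launch Options" entry in Steam; Steam replaces
# %command% with the game's actual command line
optirun -b virtualgl %command%
```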

Here’s where I run into more problems. For example, Trine 2 and Shatter get terrible FPS (~10) with the primus bridge (optirun -b primus). If I try VirtualGL instead, I get:

[VGL] ERROR: Could not open display :8.

Similarly, TF2 performs terribly with optirun -b primus, but runs wonderfully with optirun -b virtualgl (and doesn’t complain at all about opening display :8).

I tested Minecraft as a non-Steam game and it suffers a similar fate: poor FPS when run with -b primus, and an error when run with -b virtualgl:


org.lwjgl.LWJGLException: Could not choose GLX13 config
    at org.lwjgl.opengl.LinuxDisplayPeerInfo.initDefaultPeerInfo(Native Method)
    at org.lwjgl.opengl.LinuxDisplayPeerInfo.<init>(LinuxDisplayPeerInfo.java:52)
    at org.lwjgl.opengl.LinuxDisplay.createPeerInfo(LinuxDisplay.java:684)
    at org.lwjgl.opengl.Display.create(Display.java:854)
    at org.lwjgl.opengl.Display.create(Display.java:784)
    at org.lwjgl.opengl.Display.create(Display.java:765)
    at net.minecraft.client.Minecraft.a(Minecraft.java:399)
    at asq.a(SourceFile:56)
    at net.minecraft.client.Minecraft.run(Minecraft.java:746)
    at java.lang.Thread.run(Thread.java:722)
--- END ERROR REPORT da1d0fc7 ----------

Running games with primusrun instead throws more wrong-ELF-class errors. The only reference I can find to this error suggests that the 32-bit libraries aren’t installed, but I explicitly made sure to select them in YaST when choosing packages.

It looks to me like optirun works with VirtualGL, but because primus isn’t behaving properly, I can’t get every game running correctly.

Has anyone overcome this problem before?

Update: I fixed the missing NV-GLX error by quoting the $LIB in PRIMUS_libGL in /usr/bin/primusrun (that is, changing the /usr/$LIB/foo part to /usr/'$LIB'/foo). However, primusrun still has atrocious FPS.
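As far as I can tell, the reason quoting helps is that an unquoted $LIB inside double quotes gets expanded by the shell (usually to nothing), while a single-quoted '$LIB' survives literally so the dynamic loader can expand it per architecture (lib64 for 64-bit processes, lib for 32-bit). A minimal demonstration of the shell side:

```shell
#!/bin/sh
unset LIB   # make sure the shell has nothing to substitute
# Double quotes: the SHELL expands $LIB (here, to nothing),
# producing a broken library path:
echo "/usr/$LIB/nvidia/libGL.so.1"    # -> /usr//nvidia/libGL.so.1
# Single quotes: the literal token survives for ld.so to expand at
# load time:
echo '/usr/$LIB/nvidia/libGL.so.1'    # -> /usr/$LIB/nvidia/libGL.so.1
```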

The ELF class errors can be resolved by copying /usr/bin/primusrun to /usr/bin/primusrun32 and editing it to remove the references to lib64 (it still suffers from the FPS problem, though).
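A sketch of that 32-bit wrapper, assuming the library paths in your copy of the script look like mine (adjust the pattern to whatever your primusrun actually contains):

```shell
#!/bin/sh
# Make a 32-bit variant of primusrun for 32-bit games
cp /usr/bin/primusrun /usr/bin/primusrun32
# Then edit /usr/bin/primusrun32 and strip the lib64 components from
# PRIMUS_libGL / LD_LIBRARY_PATH so only the 32-bit paths remain, e.g.:
sed -i 's|/usr/lib64[^:"]*:||g' /usr/bin/primusrun32
```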

I’ve found that part of the ‘unable to open display :8’ problem in VGL mode comes from games that have launchers: optirun tears down display :8 when the launcher exits.

The workaround seems to be running Steam from a bash shell kept alive under optirun, i.e. run optirun bash, then launch steam from inside that shell. Not ideal, but it works for now until primus is fixed.

Edit:
Fixed Minecraft as well; it was a layer-8 error: if you’re using a launcher script that alters LD_LIBRARY_PATH (especially on 64-bit systems, to pick the right JRE), you need to include the old contents of that variable as well, e.g.

export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/lib64/jvm/foo/bar/blahblahwhatever".
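In other words, append rather than overwrite: primus/optirun inject their own paths through LD_LIBRARY_PATH, and clobbering the variable hides their libGL. A toy illustration (both paths here are hypothetical stand-ins):

```shell
#!/bin/sh
# Pretend the bridge has already populated the variable for us:
LD_LIBRARY_PATH=/opt/primus-sim/lib64
# Wrong: overwriting would lose the primus paths
#   export LD_LIBRARY_PATH="/usr/lib64/jvm/jre/lib"
# Right: keep the old contents and append the (hypothetical) JRE path
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/lib64/jvm/jre/lib"
echo "$LD_LIBRARY_PATH"   # -> /opt/primus-sim/lib64:/usr/lib64/jvm/jre/lib
```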