NVIDIA Optimus setup

Hello all,
I have now installed openSUSE 13.2 on my new ASUS GL551JM laptop, and I
have dual boot working (not with Secure Boot yet), so I can boot into
openSUSE.

I found that I can only boot if I add “nomodeset” to the end of the boot
line in GRUB when I start up. Since I don’t want to have to do that
every time, I started looking into how to fix it.
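
To be clear about what I mean: at the GRUB menu I press “e”, find the
line that starts with “linux”, and append the option at the end, roughly
like this (the exact kernel and arguments on your machine will differ):

linux /boot/vmlinuz-... root=... splash=silent quiet showopts nomodeset

then press F10 to boot.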

So I went to the practical theory guide for setting up graphics cards,
and because I have NVIDIA Optimus hybrid graphics (an integrated Intel
GPU paired with a discrete NVIDIA card), it took me to the following.

I followed the instructions to install the proprietary NVIDIA driver
from the following page:
https://en.opensuse.org/SDB:NVIDIA_Bumblebee

This consisted of adding the X11:Bumblebee repository from the Open
Build Service.
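
For reference, adding that repository looks something like this (the URL
is from memory of the OBS download layout, so double-check it against
the SDB page before using it):

# zypper addrepo http://download.opensuse.org/repositories/X11:/Bumblebee/openSUSE_13.2/ X11:Bumblebee
# zypper refresh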

Then running the following commands:


# zypper in nvidia-bumblebee


# systemctl enable dkms
tribetrekGlap:/home/george # zypper in nvidia-bumblebee-32bit
Loading repository data...
Reading installed packages...
Resolving package dependencies...

The following 33 NEW packages are going to be installed:
libdrm2-32bit libdrm_intel1-32bit libdrm_nouveau2-32bit
libdrm_radeon1-32bit libelf1-32bit libexpat1-32bit libffi4-32bit
libgbm1-32bit libLLVM-32bit libncurses5-32bit libpciaccess0-32bit
libwayland-client0-32bit libwayland-server0-32bit
libX11-6-32bit libX11-xcb1-32bit libXau6-32bit libxcb1-32bit
libxcb-dri2-0-32bit libxcb-dri3-0-32bit libxcb-glx0-32bit
libxcb-present0-32bit libxcb-sync1-32bit libxcb-xfixes0-32bit
libXdamage1-32bit libXext6-32bit libXfixes3-32bit
libxshmfence1-32bit libXxf86vm1-32bit Mesa-32bit Mesa-libEGL1-32bit
Mesa-libGL1-32bit Mesa-libglapi0-32bit
nvidia-bumblebee-32bit

33 new packages to install.
Overall download size: 12.6 MiB. Already cached: 0 B  After the
operation, additional 47.8 MiB will be used.
Continue? [y/n/? shows all options] (y): y


After that, the SDB page gives no guidance on what to do next, how to
figure out whether your NVIDIA graphics card is working, etc. So, any
ideas on what I need to do next?
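
The only checks I could think of so far (my own guesses, not anything
the SDB page suggests) were to see whether the dkms module actually
built and whether it is loaded:

# dkms status
# lsmod | grep -e nvidia -e bbswitch

but I am not sure those really tell me the card is usable.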

I know that my card is an NVIDIA GeForce GTX 860M 2GB according to the
specs from where I purchased it.

Here is the info from my command line:


# hwinfo --gfxcard
10: PCI 02.0: 0300 VGA compatible controller (VGA)
[Created at pci.328]
Unique ID: _Znp.r92QD4vNWR8
SysFS ID: /devices/pci0000:00/0000:00:02.0
SysFS BusID: 0000:00:02.0
Hardware Class: graphics card
Model: "Intel Haswell Integrated Graphics Controller"
Vendor: pci 0x8086 "Intel Corporation"
Device: pci 0x0416 "Haswell Integrated Graphics Controller"
SubVendor: pci 0x1043 "ASUSTeK Computer Inc."
SubDevice: pci 0x185d
Revision: 0x06
Memory Range: 0xf7400000-0xf77fffff (rw,non-prefetchable)
Memory Range: 0xd0000000-0xdfffffff (ro,non-prefetchable)
I/O Ports: 0xf000-0xf03f (rw)
IRQ: 255 (no events)
Module Alias: "pci:v00008086d00000416sv00001043sd0000185Dbc03sc00i00"
Driver Info #0:
Driver Status: i915 is active
Driver Activation Cmd: "modprobe i915"
Config Status: cfg=no, avail=yes, need=no, active=unknown

24: PCI 100.0: 0302 3D controller
[Created at pci.328]
Unique ID: VCu0.UempV7Sqa69
Parent ID: vSkL.FrxKl4Pg9g5
SysFS ID: /devices/pci0000:00/0000:00:01.0/0000:01:00.0
SysFS BusID: 0000:01:00.0
Hardware Class: graphics card
Model: "nVidia 3D controller"
Vendor: pci 0x10de "nVidia Corporation"
Device: pci 0x1392
SubVendor: pci 0x1043 "ASUSTeK Computer Inc."
SubDevice: pci 0x185d
Revision: 0xa2
Driver: "nvidia"
Driver Modules: "nvidia"
Memory Range: 0xf6000000-0xf6ffffff (rw,non-prefetchable)
Memory Range: 0xe0000000-0xefffffff (ro,non-prefetchable)
Memory Range: 0xf0000000-0xf1ffffff (ro,non-prefetchable)
I/O Ports: 0xe000-0xefff (rw)
Memory Range: 0xf7000000-0xf707ffff (ro,non-prefetchable,disabled)
IRQ: 16 (33 events)
Module Alias: "pci:v000010DEd00001392sv00001043sd0000185Dbc03sc02i00"
Driver Info #0:
Driver Status: nouveau is active
Driver Activation Cmd: "modprobe nouveau"
Driver Info #1:
Driver Status: nvidia is active
Driver Activation Cmd: "modprobe nvidia"
Config Status: cfg=no, avail=yes, need=no, active=unknown
Attached to: #9 (PCI bridge)

Primary display adapter: #10

So you can see that the OS has picked up both graphics cards: the NVIDIA
one and the integrated Intel one, which are used together with Optimus.

But beyond that I don’t know what to check.

Ultimately, I want to be able to boot into openSUSE without having to
edit the boot line and type “nomodeset” every time.

G.O.

You need the Bumblebee drivers to use Optimus properly.

1. Do NOT install the normal NVIDIA driver; it will totally mess things up. If you have already, uninstall any NVIDIA packages before you do anything else.

2. Follow the instructions at this site exactly. Do not deviate, and don’t use anything else you might read on the Internet:

https://en.opensuse.org/SDB:NVIDIA_Bumblebee
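
If you are not sure which NVIDIA packages are already installed, one
quick way to list them (just a plain zypper search, adjust as needed):

# zypper se -i nvidia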

On 02/12/2015 04:36 PM, gogalthorp wrote:
>
> You need the Bumblebee drivers to use Optimus properly.
>
> 1. Do NOT install the normal NVIDIA driver; it will totally mess things
> up. If you have already, uninstall any NVIDIA packages before you do anything else.
>
> 2. Follow the instructions at this site exactly. Do not deviate, and
> don’t use anything else you might read on the Internet:
>
> https://en.opensuse.org/SDB:NVIDIA_Bumblebee
>
>
Excellent, that worked perfectly! OK, I was able to boot, and I have
also set up the NVIDIA driver according to the “optional” heading on
that page.

So now how do I control when I want to activate the NVIDIA driver for
some graphics-intensive application? I did not see any user interface,
and there is no “man” entry for bumblebee.


G.O.
Box #1: 13.1 | KDE 4.12 | AMD Phenom IIX4 | 64 | 16GB
Box #2: 13.1 | KDE 4.12 | AMD Athlon X3 | 64 | 4GB
Laptop #1: 13.1 | KDE 4.12 | Core i7-2620M | 64 | 8GB
Laptop #2: 13.2 | KDE 4.14 | Core i7-4710HQ | 64 | 16GB

It is explained on the Bumblebee page I sent you to.

Hint: use optirun to start the program you want to run under the NVIDIA driver.

“Oh no, does that mean I have to type all the commands??”

No, it means you must edit the menu entries and/or shortcuts of the programs you want to use the NVIDIA driver with, and

put optirun in front of the command.

Re-read the Bumblebee page and understand it; it shows examples.
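
To make it concrete (my own illustration, the program name is only a
placeholder): a menu entry or .desktop shortcut whose command (the
Exec= line) was

glxspheres

would be changed to

optirun glxspheres

and from a terminal you launch it the same way, with optirun in front.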

On 02/12/2015 09:46 PM, gogalthorp wrote:
>
> It is explained on the Bumblebee page I sent you to.
>
> Hint: use optirun to start the program you want to run under the NVIDIA driver.
>
> “Oh no, does that mean I have to type all the commands??”
>
> No, it means you must edit the menu entries and/or shortcuts of the
> programs you want to use the NVIDIA driver with, and
>
> put optirun in front of the command.
>
> Re-read the Bumblebee page and understand it; it shows examples.
>
>

Excellent, OK, I get it now.

I tried running glxspheres with optirun, with primusrun, and without
either. When I type “optirun --status” on the command line, it tells me
whether something is using bumblebeed.

Running glxspheres looked the same in all three cases, except that
running it with optirun makes it run faster. In all three cases it looks
like a sphere is shot toward the center while the rings around the
center are circling. When running it with optirun or primusrun, the
status indicates that bumblebeed is being used.

Is that what I am supposed to see?

Thanks again,


G.O.
Box #1: 13.1 | KDE 4.12 | AMD Phenom IIX4 | 64 | 16GB
Box #2: 13.1 | KDE 4.12 | AMD Athlon X3 | 64 | 4GB
Laptop #1: 13.1 | KDE 4.12 | Core i7-2620M | 64 | 8GB
Laptop #2: 13.2 | KDE 4.14 | Core i7-4710HQ | 64 | 16GB

What you see is unimportant; the numbers are. In essence, it shows you
are doing 3D acceleration with NVIDIA, not Intel.

There is no perceptible difference in the rendering. For games you will
get higher frame rates.
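
If you want to compare the numbers yourself (my own suggestion; the
exact output format of glxspheres can vary by version, and glxinfo must
be installed), run it both ways and compare the frames/sec lines it
prints, then check which GPU is actually rendering:

glxspheres
optirun glxspheres
optirun glxinfo | grep -i "opengl renderer"

With optirun the renderer string should name the GeForce card; without
it, the Intel chip.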

On 02/13/2015 10:46 AM, gogalthorp wrote:
>
> What you see is unimportant; the numbers are. In essence, it shows you
> are doing 3D acceleration with NVIDIA, not Intel.
>
> There is no perceptible difference in the rendering. For games you will
> get higher frame rates.
>
>
Excellent, thanks for the help. Glad I have it working now!


G.O.
Box #1: 13.1 | KDE 4.12 | AMD Phenom IIX4 | 64 | 16GB
Box #2: 13.1 | KDE 4.12 | AMD Athlon X3 | 64 | 4GB
Laptop #1: 13.1 | KDE 4.12 | Core i7-2620M | 64 | 8GB
Laptop #2: 13.2 | KDE 4.14 | Core i7-4710HQ | 64 | 16GB