Nvidia Optimus without Bumblebee

hi

Nvidia has started to support Optimus: the Intel card is used for display and the Nvidia card for rendering.


Has anybody tried it?

thanks

I saw that Ubuntu has a package to use the Optimus support from the Nvidia driver:
Using Nvidia Graphics Drivers With Initial Optimus Support In Ubuntu 13.10 Got Easier With Nvidia-Prime ~ Web Upd8: Ubuntu / Linux blog

Is there a way to do it with openSUSE?

yes. I’m effectively doing it (I say “effectively” only because I’m not using nvidia optimus, but that point is unimportant, because functionally what I’m doing and what the nvidia docs describe are exactly the same thing – using two adapters to provide a fully functional and contiguous desktop under X).

There is nothing particular about that Ubuntu package, other than that it removes having to set things up manually. But there is nothing particularly difficult about setting it up manually either.

To be clear, the nvidia driver currently only supports

  1. setting up the discrete nvidia adapter as the primary adapter (the one that renders everything). The nvidia gpu may or may not have any outputs connected to it, but in any regard it is the rendering source. The integrated adapter just functions as a dumb output sink: it outputs to a display device (the laptop’s own panel, an external monitor, …) whatever it’s handed from the nvidia adapter, which is doing all the heavy lifting. A minimal xorg.conf sketch for this case follows right after this list. … or
  2. using the integrated adapter as the primary adapter, which renders everything and outputs everything (but only to those outputs connected to the integrated gpu). The nvidia adapter can, however, be utilized for OpenCL/CUDA type applications … just not for anything display-related … and you will not be able to see any output on any connector that is attached to the nvidia gpu.
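
For case 1, the core of it is an xorg.conf that declares both devices and makes the nvidia one the primary screen. A minimal sketch along the lines of what the nvidia README’s RandR 1.4 chapter describes (the identifiers are arbitrary and the BusID is a placeholder; take the real one from your own lspci output):

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    # placeholder BusID; lspci "01:00.0" becomes "PCI:1:0:0"
    BusID "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
EndSection

Section "Device"
    Identifier "intel"
    Driver "intel"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
EndSection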

Do note that in either case, both adapters are chugging away (in which case, cue: Feelin Hot Hot Hot!! - YouTube)

I don’t really like the way the nvidia ch.32 documentation describes the setup, but, nonetheless, it’s still pretty straightforward.

The Ubuntu package alters the DM (lightDM, as that’s what Ubuntu uses) simply by making it use a startup script to run the xrandr commands. This is precisely what I do too (I just happen to use lightDM as well), but you can accomplish the same thing with any other DM; a sketch of such a script follows.
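
As an illustration only, not a drop-in config: with lightDM the hook is its display-setup-script option, and the script just runs the two xrandr calls. The provider names below are assumptions; take the real ones from xrandr --listproviders (the proprietary nvidia driver normally registers itself as NVIDIA-0).

# /etc/lightdm/lightdm.conf (relevant part only)
[SeatDefaults]
display-setup-script=/etc/lightdm/display-setup.sh

# /etc/lightdm/display-setup.sh
#!/bin/sh
# make the Intel provider an output sink fed by the nvidia source;
# provider names come from "xrandr --listproviders"
xrandr --setprovideroutputsource Intel NVIDIA-0
xrandr --auto

The same two xrandr lines can go into whatever equivalent startup hook your own DM provides, or into ~/.xinitrc for a plain startx session.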

For the general case scenario using OSS drivers, I’ve described the steps most recently here: xrandr 1.4 multi-gpu works! (just ignore the stuff I blab on about glamor)
Also, another related discussion that would be of interest in that case follows here: http://forums.opensuse.org/english/get-technical-help-here/hardware/491400-dell-inspiron-nvidia-2-video-controllers-2.html#post2593042

I don’t know if there is an easy way to tell whether my laptop is muxed or muxless.

In the UEFI, I can choose between the integrated card (Intel) or Optimus.

Chapter 32 (ftp://download.nvidia.com/XFree86/Linux-x86/295.20/README/depth30.html) said nothing.

In another thread, you said:

It does not support render offload and cannot be used as an output sink.

What does that mean?

If I have an Intel card with Optimus and it is muxless, can’t I use the Nvidia card (proprietary driver) to render for output on the Intel card?

Sounds like it’s a muxed system … Lenovo laptops, for example, often feature such a setup.

Chapter 32 (ftp://download.nvidia.com/XFree86/Linux-x86/295.20/README/depth30.html) said nothing.
I’m not sure why you’ve linked to ch.32.

In another thread, you said … What does that mean?
That refers to some xrandr provider object capabilities that the nvidia driver doesn’t support:

  • It does not support render offload: render offload is what is known as PRIME … what it would entail is using the integrated gpu as the primary adapter, but getting the much more powerful nvidia adapter to render frames for 3D apps (say, a video game) and forwarding them back to the iGPU for output to the display device(s) attached to the iGPU
  • cannot be used as an output sink: this would be for a contiguous desktop across the two adapters … what it would entail is using the integrated gpu as the primary adapter and renderer (including the parts of the desktop that constitute those displays attached to the nvidia adapter) … the appropriate rendered portion of the desktop is sent to the nvidia adapter to output to the display devices attached to the nvidia gpu … the nvidia adapter, in this case, functions just as a dumb output sink (the OSS drivers do support both of these; see the sketch after this list)
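
To make those two concrete, here is a rough sketch of how both are driven with the OSS stack (the Intel and nouveau provider names are the ones xrandr --listproviders reports on this laptop, quoted further down; the proprietary nvidia driver supports neither of these, only the reverse of the second one, i.e. acting as the output source):

# PRIME render offload: intel stays the primary, nouveau renders 3D apps on demand
xrandr --setprovideroffloadsink nouveau Intel
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# output sink: intel renders everything, nouveau just drives the connectors wired to it
xrandr --setprovideroutputsource nouveau Intel
xrandr --auto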

If I have an Intel card with Optimus and it is muxless, can’t I use the Nvidia card (proprietary driver) to render for output on the Intel card?
No … regardless of whether you have a mux or are muxless, the only xrandr provider object feature the nvidia driver currently supports is exactly that … see pt. 1 in the post above … but it does not feature the ability to do PRIME … there is a significant distinction between the two.

Previously you said:

I don’t really like the way the nvidia ch.32 documentation describes the setup, but, nonetheless, it’s still pretty straightforward.

Yeah, well that guy doesn’t know what he’s talking about lol!

It must have been a typo – see ch.33: ftp://download.nvidia.com/XFree86/Linux-x86_64/331.17/README/randr14.html

xrandr --listproviders

Providers: number : 2
Provider 0: id: 0x92 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 4 outputs: 7 associated providers: 0 name:Intel
Provider 1: id: 0x62 cap: 0x7, Source Output, Sink Output, Source Offload crtcs: 2 outputs: 1 associated providers: 0 name:nouveau

I don’t know if this info makes it possible to say whether I have a muxless system or not?

Surely it’s muxless, otherwise in the BIOS I could see: Intel / Nvidia / Optimus. I only see Intel / Optimus.

If I want to select which card an application runs on, do I currently have no choice but to use Bumblebee?

For video cards I have:

00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation Device 0de8 (rev a1)

I have an HDMI port, so I don’t know if it is connected to the Intel or the Nvidia card.

I don’t really know the possibilities…

  1. Intel on the laptop screen, with Nvidia rendering it
  2. Intel on the laptop screen, Nvidia on the HDMI

My big problem is that I don’t really know how to install the Nvidia proprietary driver… it can cause problems with OpenGL on the Intel one, and I don’t know if it’s done the same way for cases 1 and 2…

No. Offhand I don’t know if there is any way to check for a mux other than checking through the BIOS/UEFI.

Surely it’s muxless, otherwise in the BIOS I could see: Intel / Nvidia / Optimus. I only see Intel / Optimus.
yes, you’re right…not sure why I thought it was muxed earlier.:\

For video cards I have:

00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation Device 0de8 (rev a1)

I have an HDMI port, so I don’t know if it is connected to the Intel or the Nvidia card.

I don’t really know the possibilities…
okay, to determine which card/adapter is which, match the pciid from the lspci output to the output of the following:

ls -la /sys/class/drm/card?

Then, armed with that knowledge, you can determine what outputs are attached to which card/adapter via:

 ls /sys/class/drm/*/status | xargs -I {} bash -c "echo -n {}: ; cat {}"
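
If it helps, lspci can also print the full domain-prefixed addresses (standard -D and -nn options), which line up directly with the tail end of those /sys/class/drm/card? symlink targets:

lspci -D -nn | grep -i vga
# the domain-prefixed address (e.g. 0000:00:02.0) is the same string that
# appears at the end of the card's /sys symlink target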

My big problem is that I don’t really know how to install the Nvidia proprietary driver… it can cause problems with OpenGL on the Intel one, and I don’t know if it’s done the same way for cases 1 and 2…
No

  • the OSS GL libs won’t be used – only the nvidia-supplied GL stuff for case 1 – if you follow the directions of that ch.33.

  • In case 2, the OSS GL stuff is being used because the Intel is rendering … the nvidia adapter is not being used for graphics – just for offscreen OpenCL/CUDA stuff (a quick way to check which GL stack ends up active is sketched after this list)

  • However, what is it exactly that you’re trying to do?

  • Also, I went back and looked at the link you had provided to that Ubuntu package. This may have been a source of confusion. They’ve called it “nvidia-prime” … unfortunately, that’s not “prime” at all – prime is what I described above.
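
And to verify which GL stack you actually ended up with, a quick sanity check (assuming the glxinfo utility is installed; this is just a check, not part of the setup):

# case 1 should report an NVIDIA vendor/renderer, case 2 an Intel/Mesa one
glxinfo | grep -E "OpenGL vendor|OpenGL renderer"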

I have an HDMI port and a Thunderbolt port… it’s the same connector as a Mini DisplayPort.

First command

ls -la /sys/class/drm/card?
lrwxrwxrwx 1 root root 0 30 oct 07:31 /sys/class/drm/card0 -> ../../devices/pci0000:00/0000:00:02.0/drm/card0
lrwxrwxrwx 1 root root 0 30 oct 07:31 /sys/class/drm/card1 -> ../../devices/pci0000:00/0000:00:01.0/0000:01:00.0/drm/card1


Second command

ls /sys/class/drm/*/status | xargs -I {} bash -c "echo -n {}: ; cat {}"
/sys/class/drm/card0-DP-1/status:disconnected
/sys/class/drm/card0-DP-2/status:disconnected
/sys/class/drm/card0-HDMI-A-1/status:disconnected
/sys/class/drm/card0-HDMI-A-2/status:disconnected
/sys/class/drm/card0-LVDS-1/status:connected
/sys/class/drm/card0-VGA-1/status:disconnected
/sys/class/drm/card1-VGA-2/status:connected

All the outputs seem to belong to the Intel card, except VGA-2.
I think the main one is LVDS-1; I don’t know if I could run only the Nvidia card…

It depends on what my system makes possible.

A list of possibilities:

I think that without Bumblebee I can’t render a specific app on the Nvidia card… it would be nice to do that without Bumblebee.

All my ports use Intel, so maybe it would be possible to connect one monitor to the HDMI and another one to the DisplayPort… maybe there is a way to render on the Nvidia card depending on which output we want,
like rendering only for the HDMI port and not the DisplayPort…

Run only the Nvidia card.

Intel card, with rendering on the Nvidia… but that is useless if I can run only the Nvidia card.

xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x92 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 4 outputs: 7 associated providers: 0 name:Intel
Provider 1: id: 0x62 cap: 0x7, Source Output, Sink Output, Source Offload crtcs: 2 outputs: 1 associated providers: 0 name:nouveau

I have a monitor connected to my HDMI port plus the laptop screen, and that works fine.
I get:

/sys/class/drm/card0-VGA-1/status:connected
/sys/class/drm/card1-DP-1/status:disconnected
/sys/class/drm/card1-DP-2/status:disconnected
/sys/class/drm/card1-HDMI-A-1/status:connected
/sys/class/drm/card1-HDMI-A-2/status:disconnected
/sys/class/drm/card1-LVDS-1/status:connected
/sys/class/drm/card1-VGA-2/status:disconnected

I connected a DisplayPort-to-DVI cable to my DisplayPort and get:

/sys/class/drm/card0-VGA-1/status:connected
/sys/class/drm/card1-DP-1/status:disconnected
/sys/class/drm/card1-DP-2/status:disconnected
/sys/class/drm/card1-HDMI-A-1/status:connected
/sys/class/drm/card1-HDMI-A-2/status:connected
/sys/class/drm/card1-LVDS-1/status:connected
/sys/class/drm/card1-VGA-2/status:disconnected

So it seems the DP-to-DVI cable shows up as HDMI-A-2.

The problem is that the screen on this port stays black.

The Xorg log gives me these lines when I try to enable the third monitor:

[  4405.839] (II) intel(0): resizing framebuffer to 4544x1080
[  4405.839] (II) intel(0): switch to mode 1600x900@60.0 on pipe 0 using LVDS1, position (1920, 0), rotation normal
[  4405.858] (II) intel(0): switch to mode 1920x1080@60.0 on pipe 1 using HDMI1, position (0, 0), rotation normal
[  4406.001] (II) intel(0): switch to mode 1024x768@60.0 on pipe 2 using HDMI2, position (3520, 0), rotation normal
[  4406.001] (EE) intel(0): failed to set mode: Invalid argument

When I switched all the monitors to 800x600, that worked fine…

All the ports seem to be connected to the Intel card… so maybe the resolution is a limit of the card?
I don’t know if there is a solution to get the normal resolution on every monitor, or if it is something else?
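
Maybe I can at least compare the combined width (1920 + 1600 + 1024 = 4544) against the maximum framebuffer size the driver reports; just a guess that this is related:

# the first line of plain xrandr output shows the minimum/current/maximum screen size
xrandr | grep maximum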