xorg.conf: dual gpu laptop (intel+nvidia)

I’ve got an old laptop, an Asus N55S (manufactured 2011) with an Intel i7-2670QM (a fairly high-end machine when new), which has a discrete nVidia graphics adapter in addition to the CPU’s integrated Intel HD 3000 graphics. I can’t get the nVidia adapter to work; it fails with “NVIDIA: Failed to initialize the NVIDIA kernel module”.

At the moment, I’m using the Intel GPU, with the internal monitor only. The machine also has an HDMI port, but I am unable to get anything on this. I would like to be able to use this machine with an external monitor on HDMI (dual-head, alongside the internal LCD).

The machine has a VGA port also, but I don’t have a VGA monitor available right now, so I have no idea whether that works (and don’t particularly care).

My guess is that only the discrete nVidia GPU is connected to the HDMI port, and not the Intel GPU. I don’t know whether it is possible to drive both the internal LCD and the HDMI port from the nVidia (I would expect this to be possible), or whether the internal monitor can only be driven by the Intel. This doesn’t really matter to me, though, as long as I can get a usable dual-head setup.

# lspci    # selected lines
00:01.0 PCI bridge: Intel Corporation Xeon E3-1200/2nd Generation Core Processor Family PCI Express Root Port (rev 09)
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation GF116M [GeForce GT 555M/635M] (rev a1)

# xrandr
Screen 0: minimum 8 x 8, current 1366 x 768, maximum 32767 x 32767
LVDS1 connected primary 1366x768+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
   1366x768       60.0*+
   1024x768       60.0  
   800x600        60.3     56.2  
   640x480        59.9  
VGA1 disconnected (normal left inverted right x axis y axis)
VIRTUAL1 disconnected (normal left inverted right x axis y axis)

There are no relevant settings in the BIOS, by the way (e.g., “Select primary monitor”, “Enable HDMI” etc. – nothing of the sort.) The laptop has an [fn]+[f8] keyboard switch to enable/disable the external video port, but this switch appears to do nothing; all the other keyboard hotkeys are working (volume/brightness control etc.) so I don’t think it’s a keyboard problem. But perhaps that button only applies to VGA, not HDMI?

As I understand it, there are three sets of drivers for nVidia GPUs: the free “nv” drivers that only provide basic functionality, the free “nouveau” drivers that support some more fancy features, and the proprietary “nvidia” drivers for full functionality (but not “free as in freedom”). At this stage I don’t particularly care which of these I use, and my most recent attempts have involved the proprietary drivers.

I’ve got the following packages installed from the nvidia repository: x11-video-nvidiaG04, nvidia-glG04, nvidia-computeG04.
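As a quick sanity check on which GLX module the server actually loaded (a hedged sketch: `glx_vendor` is just a helper name I’m using here, and the log path is the standard /var/log/Xorg.0.log):

```shell
# Report which vendor's GLX module the X server loaded, per Xorg.0.log.
# glx_vendor() takes the log file as $1 so it can be pointed elsewhere.
glx_vendor() {
    grep 'Module glx: vendor' "$1" 2>/dev/null \
        || echo "no glx vendor line found in $1"
}
glx_vendor /var/log/Xorg.0.log
# The nvidia stack should report vendor="NVIDIA Corporation";
# vendor="X.Org Foundation" means the stock Xorg GLX module is in use.
```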

I’ll dump my full xorg.conf and Xorg.0.log below. (edit: I truncated some parts of Xorg.0.log.)

Any thoughts on how to get this working, either with these proprietary drivers or with nv/nouveau?

Cheers,
K.

# cat /etc/X11/xorg.conf
Section "ServerLayout"
    Identifier     "20151230 hadg"
    Screen      0  "ScrIntel" 0 0
    Screen      1  "ScrNV" 0 0
    InputDevice    "Mouse0" "CorePointer"
    InputDevice    "Keyboard0" "CoreKeyboard"
EndSection

Section "Files"
    ModulePath      "/usr/lib64/xorg/modules/updates"
    ModulePath      "/usr/lib64/xorg/modules"
    FontPath        "/usr/share/fonts/misc:unscaled"
    FontPath        "/usr/share/fonts/Type1/"
    FontPath        "/usr/share/fonts/100dpi:unscaled"
    FontPath        "/usr/share/fonts/75dpi:unscaled"
    FontPath        "/usr/share/fonts/ghostscript/"
    FontPath        "/usr/share/fonts/cyrillic:unscaled"
    FontPath        "/usr/share/fonts/misc/sgi:unscaled"
    FontPath        "/usr/share/fonts/truetype/"
    FontPath        "built-ins"
EndSection

Section "Module"
    Load           "glx"

# Note:  you probably want this to be the nVidia GLX module, look for
#        /usr/lib64/xorg/modules/extensions/libglx.so which should be
#        symlinked to nvidia/nvidia-libglx.so rather than the default
#        Xorg GLX module.
#        
#        In /var/log/Xorg.0.log look for:
#        (II) Module glx: vendor="NVIDIA Corporation"
#        rather than
#        (II) Module glx: vendor="X.Org Foundation"

EndSection

Section "InputDevice"
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "InputDevice"
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/input/mice"
    Option         "ZAxisMapping" "4 5 6 7"
EndSection

Section "Monitor"
    Identifier     "MonIntel"
EndSection

Section "Monitor"
    Identifier     "MonNV"
EndSection

Section "Device"
    Identifier     "CardIntel"
    Driver         "intel"
    BusID          "PCI:0:2:0"
EndSection

Section "Device"
    Identifier     "CardNV"
    Driver         "nvidia"
#   Driver         "nv"
#   Driver         "nouveau"
    BusID          "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier     "ScrIntel"
    Device         "CardIntel"
    Monitor        "MonIntel"
    DefaultDepth    24
EndSection

Section "Screen"
    Identifier     "ScrNV"
    Device         "CardNV"
    Monitor        "MonNV"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
        Modes      "nvidia-auto-select"
    EndSubSection
EndSection
# cat /var/log/Xorg.0.log
    71.822]
X.Org X Server 1.17.2
Release Date: 2015-06-16
    71.822] X Protocol Version 11, Revision 0
    71.822] Build Operating System: openSUSE SUSE LINUX
    71.822] Current Operating System: Linux asus-i7.banchory 4.1.31-30-default #1 SMP PREEMPT Wed Aug 24 06:20:09 UTC 2016 (de9ddf8) x86_64
    71.822] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-4.1.31-30-default root=UUID=fd238add-8cd6-4795-a9ec-f0d54b07c9b4 resume=/dev/sda5 quiet showopts
    71.822] Build Date: 07 March 2016  08:22:28AM
    71.822]  
    71.822] Current version of pixman: 0.32.6
    71.822]    Before reporting problems, check http://wiki.x.org
        to make sure that you have the latest version.
    71.822] Markers: (--) probed, (**) from config file, (==) default setting,
        (++) from command line, (!!) notice, (II) informational,
        (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    71.823] (==) Log file: "/var/log/Xorg.0.log", Time: Sat Sep 24 19:29:01 2016
    71.823] (==) Using config file: "/etc/X11/xorg.conf"
    71.823] (==) Using config directory: "/etc/X11/xorg.conf.d"
    71.823] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
    71.823] (==) ServerLayout "20151230 hadg"
    71.823] (**) |-->Screen "ScrIntel" (0)
    71.823] (**) |   |-->Monitor "MonIntel"
    71.823] (**) |   |-->Device "CardIntel"
    71.823] (**) |-->Screen "ScrNV" (1)
    71.823] (**) |   |-->Monitor "MonNV"
    71.823] (**) |   |-->Device "CardNV"
    71.823] (**) |-->Input Device "Mouse0"
    71.823] (**) |-->Input Device "Keyboard0"

  ( ... )

    71.824] (WW) Hotplugging is on, devices using drivers 'kbd', 'mouse' or 'vmmouse' will be disabled.
    71.824] (WW) Disabling Mouse0
    71.824] (WW) Disabling Keyboard0

  ( ... )

    71.826] (--) PCI:*(0:0:2:0) 8086:0116:1043:2050 rev 9, Mem @ 0xdc400000/4194304, 0xb0000000/268435456, I/O @ 0x0000e000/64
    71.826] (--) PCI: (0:1:0:0) 10de:1247:1043:2050 rev 161, Mem @ 0xda000000/33554432, 0xc0000000/268435456, 0xd0000000/67108864, I/O @ 0x0000d000/128, BIOS @ 0x????????/524288
    71.826] (II) "glx" will be loaded. This was enabled by default and also specified in the config file.
    71.826] (II) LoadModule: "glx"
    71.856] (II) Loading /usr/lib64/xorg/modules/extensions/libglx.so
    71.857] (II) Module glx: vendor="X.Org Foundation"
    71.857]    compiled for 1.17.2, module version = 1.0.0
    71.857]    ABI class: X.Org Server Extension, version 9.0
    71.857] (==) AIGLX enabled
    71.857] (II) LoadModule: "intel"
    71.857] (II) Loading /usr/lib64/xorg/modules/drivers/intel_drv.so
    71.858] (II) Module intel: vendor="X.Org Foundation"
    71.858]    compiled for 1.17.2, module version = 2.99.917
    71.858]    Module class: X.Org Video Driver
    71.858]    ABI class: X.Org Video Driver, version 19.0
    71.858] (II) LoadModule: "nvidia"
    71.858] (II) Loading /usr/lib64/xorg/modules/drivers/nvidia_drv.so
    71.858] (II) Module nvidia: vendor="NVIDIA Corporation"
    71.858]    compiled for 4.0.2, module version = 1.0.0
    71.858]    Module class: X.Org Video Driver
    71.858] (II) intel: Driver for Intel(R) Integrated Graphics Chipsets:
        i810, i810-dc100, i810e, i815, i830M, 845G, 854, 852GM/855GM, 865G,
        915G, E7221 (i915), 915GM, 945G, 945GM, 945GME, Pineview GM,
        Pineview G, 965G, G35, 965Q, 946GZ, 965GM, 965GME/GLE, G33, Q35, Q33,
        GM45, 4 Series, G45/G43, Q45/Q43, G41, B43
    71.858] (II) intel: Driver for Intel(R) HD Graphics: 2000-6000
    71.858] (II) intel: Driver for Intel(R) Iris(TM) Graphics: 5100, 6100
    71.858] (II) intel: Driver for Intel(R) Iris(TM) Pro Graphics: 5200, 6200, P6300
    71.858] (II) NVIDIA dlloader X Driver  367.44  Wed Aug 17 21:28:13 PDT 2016
    71.858] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    71.858] (II) intel(0): Using Kernel Mode Setting driver: i915, version 1.6.0 20150327
    71.859] (II) Loading sub module "fb"
    71.859] (II) LoadModule: "fb"
    71.859] (II) Loading /usr/lib64/xorg/modules/libfb.so
    71.859] (II) Module fb: vendor="X.Org Foundation"
    71.859]    compiled for 1.17.2, module version = 1.0.0
    71.859]    ABI class: X.Org ANSI C Emulation, version 0.4
    71.859] (II) Loading sub module "wfb"
    71.859] (II) LoadModule: "wfb"
    71.867] (II) Loading /usr/lib64/xorg/modules/libwfb.so
    71.867] (II) Module wfb: vendor="X.Org Foundation"
    71.867]    compiled for 1.17.2, module version = 1.0.0
    71.867]    ABI class: X.Org ANSI C Emulation, version 0.4
    71.867] (II) Loading sub module "ramdac"
    71.867] (II) LoadModule: "ramdac"
    71.867] (II) Module "ramdac" already built-in
    71.911] (EE) NVIDIA: Failed to initialize the NVIDIA kernel module. Please see the
    71.911] (EE) NVIDIA:     system's kernel log for additional error messages and
    71.911] (EE) NVIDIA:     consult the NVIDIA README for details.
    71.911] (--) intel(0): Integrated Graphics Chipset: Intel(R) HD Graphics 3000
    71.911] (--) intel(0): CPU: x86-64, sse2, sse3, ssse3, sse4.1, sse4.2, avx
    71.911] (II) intel(0): Creating default Display subsection in Screen section
        "ScrIntel" for depth/fbbpp 24/32
    71.912] (**) intel(0): Depth 24, (--) framebuffer bpp 32
    71.912] (==) intel(0): RGB weight 888
    71.912] (==) intel(0): Default visual is TrueColor
    71.912] (II) intel(0): Output LVDS1 using monitor section MonIntel
    71.912] (--) intel(0): Found backlight control interface acpi_video0 (type 'firmware') for output LVDS1
    71.912] (II) intel(0): Enabled output LVDS1
    71.912] (II) intel(0): Output VGA1 has no monitor section
    71.912] (II) intel(0): Enabled output VGA1
    71.912] (--) intel(0): Using a maximum size of 256x256 for hardware cursors
    71.912] (II) intel(0): Output VIRTUAL1 has no monitor section
    71.912] (II) intel(0): Enabled output VIRTUAL1
    71.912] (--) intel(0): Output LVDS1 using initial mode 1366x768 on pipe 0
    71.912] (==) intel(0): TearFree disabled
    71.912] (==) intel(0): DPI set to (96, 96)
    71.912] (II) Loading sub module "dri2"
    71.912] (II) LoadModule: "dri2"
    71.912] (II) Module "dri2" already built-in
    71.912] (II) Loading sub module "present"
    71.912] (II) LoadModule: "present"
    71.912] (II) Module "present" already built-in
    71.912] (==) Depth 24 pixmap format is 32 bpp
    71.912] (II) intel(0): SNA initialized with Sandybridge (gen6, gt2) backend
    71.912] (==) intel(0): Backing store enabled
    71.912] (==) intel(0): Silken mouse enabled
    71.912] (II) intel(0): HW Cursor enabled
    71.912] (II) intel(0): RandR 1.2 enabled, ignore the following RandR disabled message.
    71.912] (==) intel(0): DPMS enabled
    71.913] (==) intel(0): display hotplug detection enabled
    71.913] (II) intel(0): [DRI2] Setup complete
    71.913] (II) intel(0): [DRI2]   DRI driver: i965
    71.913] (II) intel(0): [DRI2]   VDPAU driver: i965
    71.913] (II) intel(0): direct rendering: DRI2 enabled
    71.913] (II) intel(0): hardware support for Present enabled
    71.913] (--) RandR disabled

  ( ... )

    90.575] (II) intel(0): EDID vendor "CMO", prod id 5543
    90.584] (II) intel(0): Printing DDC gathered Modelines:
    90.584] (II) intel(0): Modeline "1366x768"x0.0   69.30  1366 1382 1416 1466  768 770 776 788 -hsync -vsync (47.3 kHz eP)

Can you turn off the Intel GPU in the BIOS?

In any case, this is probably an Optimus-based machine, so you need Bumblebee.

First off, remove the NVIDIA driver you installed; you need the special nvidia-bumblebee driver, not the normal one.

Instructions for Bumblebee are here; follow them exactly:

https://en.opensuse.org/SDB:NVIDIA_Bumblebee

You must remove the regular driver before proceeding with the Bumblebee install.

Thanks gogalthorp for the help & the fast response!

Nope. Can’t find any graphics settings in the BIOS whatsoever.

Now we’re getting somewhere. I removed the regular nvidia drivers, followed the instructions on that page (including the sections “optional” and “Problems with GT600M/GT700M series cards”), and I now seem to have Bumblebee working: I can boot the laptop normally (KDE starts on Intel graphics), and from the command line I can call:

~> **optirun glxspheres**
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: GeForce GT 555M/PCIe/SSE2
174.091947 frames/sec - 148.737196 Mpixels/sec
~> **primusrun glxspheres**
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
Visual ID of window: 0xa5
Context is Direct
OpenGL Renderer: GeForce GT 555M/PCIe/SSE2
61.939697 frames/sec - 52.918799 Mpixels/sec

Both of these work: I see the glxspheres demo in a window on my laptop’s built-in LCD panel.

However, I still don’t get anything on the external HDMI monitor. How can I do that?

It seems that the main scenario for which Bumblebee was intended is to get high-performance graphics for a single application (e.g., gaming on Steam) within the laptop’s own screen. What I would want, ideally, is to get my whole KDE desktop to use both screens.

https://forums.opensuse.org/showthread.php/512260-Leap-42-1-Optimus-system-with-nvidia-prime-instead-of-bumblebee

Note that using the NVIDIA GPU 100% of the time will reduce battery life, since the NVIDIA chip draws more power.

That’s the essence of the “Optimus” architecture, use the discrete GPU only when needed.
Unless you have unusual HW, all your video outputs are wired to the Intel chip, so extending the desktop over to the HDMI monitor has nothing to do with NVIDIA.
Here on Gnome the default is having the two monitors side-by-side, other combinations being cycled by pressing a key combination (fn+F8 on my laptop).
On KDE look for “Display and Monitor” in System Settings to configure as you like.
If you’d like to permanently use a few applications with NVIDIA graphics, you can modify their respective /usr/share/applications/xxx.desktop launchers by prepending “primusrun” or “optirun” to the “Exec=xxx %u” line, like “Exec=primusrun xxx %u”.
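For instance (a sketch only: “someapp” is a placeholder, and it’s safer to copy the launcher into ~/.local/share/applications/ and edit the copy rather than the system file):

```shell
# Demonstrate the Exec= edit on a throwaway launcher file.
tmp=$(mktemp)
printf '[Desktop Entry]\nName=SomeApp\nExec=someapp %%u\n' > "$tmp"
# prepend primusrun to the Exec line, as described above
sed -i 's/^Exec=/Exec=primusrun /' "$tmp"
grep '^Exec=' "$tmp"
# → Exec=primusrun someapp %u
```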

Thanks to both for your help.

Not an issue as it’s basically in use as a desktop machine, always plugged in. I can hear the fan spinning up when the nVidia is active, though; if I put my coffee on the left hand side of the laptop, the exhaust fan keeps it warm. :slight_smile:

This surprises me; while running on Intel only, previously, I was never able to get anything out of the HDMI port. In the KDE display settings, only the laptop screen shows up. (I have configured dual-head setups before, in the way you describe: the settings screen lets you choose the relative position of the monitors etc., but on this machine the external monitor simply does not appear in the settings screen at all.)

I’ve been reading the Bumblebee wiki as well; it seems I may not need the Bumblebee drivers at all, but can use bbswitch to turn the nVidia GPU on and off. I have run out of time for this today, so maybe next weekend.

Thanks again for your help; I will post back if I get anywhere.

Cheers!
K.

Intel HDMI can be tricky, but it can work; you just need to prod it a bit. :open_mouth:

Quoting from the ASUS N55SL manual:

LCD/Monitor Icons (F8): Toggles between the Notebook
PC’s LCD display and an external monitor in this series:
LCD Only -> CRT Only (External Monitor) -> LCD + CRT
Clone -> LCD + CRT Extend. (This function does not
work in 256 Colors, select High Color in Display Property
Settings.) NOTE: Must connect an external monitor
“before” booting up.

Good luck for next weekend :wink:

Ok, some exciting progress today: I have now got the external monitor working on HDMI, using the nVidia GPU with the nouveau driver.

What I still don’t have is dual-head: when the laptop boots, it shows GRUB in the internal monitor, then X starts in the HDMI monitor and the internal one freezes. When I key [ctrl]+[alt]+[f2] for tty2, the HDMI monitor freezes and I get my console in the internal monitor. This is very usable, though, and a major gain in desktop real-estate (1366x768 → 1920x1200).

I’ll retrace my steps now and describe what I did, in case anyone else finds this thread while fighting with an Optimus device:

I’m not actually using bumblebee now, but I am using bbswitch, which came with Bumblebee. The catch with Optimus is that the nVidia GPU is switched off by default; it then switches on when you call a specific application that needs the extra graphics power (i.e., to play Warcr— eh, I mean, when you need to work on that 3D CAD project!), and then switches off again when you’re done. This is clever because it saves power.

In my case, I just wanted the nVidia GPU to be switched on all the time and run all of KDE. The reason why this failed previously (before gogalthorp pointed me to Optimus and Bumblebee) was simply because the nVidia GPU was switched off. I needed to switch it on.

So: install bbswitch (from the Bumblebee repository). This is a kernel module that takes care of switching the nVidia GPU on and off. The packages you need are bbswitch, bbswitch-kmp-default, and dkms; don’t bother with bumblebee or any of the nvidia-* packages. Read the bbswitch documentation for more; it will tell you how to switch the card on and off, and how to check its status.
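To illustrate (a sketch, not gospel: the control-file path /proc/acpi/bbswitch is the one documented by bbswitch, `bb_set` is just a helper name of mine, and writing to the file needs root):

```shell
# Write ON or OFF to bbswitch's control file and report the new state.
# bb_set STATE [CTLFILE] -- CTLFILE defaults to /proc/acpi/bbswitch.
bb_set() {
    state="$1"; ctl="${2:-/proc/acpi/bbswitch}"
    if [ -w "$ctl" ]; then
        echo "$state" > "$ctl"
        cat "$ctl"      # bbswitch reports e.g. "0000:01:00.0 ON"
    else
        echo "cannot write to $ctl (not root, or bbswitch not loaded?)"
    fi
}
bb_set ON    # power the discrete nVidia card on
```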

Final check:

# dmesg | grep bbswitch
    2.922066] bbswitch: version 0.8
    2.922078] bbswitch: Found integrated VGA device 0000:00:02.0: \_SB_.PCI0.GFX0
    2.922087] bbswitch: Found discrete VGA device 0000:01:00.0: \_SB_.PCI0.PEG0.GFX0
    2.922544] bbswitch: detected an Optimus _DSM function
    2.922701] bbswitch: Succesfully loaded. Discrete card 0000:01:00.0 is on

Great; now that we’ve got bbswitch installed and have persuaded it to turn the nVidia GPU on, we can move on to drivers and xorg.conf.

I went with nouveau; there are good tutorials out there on installing it. Remember to check /etc/modprobe.d/ for any file with a line that says “blacklist nouveau”; delete that line (or comment it out with a #).

# cat /etc/modprobe.d/* | grep nouveau
#blacklist nouveau

Then load the driver (doing this from a graphical console may crash your X session; no big deal):

# modprobe nouveau

And then it’s just editing /etc/X11/xorg.conf, which may end up looking something like this:

Section "ServerLayout"
    Identifier     "20160927 hadg"
#   Screen      0  "ScrIntel" 0 0
    Screen      0  "ScrNV" 0 0
    InputDevice    "Mouse0" "CorePointer"
    InputDevice    "Keyboard0" "CoreKeyboard"
EndSection

Section "Files"
    ModulePath      "/usr/lib64/xorg/modules/updates"
    ModulePath      "/usr/lib64/xorg/modules"
    FontPath        "/usr/share/fonts/misc:unscaled"
    FontPath        "/usr/share/fonts/Type1/"
    FontPath        "/usr/share/fonts/100dpi:unscaled"
    FontPath        "/usr/share/fonts/75dpi:unscaled"
    FontPath        "/usr/share/fonts/ghostscript/"
    FontPath        "/usr/share/fonts/cyrillic:unscaled"
    FontPath        "/usr/share/fonts/misc/sgi:unscaled"
    FontPath        "/usr/share/fonts/truetype/"
    FontPath        "built-ins"
EndSection

Section "Module"
    Load           "glx"

# Note:  you probably want this to be the nVidia GLX module, look for
#        /usr/lib64/xorg/modules/extensions/libglx.so which should be
#        symlinked to nvidia/nvidia-libglx.so rather than the default
#        Xorg GLX module.
#        
#        In /var/log/Xorg.0.log look for:
#        (II) Module glx: vendor="NVIDIA Corporation"
#        rather than
#        (II) Module glx: vendor="X.Org Foundation"

EndSection

Section "InputDevice"
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "InputDevice"
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/input/mice"
    Option         "ZAxisMapping" "4 5 6 7"
EndSection

#Section "Monitor"
#    Identifier     "MonIntel"
#EndSection

Section "Monitor"
    Identifier     "MonNV"
EndSection

#Section "Device"
#    Identifier     "CardIntel"
#    Driver         "intel"
#    BusID          "PCI:0:2:0"
#EndSection

Section "Device"
    Identifier     "CardNV"
#   Driver         "nvidia"
#   Driver         "nv"
    Driver         "nouveau"
    BusID          "PCI:1:0:0"
EndSection

#Section "Screen"
#    Identifier     "ScrIntel"
#    Device         "CardIntel"
#    Monitor        "MonIntel"
#    DefaultDepth    24
#EndSection

Section "Screen"
    Identifier     "ScrNV"
    Device         "CardNV"
    Monitor        "MonNV"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
#       Modes      "nvidia-auto-select"
    EndSubSection
EndSection

Notice I commented out all the intel stuff, so it’s just the nVidia GPU now, with the nouveau driver selected.

I still don’t have dual-head; I tried getting this on the fly with both KDE’s configuration tools and xrandr on the command line, but no luck, so it looks like I’ll need to add another Monitor section to my xorg.conf. I also don’t have the right resolution on my main monitor yet: it’s running at 1920x1200 while it’s capable of 2560x1440 (QHD). That may be an EDID problem, so I may have to add a modeline to xorg.conf by hand instead. I haven’t looked at this in detail yet.

Anyway, I hope these notes may be helpful to someone else with an Optimus machine out there.

Thanks to everyone for your input, it’s very much appreciated!

Cheers,
K.

No, that’s not correct. It’s really vendor-specific which way the outputs get wired. Many older laptop models favoured having the HDMI or DP outputs attached to the nVidia adapter, the assumption being that the user would want the power savings (via the intel adapter) when using the laptop’s own screen/panel, and more powerful performance (from the nvidia adapter) when attaching the laptop to an external display device.

That would be my guess as well, given the evidence you have provided.

Previously, you had the intel adapter running Screen 0 (the protocol Screen in the X Display Server). You can’t have two adapters associated with the same Screen, and you can see in the xrandr output that no HDMI is listed.

Now, with your xorg user configuration, you have the nvidia (nouveau-driven) adapter running Screen 0. (As an aside, the intel adapter would now be associated with Screen 1, but is inaccessible in X until you configure it.)

When you boot up, GRUB and the initial Linux boot messages show up on the laptop’s panel, because at that point it is being driven by the intel kernel driver (specifically, its embedded framebuffer driver, inteldrmfb). As another aside, the nvidia adapter likely also gets a framebuffer assigned to it, fb1, driven by the nouveau kernel driver (specifically, the embedded nouveaudrmfb), but you will only see console messages directed to fb0, i.e. the one driven by the intel adapter. Once X starts, because of your modifications, the system switches over to the nvidia adapter and you will only get output on the HDMI (and whatever other outputs may be attached to it).

When you switch over to console (which is framebuffer based), the intel adapter has retained fb0, so console output is directed to it.

You should be able to confirm the topology with the output from the following command

cd /sys/class/drm && ls -la
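To make that output easier to read, here is a small sketch (`drm_topology` is my own helper name; it just walks the standard sysfs layout) that prints each card’s PCI address and the connectors attached to it:

```shell
# Print each DRM card's PCI device and its connectors (card0-LVDS-1 etc.),
# which shows at a glance which GPU owns the HDMI port.
# drm_topology() takes the sysfs drm directory as $1.
drm_topology() {
    for c in "$1"/card[0-9]; do
        [ -e "$c" ] || { echo "no DRM cards under $1"; return 0; }
        dev=$(basename "$(readlink -f "$c/device")")  # PCI address, e.g. 0000:01:00.0
        printf '%s (%s):' "$(basename "$c")" "$dev"
        for conn in "$c"-*; do
            [ -e "$conn" ] && printf ' %s' "$(basename "$conn")"
        done
        echo
    done
}
drm_topology /sys/class/drm
```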

It is. Quoting the original question:

or whether the internal monitor can only be driven by the Intel. This doesn’t really matter to me, though, as long as I can get a usable dual-head setup.

Well, the nvidia adapter is unlikely to be able to output directly to the laptop’s panel, but it can be made the renderer, with its output then sent to the intel adapter to display on the panel.

The most effective solution to what you’re seeking (a dual-head setup with a contiguous desktop) is to use xrandr’s --setprovideroutputsource.

But first, check that the HDMI port is indeed wired to the nvidia adapter.
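For what it’s worth, here is a sketch of where the provider numbers for that command come from. The listing below is illustrative sample output, not taken from this machine; on a live system you would run `xrandr --listproviders` inside a running X session:

```shell
# Pick the provider numbers for --setprovideroutputsource out of sample
# `xrandr --listproviders` output (sink provider first, render source second).
listing='Providers: number : 2
Provider 0: id: 0x7d cap: 0xb name:Intel
Provider 1: id: 0x56 cap: 0x7 name:nouveau'
intel=$(echo "$listing" | awk '/name:Intel/  {print $2}' | tr -d ':')
nv=$(echo "$listing"    | awk '/name:nouveau/{print $2}' | tr -d ':')
# With the intel driver as primary, the nouveau provider's outputs (HDMI)
# become a sink for the Intel-rendered desktop:
echo "xrandr --setprovideroutputsource $nv $intel"
# → xrandr --setprovideroutputsource 1 0
# then `xrandr --auto` to enable the newly available outputs
```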

I solved it this way some time ago:
https://forums.opensuse.org/showthread.php/497341-Howto-use-Nvidia-native-on-an-external-display-on-an-Optimus-system
I never had to use bbswitch, but this was before the kernel could power off the nvidia card on boot; and if you will only use it in nvidia mode, you don’t need an extra instance. Sound also worked over HDMI.