Driving dual monitors with CPU integrated graphics

I am choosing components for a business, non-gaming desktop PC with dual, identical NEC monitors (extended display, not cloned). The native OS will be the current version of openSUSE Linux, with a virtual machine (VirtualBox) for Windows. Typical applications would be word processing, PDF generation, Internet browsing, and image editing (Photoshop, GIMP); no video rendering.

Until now, I have used discrete NVIDIA-based graphics cards with dual DVI-I outputs, each monitor set to a resolution of 1920 x 1200. These cards have worked well with openSUSE, requiring very little in the way of configuration, especially in recent years.

Nevertheless, given the performance advances in CPUs, I am exploring the integrated graphics offered in AMD Ryzen 5 and Intel Core i3 and i5 series processors as an alternative to a discrete card. In addition to DVI-I inputs, the monitors have one HDMI input and one DisplayPort input (daisy-chaining is not an option), and the motherboards I am considering have HDMI and DisplayPort outputs, apparently a fairly standard configuration.

One concern I have is whether running one monitor from an HDMI output and the other from a DisplayPort output will deliver the same uniform video signal to both monitors (see, e.g., https://forums.opensuse.org/showthread.php/531742-KDE-and-two-displays-with-diff-resolutions?highlight=integrated+graphics). Some claim that a port splitter may be employed, but others run into performance or configuration difficulties with such devices.
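If I go this route, I assume I could at least pin both outputs to an identical mode and refresh rate with xrandr once the machine is running; a sketch (the output names DP-1 and HDMI-1 are guesses on my part, since the actual names depend on the driver):

 xrandr --query | grep -w connected    # confirm the actual output names first
 xrandr --output DP-1 --mode 1920x1200 --rate 60 --primary --output HDMI-1 --mode 1920x1200 --rate 60 --right-of DP-1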

Since I have very little experience with integrated graphics, HDMI, and DisplayPort, the effectiveness of these options appears uncertain. Although a discrete card adds expense, increases power consumption, and generates additional heat, it works and works well. Perhaps that’s a more sensible way to proceed.

I am interested in your experience with integrated graphics and dual monitors, and welcome your comments and suggestions.

Hi
I’m running three 1920x1080 screens (Tumbleweed) off an Intel Xeon E3-1245 V2 CPU/GPU: one over DisplayPort (which also carries audio to the screen), one over DVI-D with an adapter converting to HDMI, and the third over DVI-I (one adapter converting to HDMI and another converting to VGA). That third screen is used via an HDMI switch for my virtual machines [WinX, Leap, Tumbleweed, SLES and SLED] (it will switch to HDMI soon so all three monitors are the same). No issues performance-wise here; it all works a treat. I had to set the default sound output so it uses the DisplayPort rather than the external speakers; that was probably the only thing that took some research to configure so it would survive a reboot.
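In case it helps, the sound part boiled down to making the DisplayPort device the default sink; a sketch with the PulseAudio CLI (the sink name below is only an illustration, yours will differ, so list them first):

 pactl list short sinks    # note the name of the HDMI/DisplayPort sink
 pactl set-default-sink alsa_output.pci-0000_00_03.0.hdmi-stereo    # example name only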


 pinxi -GSxxz
System:    Kernel: 5.8.7-1-default x86_64 bits: 64 compiler: clang v: 11.0.0 Desktop: GNOME 3.36.6 tk: GTK 3.24.22 
           wm: gnome-shell dm: GDM Distro: openSUSE Tumbleweed 20200919 
Graphics:  Device-1: Intel Xeon E3-1200 v2/3rd Gen Core processor Graphics driver: i915 v: kernel bus ID: 00:02.0 
           chip ID: 8086:016a 
           Device-2: NVIDIA GK208B [GeForce GT 710] vendor: ZOTAC driver: nvidia v: 450.66 bus ID: 02:00.0 chip ID: 10de:128b 
           Device-3: NVIDIA GK208B [GeForce GT 710] vendor: ZOTAC driver: nvidia v: 450.66 bus ID: 05:00.0 chip ID: 10de:128b 
           Device-4: NVIDIA GK208B [GeForce GT 710] vendor: ZOTAC driver: nvidia v: 450.66 bus ID: 06:00.0 chip ID: 10de:128b 
           Display: x11 server: X.Org 1.20.9 compositor: gnome-shell driver: modesetting,nvidia unloaded: fbdev,vesa 
           alternate: intel,nouveau,nv resolution: 1: 1920x1080~60Hz 2: 1920x1080~60Hz 3: 1920x1080~60Hz s-dpi: 96 
           OpenGL: renderer: Mesa DRI Intel HD Graphics P4000 (IVB GT2) v: 4.2 Mesa 20.1.7 compat-v: 3.0 direct render: Yes

You will also note the NVIDIA cards, one for the VMs when in use; these are cheap, low-power, passively cooled cards, and I just use the CUDA cores…

I would check the CPU specs to confirm it has integrated graphics and to see what is possible, e.g. how many screens it can power and the total dimensions. If I were going for a newer setup, I would look at the AMD line; cores and RAM are always good :wink:

Thank you. I would be relying solely on the integrated graphics for the entire machine (both the native openSUSE OS and the virtual machine); no discrete graphics card at this time.

At this moment, I have not been able to locate AMD or Intel documentation for their respective current processors (I’m still looking). Third-party sites appear to state that they will support my 1920 x 1200 resolution, but I need to confirm that.

And, yes, AMD is something I will consider alongside Intel.

You mentioned adapters in your post:

one over DVI-D with an adapter converting to HDMI, and the third over DVI-I (one adapter converting to HDMI and another converting to VGA)
Are those PCIe devices or external cable-type adapters? Passive or active?

You can keep using your NVIDIA graphics cards if they are still supported (even together with an APU).

HDMI and DisplayPort have supported 1920x1200 at 60+ Hz since their inception: 1920x1200 @ 60 Hz with reduced blanking needs only about a 154 MHz pixel clock, well within even the original HDMI specification’s 165 MHz limit.

IMHO, AMD is the preferable solution right now.

With AMD APU I expect no troubles with dual monitors.

But there are some restrictions with chipsets:

  1. X570 costs too much.
  2. B550 and A520 provide official support only for Renoir APUs (they may support older CPUs unofficially). I don’t know about support for Renoir APUs in Leap 15.2; AMD’s Picasso works.

So maybe B450 or X470 are preferable.
Chipsets A320/B350/X370 are also usable.

Check the graphics ports and their capabilities on the manufacturers’ websites.

Hi
Passive cable-type adapters. DVI-D and DVI-I are different, hence the different adapters: DVI-D carries only the digital signal, while DVI-I also carries the analog pins that a passive VGA adapter taps…

Hi
Something like this;
https://www.amd.com/en/products/apu/amd-ryzen-3-4300g
https://ark.intel.com/content/www/us/en/ark/products/134898/intel-core-i5-9400-processor-9m-cache-up-to-4-10-ghz.html

Then what’s available will depend on the motherboard you choose… you need to find one with a couple of integrated NICs, or at least a PCIe x1 slot for an external card…

Malcolm -

Thanks for the info on the adapters. And the spec sheets.

Unless I’m missing something, AMD doesn’t provide specific resolutions. But ASUS does (and I gather other manufacturers do as well):

Integrated Graphics in the 2nd and 1st Gen AMD Ryzen™ with Radeon™ Vega Graphics / Athlon™ with Radeon™ Vega Graphics / 7th Generation A-Series APU
Multi-VGA output support: HDMI/DisplayPort ports
- Supports HDMI 1.4b with max. resolution 4096 x 2160 @ 24 Hz / 2560 x 1600 @ 60 Hz
- Supports DisplayPort 1.2 with max. resolution 4096 x 2160 @ 60 Hz
Maximum shared memory of 2048 MB (for iGPU exclusively)
Source: https://www.asus.com/us/Motherboards/PRIME-X470-PRO/specifications/

What did you mean by the following?

need to find one [motherboard] with a couple of integrated NICs, or at least a PCIe x1 slot for an external card…

Hi
For running virtual machines on separate interfaces if needed :wink: Or at least have a PCIe x1 slot if you’re wanting to add items going forward.

Thanks. Do I correctly understand your response to mean that a separate interface is not necessary to run in bridged mode unless it is required for some other purpose? I.e., both the host and the guest can share a single interface?

Hi
That is the case; they can share. I don’t with my machines: I use a separate bridge to isolate the VMs from the system. Maybe that’s a quirk of my QEMU setup, but it has always been my preference, as I get full speed across the network.
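If you do go the separate-bridge route, a minimal wicked-style sketch on openSUSE looks something like this (the interface names are examples only; NetworkManager setups are configured differently):

 # /etc/sysconfig/network/ifcfg-br0
 STARTMODE='auto'
 BOOTPROTO='dhcp'
 BRIDGE='yes'
 BRIDGE_PORTS='eth0'
 BRIDGE_STP='off'

 # /etc/sysconfig/network/ifcfg-eth0  (the enslaved NIC carries no address of its own)
 STARTMODE='auto'
 BOOTPROTO='none'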

Thank you. I think I posed the bridging question in the wrong thread - my apologies (trying to do too many things at once).

Two monitors with AMD IGP graphics have been supported for at least 5 years. Just a matter of hours ago I put one such system online, resurrected from the dead by a drenching in contact cleaner to remove cigarette smoke residue:

# rpm -qa | grep xf86-v
xf86-video-fbdev-0.5.0-lp151.1.2.x86_64
xf86-video-amdgpu-18.1.0-lp151.1.3.x86_64
xf86-video-vesa-2.4.0-lp151.2.3.x86_64
# xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x46; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 4; outputs: 3; associated providers: 0; name: modesetting
    output HDMI-1
    output DP-1
    output VGA-1
# grep -v ^\# /etc/X11/xinit/xinitrc.d/setup | grep xrandr
xrandr --dpi 120 --output DP-1 --mode 2560x1440 --primary --output HDMI-1 --mode 2560x1080 --above DP-1 #
# xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
**Screen** 0: minimum 320 x 200, current **2560 x 2520**, maximum 16384 x 16384
**HDMI-1** connected **2560x1080**+0+0 (normal left inverted right x axis y axis) 673mm x 284mm
**DP-1** connected primary **2560x1440**+0+1080 (normal left inverted right x axis y axis) 598mm x 336mm
   2560x1440     59.95*+  74.92
   2560x1080     60.00*+
# inxi -V | head -n1
inxi 3.1.06-00 (2020-08-17)
# inxi -GMSay
System:
  Host: asa88 Kernel: 4.12.14-lp151.28.67-default x86_64 bits: 64
  compiler: gcc v: 7.5.0
  parameters: BOOT_IMAGE=/boot/vmlinuz noresume ipv6.disable=1 net.ifnames=0
  mitigations=auto consoleblank=0 video=1440x900@60 5
  Desktop: Trinity R14.0.8 tk: Qt 3.5.0 info: kicker wm: Twin 3.0 dm: TDM
  Distro: **openSUSE Leap 15.1**
Machine:
  Type: Desktop **Mobo**: **ASUSTeK** model: A88X-PRO v: Rev X.0x
  serial: 140323952800121 UEFI: American Megatrends v: 2603 date: 03/10/2016
Graphics:
  Device-1: **AMD Kaveri [Radeon R7 Graphics]** vendor: ASUSTeK driver: radeon
  v: kernel alternate: amdgpu bus ID: 00:01.0 chip ID: 1002:130f
  Display: **server: X.Org 1.20.3 driver: modesetting** unloaded: fbdev,vesa
  alternate: ati display ID: :0 screens: 1
  **Screen-1**: 0 s-res: **2560x2520** s-dpi: 120 s-size: 541x533mm (21.3x21.0")
  s-diag: 759mm (29.9")
  **Monitor-1: HDMI-1 res: 2560x1080** hz: 60 dpi: 97 size: 673x284mm (26.5x11.2")
  diag: 730mm (28.8")
  **Monitor-2: DP-1 res: 2560x1440** hz: 60 dpi: 109 size: 598x336mm (23.5x13.2")
  diag: 686mm (27")
  OpenGL:
  renderer: AMD KAVERI (DRM 2.50.0 4.12.14-lp151.28.67-default LLVM 7.0.1)
  v: 4.5 Mesa 18.3.2 direct render: Yes

Note it’s configured here to use the upstream default DDX, modesetting, rather than the amdgpu DDX, which I did try first. The amdgpu DDX (as does the intel DDX) results in non-uniform CRTC names, which confuses my global method of manual display setup via an xrandr script. No behavioral desktop differences between the two were evident.
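For reference, forcing a particular DDX rather than letting the server autoselect takes only a small snippet; a sketch (the file name is arbitrary, and the alternative driver names only work if the corresponding xf86-video-* package is installed):

 # /etc/X11/xorg.conf.d/50-device.conf
 Section "Device"
   Identifier "DefaultDevice"
   Driver     "modesetting"   # or "amdgpu" / "intel" for the vendor DDX
 EndSection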

This is 3 displays from a 3 year old Intel PC’s IGP:

# rpm -qa | grep xf86-v
xf86-video-vesa-2.4.0-lp151.2.3.x86_64
xf86-video-fbdev-0.5.0-lp151.1.2.x86_64
# xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 0; name: modesetting
    output HDMI-1	# HDMI actually
    output HDMI-2	# DVI actually
    output DP-1		# DisplayPort
    output HDMI-3	# non-existent output, or dummy
    output DP-2		# VGA actually
# grep -v ^\# /etc/X11/xinit/xinitrc.d/setup | grep xrandr
xrandr --dpi 120 --output DP-1 --mode 2560x1440 --primary --output HDMI-1 --mode 2560x1080 --above DP-1 --output HDMI-2 --mode 1920x1200 --above HDMI-1 #
# xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
**Screen 0**: minimum 320 x 200, **current 2560 x 3720**, maximum 8192 x 8192
**HDMI-2** connected **1920x1200**+0+0 (normal left inverted right x axis y axis) 519mm x 324mm # this is actually a DVI port connection
**HDMI-1** connected **2560x1080**+0+1200 (normal left inverted right x axis y axis) 673mm x 284mm
**DP-1** connected primary **2560x1440**+0+2280 (normal left inverted right x axis y axis) 598mm x 336mm
   2560x1440     59.95*+  74.92
   2560x1080     60.00*+
   1920x1200     59.95*+
# inxi -V | head -n1
inxi 3.1.06-00 (2020-08-17)
# inxi -GMSay
System:
  Host: gb250 Kernel: 4.12.14-lp151.28.59-default x86_64 bits: 64
  compiler: gcc v: 7.5.0
  parameters: BOOT_IMAGE=/boot/vmlinuz noresume ipv6.disable=1 net.ifnames=0
  mitigations=auto consoleblank=0 video=1440x900@60 5
  Desktop: Trinity R14.0.8 tk: Qt 3.5.0 info: kicker wm: Twin 3.0 dm: TDM
  Distro: **openSUSE Leap 15.1**
Machine:
  Type: Desktop System: Gigabyte product: B250M-D3H v: N/A serial: N/A
  **Mobo**: **Gigabyte** model: B250M-D3H-CF v: x.x serial: N/A
  UEFI: American Megatrends v: F10 date: 12/14/2018
Graphics:
  Device-1: **Intel HD Graphics 630** vendor: Gigabyte driver: i915 v: kernel
  bus ID: 00:02.0 chip ID: 8086:5912
  Display: **server: X.Org 1.20.3 driver: modesetting** unloaded: fbdev,vesa
  alternate: intel display ID: :0 screens: 1
  **Screen-1**: 0 s-res: **2560x3720** s-dpi: 120 s-size: 541x787mm (21.3x31.0")
  s-diag: 955mm (37.6")
  **Monitor-1: HDMI-1 res: 2560x1080** hz: 60 dpi: 97 size: 673x284mm (26.5x11.2")
  diag: 730mm (28.8")
  **Monitor-2: HDMI-2 res: 1920x1200** hz: 60 dpi: 94 size: 519x324mm (20.4x12.8") # DVI, not HDMI
  diag: 612mm (24.1")
  **Monitor-3: DP-1 res: 2560x1440** hz: 60 dpi: 109 size: 598x336mm (23.5x13.2")
  diag: 686mm (27")
  OpenGL: renderer: Mesa DRI Intel HD Graphics 630 (Kaby Lake GT2)
  v: 4.5 Mesa 18.3.2 compat-v: 3.0 direct render: Yes

This also is set to use the very same upstream default DDX, modesetting, instead of the Intel specific DDX.

On neither PC is xorg-x11-driver-video installed.

Check out this primer regarding X driver naming, configuration and requirements.

mrmazda -

Thank you for the response and terminal outputs.

If I am reading this correctly, in both cases (2 and 3 monitors, respectively), the monitors in each setup are running at different resolutions:


  **Screen-1**: 0 s-res: **2560x2520** s-dpi: 120 s-size: 541x533mm (21.3x21.0")
  s-diag: 759mm (29.9")
  **Monitor-1: HDMI-1 res: 2560x1080** hz: 60 dpi: 97 size: 673x284mm (26.5x11.2")
  diag: 730mm (28.8")
  **Monitor-2: DP-1 res: 2560x1440** hz: 60 dpi: 109 size: 598x336mm (23.5x13.2")
  diag: 686mm (27")

and

  **Screen-1**: 0 s-res: **2560x3720** s-dpi: 120 s-size: 541x787mm (21.3x31.0")
  s-diag: 955mm (37.6")
  **Monitor-1: HDMI-1 res: 2560x1080** hz: 60 dpi: 97 size: 673x284mm (26.5x11.2")
  diag: 730mm (28.8")
  **Monitor-2: HDMI-2 res: 1920x1200** hz: 60 dpi: 94 size: 519x324mm (20.4x12.8") # DVI, not HDMI
  diag: 612mm (24.1")
  **Monitor-3: DP-1 res: 2560x1440** hz: 60 dpi: 109 size: 598x336mm (23.5x13.2")
  diag: 686mm (27")

Also, is it correct that you used xrandr to enter the settings for each of the displays?

For sure. The only monitors I have a pair of are 1920x1200, one of which is committed to this, my main PC, making it unavailable for the test systems. Each of the two test systems produces a fully accessible rectangular desktop when stacked as I have them. Side by side, there would be an inaccessible area of 2560x360 using the 2560x1440 and 2560x1080, and such an area I don’t care to think about explaining using all three side by side or in an L configuration.

Also, is it correct that you used xrandr to enter the settings for each of the displays?
It is, though more complicated than required. The explicit resolutions were not necessary to the result, the primary purpose of which was layout, so the DPI was not required either. Specifying primary is a matter of preference. Configuring globally via xrandr means the configuration is applied regardless of which user is logged in, or which DE or window manager is in use.
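Adapted to your pair of identical 1920x1200 NECs, the global script could be as simple as the line below, placed where these systems keep theirs, /etc/X11/xinit/xinitrc.d/setup (the output names DP-1 and HDMI-1 are assumptions until you run xrandr on the actual hardware; the modesetting DDX should produce names of that form):

 xrandr --output DP-1 --mode 1920x1200 --primary --output HDMI-1 --mode 1920x1200 --right-of DP-1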

mrmazda -

Thanks for the additional information.

Everything I’ve learned over the past week indicates that an HDMI output and a DisplayPort output from integrated graphics will drive the two monitors. Nevertheless, I have not ruled out a discrete graphics card, notwithstanding the additional expense, power requirements, and heat generation, as the discrete card requires very little in the way of configuration in KDE and will offer uniform connections (e.g., DisplayPort).

That’s no different from using the IGP. X connection names vary primarily by DDX, less by GPU, and not at all by DE or WM. Names using the modesetting DDX with a single display are consistent across GPU manufacturers. The amdgpu, intel and radeon DDXs use DisplayPort- rather than the DP- used by most others. Most other variations are in the suffixes, some starting with 0, others with 1 or A.
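The quick way to see which naming scheme your DDX settled on, whatever the GPU, is something like:

 xrandr --listmonitors       # names and geometry of active outputs
 xrandr | grep -w connected  # every connected output, active or not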

Manual configuration via KDE settings, xrandr, or otherwise is only technically required for a non-default layout/positioning and/or a non-default tool panel location.

The power/heat factor is significant with any discrete card, and generally more so as prices go up: the bigger cards require multiple fans or exotic coolers, and “feature” an inability to draw sufficient current via the motherboard slot they are installed in. In some locales, excess heat may be a bonus. It definitely is not in climates where A/C is the norm, such as here.

A motherboard with CPU featuring IGP and multiple graphics outputs doesn’t prevent adding a discrete card if performance from IGP proves inadequate to needs.

Without a discrete graphics card, a much less expensive power supply can be suited to the task. My most powerful PS was 550 W until someone gave me a perfectly working and very heavy 750 W Corsair, which to date I have used only for backup, testing/burn-in, and troubleshooting. Most of my 28 (installed in working 64-bit machines) are 500 W or less, some as low as 380 W, not counting several (Dell) OEM models.

Assuming things haven’t changed since this article was published in 2008, https://www.anandtech.com/show/2624/debunking-power-supply-myths/3 explains that an oversized PS gives up efficiency: a supply loaded at only 10-20% of its rating typically runs well below the peak of its efficiency curve, which tends to fall near 50% load.

Upgraded the i3-4130: https://forums.opensuse.org/showthread.php/541321-Upgrading-the-Hardware
The B450 AORUS ELITE has “HDMI, DVI-D Ports for Multiple Display”: https://www.gigabyte.com/en/Motherboard/B450-AORUS-ELITE-rev-10
No extra GPUs required. amdgpu works well with Tumbleweed.

mrmazda -

Thank you for the additional information. I would much prefer to forgo the extra card for all the reasons you state. In the best possible outcome, I will end up with just the motherboard, CPU, RAM, one M.2 drive, and perhaps an optical drive. The M.2 format and my desire to use my existing monitors are the factors driving this design.

Does DDX refer to the module in the OS that drives the displays? See https://dri.freedesktop.org/wiki/DDX/

What are the acronyms DE and WM? (For folks such as myself who are not as familiar with the inner workings of the various graphics solutions in CPUs and discrete cards, it would be helpful if less-common acronyms were spelled out, unless that would be too much of a burden.)

There are multiple levels of driving displays, so the term “driver” is more commonly used than module. The foundational graphics driver level is in the kernel, enabling e.g. framebuffer text on the vttys. The second graphics driver level is the Device Dependent X (DDX) driver, for running basic X. The next graphics level is beyond my comprehension, involving such elements as Mesa and/or OpenGL.
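To see each level on a running system, something like this works (glxinfo comes from the Mesa demo package, so it may need installing; the Xorg log path can vary):

 lsmod | egrep 'i915|amdgpu|radeon|nouveau'   # level 1: the kernel (KMS) driver
 grep LoadModule /var/log/Xorg.0.log          # level 2: the DDX modules X loaded
 glxinfo | grep -i renderer                   # level 3: the Mesa/OpenGL renderer in use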

What are the acronyms DE and WM?
Desktop Environment and Window Manager.