
Thread: Driving dual monitors with CPU integrated graphics

  1. #11
    Join Date
    Jun 2008
    Location
    Podunk
    Posts
    29,681
    Blog Entries
    15

    Default Re: Driving dual monitors with CPU integrated graphics

    Quote Originally Posted by w2tq View Post
    Thanks. Do I correctly understand your response to mean that a separate interface is not necessary to run in the bridged mode unless required for some other purpose? I.e., both the host and the guest can share a single interface?
    Hi
    That is the case: they can share a single interface. I don't do that with my machines; I use a separate bridge to isolate the guests from the system. That may just be a quirk of my qemu setup, but it has always been my preference, as I get full speed across the network.
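    For illustration only, here is a minimal sketch of what a single shared host/guest bridge can look like using plain ip and qemu tooling. The interface names (eth0, br0) and the qemu invocation are assumptions, not taken from this thread; on openSUSE you would normally set the bridge up through YaST's network settings instead, and qemu's bridge netdev additionally expects the bridge to be whitelisted in /etc/qemu/bridge.conf.
    Code:
    # assumed names: eth0 = physical NIC, br0 = shared bridge
    ip link add name br0 type bridge        # create the bridge
    ip link set dev eth0 master br0         # enslave the physical NIC
    ip link set dev br0 up                  # the host's own IP then lives on br0
    # attach a guest to the same bridge (illustrative qemu options only):
    qemu-system-x86_64 -m 2048 \
        -netdev bridge,id=net0,br=br0 \
        -device virtio-net-pci,netdev=net0
        # ...disk image and remaining options omitted...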
    Cheers Malcolm °¿° SUSE Knowledge Partner (Linux Counter #276890)
    SUSE SLE, openSUSE Leap/Tumbleweed (x86_64) | GNOME DE

  2. #12

    Default Re: Driving dual monitors with CPU integrated graphics

    Thank you. I think I posed the bridging question in the wrong thread - my apologies (trying to do too many things at once).

  3. #13
    Join Date
    Dec 2008
    Location
    FL, USA
    Posts
    2,385

    Default Re: Driving dual monitors with CPU integrated graphics

    Two monitors on AMD IGP graphics have been supported for at least 5 years. Just hours ago I put one such machine back online, resurrected from the dead by a drenching in contact cleaner to remove cigarette smoke residue:
    Code:
    # rpm -qa | grep xf86-v
    xf86-video-fbdev-0.5.0-lp151.1.2.x86_64
    xf86-video-amdgpu-18.1.0-lp151.1.3.x86_64
    xf86-video-vesa-2.4.0-lp151.2.3.x86_64
    # xrandr --listproviders
    Providers: number : 1
    Provider 0: id: 0x46; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 4; outputs: 3; associated providers: 0; name: modesetting
        output HDMI-1
        output DP-1
        output VGA-1
    # grep -v ^\# /etc/X11/xinit/xinitrc.d/setup | grep xrandr
    xrandr --dpi 120 --output DP-1 --mode 2560x1440 --primary --output HDMI-1 --mode 2560x1080 --above DP-1 #
    # xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
    Screen 0: minimum 320 x 200, current 2560 x 2520, maximum 16384 x 16384
    HDMI-1 connected 2560x1080+0+0 (normal left inverted right x axis y axis) 673mm x 284mm
    DP-1 connected primary 2560x1440+0+1080 (normal left inverted right x axis y axis) 598mm x 336mm
       2560x1440     59.95*+  74.92
       2560x1080     60.00*+
    # inxi -V | head -n1
    inxi 3.1.06-00 (2020-08-17)
    # inxi -GMSay
    System:
      Host: asa88 Kernel: 4.12.14-lp151.28.67-default x86_64 bits: 64
      compiler: gcc v: 7.5.0
      parameters: BOOT_IMAGE=/boot/vmlinuz noresume ipv6.disable=1 net.ifnames=0
      mitigations=auto consoleblank=0 video=1440x900@60 5
      Desktop: Trinity R14.0.8 tk: Qt 3.5.0 info: kicker wm: Twin 3.0 dm: TDM
      Distro: openSUSE Leap 15.1
    Machine:
      Type: Desktop Mobo: ASUSTeK model: A88X-PRO v: Rev X.0x
      serial: 140323952800121 UEFI: American Megatrends v: 2603 date: 03/10/2016
    Graphics:
      Device-1: AMD Kaveri [Radeon R7 Graphics] vendor: ASUSTeK driver: radeon
      v: kernel alternate: amdgpu bus ID: 00:01.0 chip ID: 1002:130f
      Display: server: X.Org 1.20.3 driver: modesetting unloaded: fbdev,vesa
      alternate: ati display ID: :0 screens: 1
      Screen-1: 0 s-res: 2560x2520 s-dpi: 120 s-size: 541x533mm (21.3x21.0")
      s-diag: 759mm (29.9")
      Monitor-1: HDMI-1 res: 2560x1080 hz: 60 dpi: 97 size: 673x284mm (26.5x11.2")
      diag: 730mm (28.8")
      Monitor-2: DP-1 res: 2560x1440 hz: 60 dpi: 109 size: 598x336mm (23.5x13.2")
      diag: 686mm (27")
      OpenGL:
      renderer: AMD KAVERI (DRM 2.50.0 4.12.14-lp151.28.67-default LLVM 7.0.1)
      v: 4.5 Mesa 18.3.2 direct render: Yes
    Note that this machine is configured here to use the upstream default DDX, modesetting, rather than the amdgpu DDX, which I did try first. The amdgpu DDX (as does the intel DDX) results in non-uniform CRTC names, which confuses my global method of manual display setup via an xrandr script. No behavioral desktop differences between the two were evident.

    This is 3 displays from a 3 year old Intel PC's IGP:
    Code:
    # rpm -qa | grep xf86-v
    xf86-video-vesa-2.4.0-lp151.2.3.x86_64
    xf86-video-fbdev-0.5.0-lp151.1.2.x86_64
    # xrandr --listproviders
    Providers: number : 1
    Provider 0: id: 0x47; cap: 0xf (Source Output, Sink Output, Source Offload, Sink Offload); crtcs: 3; outputs: 5; associated providers: 0; name: modesetting
        output HDMI-1	# HDMI actually
        output HDMI-2	# DVI actually
        output DP-1		# DisplayPort
        output HDMI-3	# non-existent output, or dummy
        output DP-2		# VGA actually
    # grep -v ^\# /etc/X11/xinit/xinitrc.d/setup | grep xrandr
    xrandr --dpi 120 --output DP-1 --mode 2560x1440 --primary --output HDMI-1 --mode 2560x1080 --above DP-1 --output HDMI-2 --mode 1920x1200 --above HDMI-1 #
    # xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
    Screen 0: minimum 320 x 200, current 2560 x 3720, maximum 8192 x 8192
    HDMI-2 connected 1920x1200+0+0 (normal left inverted right x axis y axis) 519mm x 324mm # this is actually a DVI port connection
    HDMI-1 connected 2560x1080+0+1200 (normal left inverted right x axis y axis) 673mm x 284mm
    DP-1 connected primary 2560x1440+0+2280 (normal left inverted right x axis y axis) 598mm x 336mm
       2560x1440     59.95*+  74.92
       2560x1080     60.00*+
       1920x1200     59.95*+
    # inxi -V | head -n1
    inxi 3.1.06-00 (2020-08-17)
    # inxi -GMSay
    System:
      Host: gb250 Kernel: 4.12.14-lp151.28.59-default x86_64 bits: 64
      compiler: gcc v: 7.5.0
      parameters: BOOT_IMAGE=/boot/vmlinuz noresume ipv6.disable=1 net.ifnames=0
      mitigations=auto consoleblank=0 video=1440x900@60 5
      Desktop: Trinity R14.0.8 tk: Qt 3.5.0 info: kicker wm: Twin 3.0 dm: TDM
      Distro: openSUSE Leap 15.1
    Machine:
      Type: Desktop System: Gigabyte product: B250M-D3H v: N/A serial: N/A
      Mobo: Gigabyte model: B250M-D3H-CF v: x.x serial: N/A
      UEFI: American Megatrends v: F10 date: 12/14/2018
    Graphics:
      Device-1: Intel HD Graphics 630 vendor: Gigabyte driver: i915 v: kernel
      bus ID: 00:02.0 chip ID: 8086:5912
      Display: server: X.Org 1.20.3 driver: modesetting unloaded: fbdev,vesa
      alternate: intel display ID: :0 screens: 1
      Screen-1: 0 s-res: 2560x3720 s-dpi: 120 s-size: 541x787mm (21.3x31.0")
      s-diag: 955mm (37.6")
      Monitor-1: HDMI-1 res: 2560x1080 hz: 60 dpi: 97 size: 673x284mm (26.5x11.2")
      diag: 730mm (28.8")
      Monitor-2: HDMI-2 res: 1920x1200 hz: 60 dpi: 94 size: 519x324mm (20.4x12.8") # DVI, not HDMI
      diag: 612mm (24.1")
      Monitor-3: DP-1 res: 2560x1440 hz: 60 dpi: 109 size: 598x336mm (23.5x13.2")
      diag: 686mm (27")
      OpenGL: renderer: Mesa DRI Intel HD Graphics 630 (Kaby Lake GT2)
      v: 4.5 Mesa 18.3.2 compat-v: 3.0 direct render: Yes
    This one is also set to use the very same upstream default DDX, modesetting, instead of the Intel-specific DDX.

    On neither PC is xorg-x11-driver-video installed.

    Check out this primer regarding X driver naming, configuration and requirements.
    Reg. Linux User #211409 *** multibooting since 1992
    Primary: 15.1, TW, 15.2 & 13.1 on Haswell w/ RAID
    Secondary: eComStation (OS/2)&15.1 on i965P/Radeon
    Tertiary: TW,15.2,15.1,Fedora,Debian,more on Kaby Lake,iQ45,iQ43,iG41,iG3X,i965G,AMD,NVidia&&&

  4. #14

    Arrow Re: Driving dual monitors with CPU integrated graphics

    mrmazda -

    Thank you for the response and terminal outputs.

    If I am reading this correctly, in both cases (2 and 3 monitors, respectively), the monitors in each setup are running at different resolutions:
    Code:
      Screen-1: 0 s-res: 2560x2520 s-dpi: 120 s-size: 541x533mm (21.3x21.0")
      s-diag: 759mm (29.9")
      Monitor-1: HDMI-1 res: 2560x1080 hz: 60 dpi: 97 size: 673x284mm (26.5x11.2")
      diag: 730mm (28.8")
      Monitor-2: DP-1 res: 2560x1440 hz: 60 dpi: 109 size: 598x336mm (23.5x13.2")
      diag: 686mm (27")
    and
    Code:
      Screen-1: 0 s-res: 2560x3720 s-dpi: 120 s-size: 541x787mm (21.3x31.0")
      s-diag: 955mm (37.6")
      Monitor-1: HDMI-1 res: 2560x1080 hz: 60 dpi: 97 size: 673x284mm (26.5x11.2")
      diag: 730mm (28.8")
      Monitor-2: HDMI-2 res: 1920x1200 hz: 60 dpi: 94 size: 519x324mm (20.4x12.8") # DVI, not HDMI
      diag: 612mm (24.1")
      Monitor-3: DP-1 res: 2560x1440 hz: 60 dpi: 109 size: 598x336mm (23.5x13.2")
      diag: 686mm (27")
    Also, is it correct that you used xrandr to enter the settings for each of the displays?

  5. #15
    Join Date
    Dec 2008
    Location
    FL, USA
    Posts
    2,385

    Default Re: Driving dual monitors with CPU integrated graphics

    Quote Originally Posted by w2tq View Post
    If I am reading this correctly, in both cases (2 and 3 monitors, respectively), the monitors in each setup are running at different resolutions.
    For sure. The only monitors I have a pair of are 1920x1200, and one of them is committed to this, my main PC, making it unavailable for the test systems. Each of the two test setups produces a fully accessible rectangular desktop when the displays are stacked as I have them. Side by side, the 2560x1440 and 2560x1080 would leave an inaccessible 2560x360 area, and I don't care to think about explaining the dead areas that would result from all three side by side or in an L configuration.

    Also, is it correct that you used xrandr to enter the settings for each of the displays?
    It is, though my command is more complicated than required. The explicit resolutions were not necessary to the result, whose primary purpose was layout, so specifying DPI was not required either, and marking a display as primary is a matter of preference (a stripped-down, layout-only version is sketched below). Configuring globally via xrandr means the configuration is applied regardless of which user is logged in, or which DE or window manager is in use.
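    For illustration, a minimal layout-only equivalent of the command shown earlier might look like the following; it relies on each output's preferred mode and reuses the output names from the two-monitor example above, which will differ on other hardware.
    Code:
    # layout only: let each output use its preferred mode, just stack HDMI-1 above DP-1
    xrandr --output HDMI-1 --above DP-1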
    Reg. Linux User #211409 *** multibooting since 1992
    Primary: 15.1, TW, 15.2 & 13.1 on Haswell w/ RAID
    Secondary: eComStation (OS/2)&15.1 on i965P/Radeon
    Tertiary: TW,15.2,15.1,Fedora,Debian,more on Kaby Lake,iQ45,iQ43,iG41,iG3X,i965G,AMD,NVidia&&&

  6. #16

    Default Re: Driving dual monitors with CPU integrated graphics

    mrmazda -

    Thanks for the additional information.

    Everything I've learned over the past week indicates that an HDMI output and a DisplayPort output from integrated graphics will drive the two monitors. Nevertheless, I have not ruled out a discrete graphics card, notwithstanding the additional expense, power requirements, and heat generation, as the discrete card requires very little in the way of configuration in KDE and will offer uniform connections (e.g., DisplayPort).

  7. #17
    Join Date
    Dec 2008
    Location
    FL, USA
    Posts
    2,385

    Default Re: Driving dual monitors with CPU integrated graphics

    Quote Originally Posted by w2tq View Post
    the discrete card requires very little in the way of configuration in KDE and will offer uniform connections (e.g., DisplayPort).
    That's no different from using the IGP. X connection names vary primarily by DDX, less by GPU, and not at all by DE or WM. With the modesetting DDX, the names for a single display are consistent across GPU manufacturers. The amdgpu, intel and radeon DDXes use the DisplayPort- prefix rather than the DP- used by most others; most of the remaining variation is in the suffixes, some starting at 0, others at 1 or A (see the sketch below).
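    As a quick, hedged check of which names a particular DDX/GPU combination actually exposes (the output names will of course differ per machine):
    Code:
    # connection names exactly as the running DDX reports them (connected and disconnected)
    xrandr | grep connected
    # the provider and its DDX name, as already shown earlier in this thread
    xrandr --listproviders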

    Manual configuration via KDE settings or xrandr or otherwise is only technically required for non-default layout/positioning, and/or non-default tool panel location.

    The power/heat factor is significant with any discrete card, generally more so as prices go up: the bigger cards require multiple fans or exotic coolers, and "feature" an inability to draw sufficient current from the motherboard slot they are installed in. In some locales the excess heat may be a bonus; it definitely is not in climates where A/C is the norm, such as here.

    A motherboard with CPU featuring IGP and multiple graphics outputs doesn't prevent adding a discrete card if performance from IGP proves inadequate to needs.

    Without a discrete graphics card, a much less expensive power supply can be suited to the task. My most powerful PS was 550W until someone gave me a perfectly working and very heavy 750W Corsair, which to date I have used only for backup, testing/burn-in, and troubleshooting. Most of the 28 installed in my working 64-bit machines are 500W or less, going as low as 380W, not counting several (Dell) OEM models.

    Assuming things haven't changed since the article was published in 2008, https://www.anandtech.com/show/2624/...supply-myths/3 explains that an oversized PS gives up efficiency.
    Reg. Linux User #211409 *** multibooting since 1992
    Primary: 15.1, TW, 15.2 & 13.1 on Haswell w/ RAID
    Secondary: eComStation (OS/2)&15.1 on i965P/Radeon
    Tertiary: TW,15.2,15.1,Fedora,Debian,more on Kaby Lake,iQ45,iQ43,iG41,iG3X,i965G,AMD,NVidia&&&

  8. #18
    Join Date
    Jan 2014
    Location
    Erlangen
    Posts
    1,914
    Blog Entries
    1

    Default Re: Driving dual monitors with CPU integrated graphics

    Quote Originally Posted by w2tq View Post
    I am choosing components for a business, non-gaming desktop PC, with dual, identical NEC monitors (extended display, not cloned). The native OS will be the current version of openSuSE Linux with a virtual machine (VirtualBox) for Windows. Typical applications would be word processing, pdf generation, Internet browsing, and image rendering (Photoshop, GIMP); no video rendering.

    Until now, I have used discrete nVidia-based graphics cards with dual DVI-I outputs, each monitor set to a resolution of 1920 x 1200. These cards have worked well with openSuSE, requiring very little in the way of configuration, especially in recent years.

    Nevertheless, given the performance advances in CPUs, I am exploring CPU integrated graphics offered in AMD Ryzen 5 and Intel Core i3 and i5 series microprocessors as an alternative to a discrete card. In addition to DVI-I inputs, the monitors have one HDMI input and one DisplayPort input (daisy chaining is not an option), and the motherboards I am considering have HDMI and DisplayPort outputs, apparently a fairly standard configuration.

    One concern I have is whether running one monitor from an HDMI output and the other from a DisplayPort will afford the same uniform video signals to the two monitors (see, e.g., https://forums.opensuse.org/showthre...rated+graphics). Some claim that a port splitter may be employed, but others run into performance or configuration difficulties with such devices.

    Since I have very little experience with integrated graphics, HDMI, and DisplayPort, the effectiveness of these options appears uncertain. Although a discrete card adds expense, increases power consumption, and generates additional heat, it works and works well. Perhaps that's a more sensible way to proceed.

    I am interested in your experience with integrated graphics and dual monitors, and welcome your comments and suggestions.
    I upgraded the i3-4130: https://forums.opensuse.org/showthre...g-the-Hardware The B450 AORUS ELITE has "HDMI, DVI-D Ports for Multiple Display": https://www.gigabyte.com/en/Motherbo...S-ELITE-rev-10 No extra GPU is required, and amdgpu works well with Tumbleweed.
    AMD Athlon 4850e (2009), openSUSE 13.1, KDE 4, Intel i3-4130 (2014), i7-6700K (2016), i5-8250U (2018), AMD Ryzen 5 3400G (2020), openSUSE Tumbleweed, KDE Plasma 5

  9. #19

    Default Re: Driving dual monitors with CPU integrated graphics

    mrmazda -

    Thank you for the additional information. I would much prefer to forego the extra card for all the reasons you state. In the best possible outcome, I will end up with just the motherboard, CPU, RAM, and one m.2 drive, and perhaps an optical drive. The m.2 format and my desire to use my existing monitors are the factors driving this design.

    Does DDX refer to the module in the OS that drives the displays? See https://dri.freedesktop.org/wiki/DDX/

    What are the acronyms DE and WM? (For folks such as myself who are not as familiar with the inner workings of the various graphics solutions in CPUs and discrete cards, it would be helpful if less-common acronyms were spelled out, unless that would be too much of a burden.)

  10. #20
    Join Date
    Dec 2008
    Location
    FL, USA
    Posts
    2,385

    Default Re: Driving dual monitors with CPU integrated graphics

    Quote Originally Posted by w2tq View Post
    Does DDX refer to the module in the OS that drives the displays? See https://dri.freedesktop.org/wiki/DDX/
    There are multiple levels of driving displays, so the term "driver" is more commonly used than "module". The foundational graphics driver level is in the kernel, enabling e.g. framebuffer text on the vttys. The second level is the Device Dependent X (DDX) driver, used for running basic X. The level above that is beyond my comprehension, involving such elements as Mesa and/or OpenGL (a rough way to inspect each level is sketched below).
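    As an illustrative sketch only, not a definitive recipe (package names and log locations vary by distribution and session type), each level can be peeked at from a running system:
    Code:
    # kernel-level (DRM/KMS) driver bound to the GPU
    lspci -k | grep -A3 -i vga
    # DDX module loaded by the X server (rootless sessions may log to
    # ~/.local/share/xorg/Xorg.0.log instead of /var/log/Xorg.0.log)
    grep 'Loading.*_drv.so' /var/log/Xorg.0.log
    # Mesa/OpenGL level (glxinfo ships with the Mesa demos package)
    glxinfo | grep 'OpenGL renderer'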

    What are the acronyms DE and WM?
    Desktop Environment and Window Manager.
    Reg. Linux User #211409 *** multibooting since 1992
    Primary: 15.1, TW, 15.2 & 13.1 on Haswell w/ RAID
    Secondary: eComStation (OS/2)&15.1 on i965P/Radeon
    Tertiary: TW,15.2,15.1,Fedora,Debian,more on Kaby Lake,iQ45,iQ43,iG41,iG3X,i965G,AMD,NVidia&&&

