openSUSE Leap 42.1 and integrated Intel HD Graphics 4400

Hello all -

I’ve just installed openSUSE Leap 42.1 on my Dell Optiplex 3020 with integrated Intel HD Graphics 4400. From previous use with a Windows OS, I know this graphics hardware (and the monitor) both support 1920x1080 as the native resolution.

For some reason, the only two resolutions offered in the openSUSE “Display and Monitor” settings are 800x600 and 1024x768. openSUSE set the resolution to 1024x768, and no higher resolutions are available.

At the risk of displaying my lack of technical prowess with Linux, can anyone tell me how to add resolutions, specifically 1920x1080? I would greatly appreciate any guidance.

In my Xorg.0.log file, I find these entries:


[    16.471] (II) intel: Driver for Intel(R) HD Graphics: 2000-6000
[    16.471] (II) intel: Driver for Intel(R) Iris(TM) Graphics: 5100, 6100
[    16.471] (II) intel: Driver for Intel(R) Iris(TM) Pro Graphics: 5200, 6200, P6300
[    16.471] (II) modesetting: Driver for Modesetting Kernel Drivers: kms
[    16.471] (II) FBDEV: driver for framebuffer: fbdev
[    16.471] (II) VESA: driver for VESA chipsets: vesa
[    16.473] (II) intel(0): Using Kernel Mode Setting driver: i915, version 1.6.0 20150327
[    16.473] (WW) Falling back to old probe method for modesetting
[    16.473] (WW) Falling back to old probe method for fbdev
[    16.473] (II) Loading sub module "fbdevhw"
[    16.473] (II) LoadModule: "fbdevhw"
[    16.473] (II) Loading /usr/lib64/xorg/modules/libfbdevhw.so
[    16.473] (II) Module fbdevhw: vendor="X.Org Foundation"
[    16.473]     compiled for 1.17.2, module version = 0.0.2
[    16.473]     ABI class: X.Org Video Driver, version 19.0
[    16.473] (WW) Falling back to old probe method for vesa
[    16.473] (--) intel(0): Integrated Graphics Chipset: Intel(R) HD Graphics
[    16.473] (--) intel(0): CPU: x86-64, sse2, sse3, ssse3, sse4.1, sse4.2, avx, avx2
[    16.473] (II) intel(0): Creating default Display subsection in Screen section "Default Screen Section" for depth/fbbpp 24/32
[    16.473] (==) intel(0): Depth 24, (--) framebuffer bpp 32
[    16.473] (==) intel(0): RGB weight 888
[    16.473] (==) intel(0): Default visual is TrueColor
[    16.473] (II) intel(0): Output VGA1 has no monitor section
[    16.473] (II) intel(0): Enabled output VGA1
[    16.473] (II) intel(0): Output DP1 has no monitor section
[    16.473] (II) intel(0): Enabled output DP1
[    16.473] (II) intel(0): Output HDMI1 has no monitor section
[    16.473] (II) intel(0): Enabled output HDMI1
[    16.473] (--) intel(0): Using a maximum size of 256x256 for hardware cursors
[    16.473] (II) intel(0): Output VIRTUAL1 has no monitor section
[    16.473] (II) intel(0): Enabled output VIRTUAL1
[    16.473] (--) intel(0): Output VGA1 using initial mode 1024x768 on pipe 0
[    16.473] (==) intel(0): TearFree disabled
[    16.473] (==) intel(0): DPI set to (96, 96)
[    16.473] (II) Loading sub module "dri2"
[    16.473] (II) LoadModule: "dri2"
[    16.473] (II) Module "dri2" already built-in
[    16.473] (II) Loading sub module "present"
[    16.473] (II) LoadModule: "present"
[    16.473] (II) Module "present" already built-in
[    16.473] (II) UnloadModule: "modesetting"
[    16.473] (II) Unloading modesetting
[    16.473] (II) UnloadModule: "fbdev"
[    16.473] (II) Unloading fbdev
[    16.473] (II) UnloadSubModule: "fbdevhw"
[    16.473] (II) Unloading fbdevhw
[    16.473] (II) UnloadModule: "vesa"
[    16.473] (II) Unloading vesa

Hi all -

The mystery has been solved. I’m posting this for the benefit of anyone else who may run into this problem.

Here’s the solution (based on a ton of research):

I have several PCs running various OSes (Windows and Linux, including one server) on a home network. All of the PCs are in a server rack in a bedroom I turned into a home computer room.

I use a multi-port KVM switch so that I don’t need 8 keyboards, 8 mice and 8 monitors. The KVM switch I have is a 10-year-old D-Link DKVM-16. Apparently, this model was not engineered to pass EDID (Extended Display Identification Data) over the DDC (Display Data Channel) from the physical monitor to whatever graphics hardware is connected to the KVM ports.

The graphics card polls the monitor over the DDC to collect the monitor’s EDID in an attempt to auto-detect and set the video resolution. If the KVM doesn’t pass DDC communications through, the graphics card gets nothing from the monitor and falls back to a safe default of 1024x768 (or less).
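For anyone who wants to check this on their own box: a quick sketch, assuming a reasonably modern kernel with the i915/DRM driver, which exposes whatever EDID it received under /sys/class/drm. An empty edid file means the DDC channel is blocked somewhere between the GPU and the monitor (in my case, by the KVM).

```shell
# Report, per connector, whether the monitor's EDID reached the kernel.
# A valid EDID base block is 128 bytes; an empty file means the DDC
# channel was blocked (e.g. by a KVM that doesn't pass it through).
for edid in /sys/class/drm/card*-*/edid; do
  [ -e "$edid" ] || continue
  size=$(wc -c < "$edid")
  if [ "$size" -ge 128 ]; then
    echo "$edid: EDID present ($size bytes)"
  else
    echo "$edid: no EDID (DDC blocked?)"
  fi
done
```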

I tested this by disconnecting my monitor from the KVM and connecting it directly to various Linux boxes of mine (openSUSE Leap 42.1, Ubuntu 15.10, Linux Mint 17.3, Ubuntu MATE 15.10). ALL of these Linux boxes had the same limited 1024x768 resolution when connected to the monitor through the KVM. Once DIRECTLY connected to the monitor, they all immediately polled the monitor, detected its proper 1920x1080 optimum resolution, and set it accordingly in their own environments.
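If you want to reproduce the comparison without digging through logs, xrandr shows what X actually detected. Through the bad KVM you only see the driver’s fallback modes; directly connected, the EDID-derived mode list appears, with 1920x1080 marked as preferred.

```shell
# List outputs and their mode pool. '*' marks the current mode and '+'
# the monitor's preferred mode (which comes from the EDID); with no EDID
# only fallback modes like 1024x768 and 800x600 appear.
xrandr --query
```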

I have since ordered a replacement KVM switch (IOGear GCS138) that properly passes DDC2B data between the monitor and the connected ports. This lets the graphics hardware in the Linux boxes auto-detect the monitor resolution by grabbing the EDID from the monitor.

I have not received the new KVM switch yet, but will post a follow-up once I have installed it and tested.
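In the meantime, for anyone stuck behind a KVM like mine, a common stopgap is to add the missing mode to X by hand with xrandr. This is only a sketch: VGA1 is the output name from my Xorg log (yours may differ, check `xrandr`), the modeline below is the standard output of `cvt 1920 1080 60`, and the change does not survive a reboot unless you script it.

```shell
# Generate a CVT modeline for 1920x1080 @ 60 Hz, then define it, attach
# it to the output, and switch to it. Substitute your own output name.
cvt 1920 1080 60
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode VGA1 "1920x1080_60.00"
xrandr --output VGA1 --mode "1920x1080_60.00"
```

Note this only tells X about the mode; if the monitor (or KVM) can’t actually handle the timing, you’ll get a blank or garbled screen, so keep a way to switch back.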