I've just got an LG 34WL500 monitor that is capable of 2560x1080, and this works fine when booted into Windows 10.
Unfortunately it seems not to work in any Linux distro… (Max Res 1920x1080)
Is there any simple fix for this, or is it a case of having to go “under the hood”?? (And if so, how…?)
Just using the standard on-board graphics.
Unfortunately the monitor only has 2x HDMI ports, and I am using a DisplayPort-to-HDMI adaptor.
I did try adding a third-party card with HDMI out as well, but the max resolution is still 1920x1080.
Hi
And when you say booting into Windows, is this the same machine dual-booting? Another thought: can you see the graphics memory in the BIOS? If so, crank that up to the max if possible.
IME, DisplayPort to HDMI adapters usually work, but not necessarily on the first try. I have two such adapters, and a Dell 2560x1080 display. In trying to replicate your problem, 15.1 managed only 1152x864 on the first try with my first adapter (Geekbuying). It wasn't until I cold booted (via reset button) that I was able to get my other adapter (Coboc ACAD-DP2HD4KS-6BK) to produce 2560x1080 on 15.1:
$ inxi -V | head -n1
inxi 3.0.37-00 (2019-11-19)
$ inxi -SGxx
System: Host: ab250 Kernel: 4.12.14-lp151.28.36-default x86_64 bits: 64 compiler: gcc v: 7.4.1 Desktop: Trinity R14.0.7
tk: Qt 3.5.0 wm: Twin dm: TDM Distro: openSUSE Leap 15.1
Graphics: Device-1: Intel HD Graphics 630 vendor: ASUSTeK driver: i915 v: kernel bus ID: 00:02.0 chip ID: 8086:5912
Display: server: X.Org 1.20.3 driver: modesetting unloaded: fbdev,vesa alternate: intel resolution: 2560x1080~60Hz
OpenGL: renderer: Mesa DRI Intel HD Graphics 630 (Kaby Lake GT2) v: 4.5 Mesa 18.3.2 compat-v: 3.0
direct render: Yes
$ xrandr | egrep 'onnect|creen|\*' | grep -v disconn | sort -r
Screen 0: minimum 320 x 200, current 2560 x 1080, maximum 8192 x 8192
DP-1 connected primary 2560x1080+0+0 (normal left inverted right x axis y axis) 673mm x 284mm
2560x1080 60.00*+
Note the reported connection is DisplayPort (DP-1), but the DP-1 output is on a DisplayPort-to-HDMI adapter, with the HDMI end connected to the display.
So, things I suggest to try include:
cold booting to 15.1
malcolmlewis’ suggestion
LG firmware upgrade (if available)
Motherboard BIOS upgrade (if available)
Different DisplayPort-to-HDMI adapter (if yours is returnable locally, try an exchange if vendor has no other option)
If those don’t work, you can try matching my Haswell/15.1 setup by uninstalling Plymouth:
sudo zypper rm plymouth; sudo zypper al plymouth
Whether no Plymouth might actually help I don't know. It's just that I never have it installed, due in part to its history of complicating video troubleshooting. You can test its removal by appending plymouth.enable=0 to the end of the linux line in your Grub menu; get there by pressing the E key when the menu appears. If the test works but you don't wish to actually remove Plymouth, you can include plymouth.enable=0 on the GRUB_CMDLINE_LINUX_DEFAULT= line in /etc/default/grub. The next update of your Grub menu will apply it, and it will persist until you remove it from /etc/default/grub.
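For example (a sketch only; "splash=silent quiet" stands in for whatever options are already on that line in your /etc/default/grub):

GRUB_CMDLINE_LINUX_DEFAULT="splash=silent quiet plymouth.enable=0"
# then regenerate the Grub menu so the change takes effect:
sudo grub2-mkconfig -o /boot/grub2/grub.cfg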
Without any adapters, a Haswell is generally more than capable of 2560x1080 support. I say generally because my old Haswell G3220 GPU’s advertised spec says HDMI is limited to 1920x1200. My current Haswell i3-4150T GPU supports up to 4096x2304 on HDMI. Based on your GPU’s
Xeon E3-1200 v3/4th Gen Core Processor Integrated Graphics Controller
description, if yours cannot be made to work, I would suspect there is some tweak in the Windows driver for your actual GPU or your LG that fools the display into ignoring the advertised HDMI limitation imposed by using an adapter when the actual video output is DisplayPort.
This is from the PC I’m typing this from, 8086:041e, same family as your 8086:0412:
Different DisplayPort-to-HDMI adapter (if yours is returnable locally, try an exchange if vendor has no other option) – Assume that if the adaptor works for Windows, it should work for Linux??
The * indicates the current mode. When it is not followed by +, it indicates that X knows the current mode is not the display's native/preferred mode.
Model name: Intel(R) Core™ i5-4670 CPU @ 3.40GHz
New information: as malcolmlewis indicates, Max Resolution (HDMI 1.4)‡ is 4096x2304@24Hz, and Max Resolution (DP)‡ is 3840x2160@60Hz. Note that, based on the information presented, >1920x1200 @60Hz is known to be supported only via DP on the i5-4670. This appears to be contradicted by it working as expected on Windows.
Xeon E3-1200 v3/4th Gen Core Processor Integrated Graphics Controller [8086:0412] (rev 06)
Duplicates PCI_ID=8086:0412 from the OP without reporting the DDX (X driver) in use. inxi -Gxx reports the PCI ID (8086:0412), the kernel driver (i915), and the DDX (?).
Video options in BIOS cranked up…
What exactly does this mean?
Assume that if the adaptor works for Windows, should work for Linux??
Reasonable assumption, but not every tweak resulting from coordination between Microsoft and hardware manufacturers makes it into Linux before someone reports a shortcoming to the FOSS driver writers, hence my last list item in post #4.
Note another difference between my 15.1 in post #4 and your xrandr output is that the CRTC connection in my Kaby Lake is the more robust DisplayPort (DP-1), while yours and my Haswell are seeing HDMI-1. This suggests to me some correction has been made somewhere in X for newer GPUs.
Manually creating a modeline with CVT or GTF doesn't produce any better modeline than X produces itself, given correct data to work with. When EDID is defective, experiences like yours are common. Instead of calculating with CVT, give X correct specifications via xorg.conf:
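Something along these lines (a minimal sketch, not an actual file from this thread; the sync ranges below are placeholders to be replaced with the ones printed in the LG 34WL500 manual, and "Default Monitor" assumes the Identifier that openSUSE's default Screen template references):

# /etc/X11/xorg.conf.d/50-monitor.conf (or the Monitor section of xorg.conf)
Section "Monitor"
  Identifier  "Default Monitor"
  # placeholder ranges; substitute the display's real specifications
  HorizSync   30-90
  VertRefresh 50-75
EndSection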
I have yet since Xorg forked off XFree86 to find a CVT- or GTF-generated modeline necessary or preferable when correct HorizSync and VertRefresh specifications are provided this way.
OK, tried the other approach…
Created an /etc/X11/xorg.conf. The process is slightly different for openSUSE vs Ubuntu.
openSUSE: replace the already-created /etc/X11/xorg.conf.d/50-monitor.conf with the original file (basically an empty file, with all text commented out), then edit /etc/X11/xorg.conf.install and save it as xorg.conf. I just added a Monitor section. In my case:
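For illustration only (the actual section isn't reproduced here): a Monitor section of this general shape, where the Identifier has to match whatever the Screen section of that xorg.conf references and the Modeline is the one printed by cvt:

Section "Monitor"
  Identifier "Monitor0"
  # paste the Modeline that `cvt 2560 1080 60` prints here, for example:
  # Modeline "2560x1080_60.00"  <timing numbers from cvt>
  Option "PreferredMode" "2560x1080_60.00"
EndSection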
I don’t see any material “approach” difference. xorg.conf is a file that contains all Sections. xorg.conf.d/ files are each for individual Sections. A material “approach” difference would be letting Xorg calculate modelines from HorizSync and VertRefresh, whether in xorg.conf or a file in xorg.conf.d/. There should be zero difference between openSUSE and Ubuntu in this regard. The main difference between the two is in Ubuntu xorg.conf.d/ typically doesn’t exist unless and until created by the local admin, as it’s an optional directory that Ubuntu doesn’t normally populate by default, while openSUSE has traditionally used it, which might not be happening any more since the implementation of /usr/etc/ in TW.
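Put concretely, the Section content is identical wherever it lives; only the file layout differs:

/etc/X11/xorg.conf                     # single file holding all Sections
/etc/X11/xorg.conf.d/50-monitor.conf   # one small file per Section (or group of Sections)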
All I know is that I configured the xorg.conf file in FerenOS (the xorg.conf file was generated by booting into a root shell and running X -configure), and this worked after adding the Monitor section…
Then I tried copying this xorg.conf file to my Leap 15.1 install and could only boot to a shell, so I went through the above process of editing xorg.conf.install.
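For reference, the X -configure step mentioned above usually goes something like this (a sketch; one way of getting to a console with no X server running, and the generated file normally lands in root's home directory):

# stop the display manager so no X server is running
sudo systemctl isolate multi-user.target
# probe the hardware and write a skeleton config (it reports the path, usually /root/xorg.conf.new)
sudo X -configure
# put it where the server looks for it
sudo cp /root/xorg.conf.new /etc/X11/xorg.conf
# return to the graphical session
sudo systemctl isolate graphical.target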
Thanks.