Graphics issue w/ 11.2 & NVIDIA. Unable to get full resolution 1680x1050

Hi,

System: openSuSE 11.2/64 on dual-core AMD w/ 6GB RAM; GeForce 7300 LE; LG L222WS monitor; KDE 4.3.5

The system refuses to use the full resolution of 1680x1050/24. I have tried every trick I have been able to find (except compiling the NVIDIA driver myself). Now I’d appreciate some advice on what to try next.

In the error log Xorg.99.log I found this error:

=====
(II) Aug 05 08:55:53 NVIDIA(0): Setting mode "1400x1050"
(EE) Aug 05 08:55:53 NVIDIA(0): Failed to allocate primary buffer: out of memory.
(EE) NVIDIA(0): *** Aborting ***

Fatal server error:
AddScreen/ScreenInit failed for driver 0

Please consult the The X.Org Foundation support
at http://wiki.x.org
for help.
Please also check the log file at "/var/log/Xorg.99.log" for additional information.

(There might be more interesting lines: please just give me a hint on what to look for…)

It all used to work: smoothly up to openSUSE 10.3, and with some tweaking on 11.2 (I had to manually edit xorg.conf to force the higher resolution).
The problem appeared when I recently updated the NVIDIA driver to 256.35. Now I can’t even tweak it into working anymore.

Any ideas how to solve this?

Thanks in advance,

Hank

  1. It might be a good idea to upload your complete /var/log/Xorg.0.log to pastebin (or similar), then post the link to it here. This may help others offer potential solutions to your problem.

  2. There are lots of threads on the subject of nvidia and resolution issues if you search. (I’ve answered a number of them over recent months.)

  3. Many nvidia users have found it necessary to use nvidia-xconfig (at runlevel 3, as root) to generate a working xorg.conf file for the proprietary driver to use, and then use the nvidia-settings GUI from the desktop (runlevel 5) to tweak their graphics configuration; a sketch of the full sequence follows below. You’ll need root privileges for nvidia-settings:

kdesu nvidia-settings
gnomesu nvidia-settings
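
A minimal sketch of that runlevel-3 sequence, for anyone following along (the backup filename here is my own choice; adjust paths as needed):

init 3                                          # drop to text mode (runlevel 3)
cp /etc/X11/xorg.conf /etc/X11/xorg.conf.bak    # back up any existing config first
nvidia-xconfig                                  # let the proprietary driver write a fresh xorg.conf
init 5                                          # back to the graphical runlevel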

Thanks for the reply - I was hoping you would show up, since I have read through some of your previous comments. :)

  1. Ok, I’ll do that later today.
  2. Yes, I’ve dug through all the threads I could find - even on other distros - but haven’t found a solution yet.
  3. The first-time setup of X was done from RL 3, and I have tried both SaX2 and nvidia-xconfig (and then the GUI for tweaking).

Are you using the nvidia driver precompiled for openSUSE, or did you compile it yourself?

Here’s the link to the current Xorg.0.log: pastebin - Hank555-Xorg.0.log - post number 1914416

Dorax: I use the openSUSE precompiled driver.

This looks relevant to me:

(WW) Aug 10 16:27:56 NVIDIA(GPU-0): Unable to read EDID for display device CRT-0
(II) Aug 10 16:27:56 NVIDIA(0): NVIDIA GPU GeForce 7300 LE (G72) at PCI:7:0:0 (GPU-0)
(--) Aug 10 16:27:56 NVIDIA(0): Memory: 524288 kBytes
(--) Aug 10 16:27:56 NVIDIA(0): VideoBIOS: 05.72.22.43.00
(II) Aug 10 16:27:56 NVIDIA(0): Detected PCI Express Link width: 16X
(--) Aug 10 16:27:56 NVIDIA(0): Interlaced video modes are supported on this GPU
(--) Aug 10 16:27:56 NVIDIA(0): Connected display device(s) on GeForce 7300 LE at PCI:7:0:0:
(--) Aug 10 16:27:56 NVIDIA(0):     CRT-0
(--) Aug 10 16:27:56 NVIDIA(0): CRT-0: 400.0 MHz maximum pixel clock
(II) Aug 10 16:27:56 NVIDIA(0): Assigned Display Device: CRT-0
(WW) Aug 10 16:27:56 NVIDIA(0): No valid modes for "1680x1050"; removing.
(WW) Aug 10 16:27:56 NVIDIA(0): No valid modes for "1600x1024"; removing.

i.e. it appears it cannot identify the monitor attached to this PC, and hence the display device it is assigning as a workaround does not have the resolution you want.

There are ways to tell openSUSE how to ID the monitor, but I don’t know how to do it. Hopefully (if my assessment is correct) someone else who knows how to do this will chime in.

Oldcpu,
I don’t know if this monitor model (LG Flatron L222WS) sends a valid EDID. It is not listed in the monitor list in SaX2, for example.
I also tried it on a Windoze computer - it is not correctly recognized there, either.
Up to around openSUSE 10.3 (whatever version the NVIDIA driver was at that time) I just had to give the correct size, resolution and horizontal & vertical sync parameters (in SaX2) and then it worked.
Since 11.2 I had to manually tweak the xorg.conf, but still managed to get the full resolution. Now that is gone.
I think the second-to-last line in the clip you inserted - ‘No valid modes for "1680x1050"; removing’ - is interesting.
The question is:
Why doesn’t the controller find a valid mode for that resolution? Or:
Is there a way to force the controller to 1680x1050?

Since 11.2 I had to manually tweak the xorg.conf, but still managed to get the full resolution. Now that is gone.

Is there a way to force the controller to 1680x1050?

There are a few different approaches that can be used, with the most technical being to load a valid EDID from a file, instead of via your monitor. It involves extracting the buggy EDID and adjusting it with another utility. I’m reluctant to point you in that direction just yet, although another nvidia power-user may want to assist here.

Instead, I’m going to encourage you to try tweaking some configuration files in the /etc/X11/xorg.conf.d/ directory. If you created an xorg.conf with Xorg -configure or nvidia-xconfig, you may need to remove it (or rename it first).

Try editing /etc/X11/xorg.conf.d/50-monitors.conf to include the following:

Section "Monitor"
Identifier "CRT-0"
Option "DPMS"
Modeline "1680x1050" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 +hsync +vsync
Option "PreferredMode" "1680x1050"
EndSection

Note the modeline was generated with the gtf utility via a terminal with

gtf 1680 1050 60
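
For reference, its output should look roughly like this (the numbers match the modeline above):

# 1680x1050 @ 60.00 Hz (GTF) hsync: 65.22 kHz; pclk: 147.14 MHz
Modeline "1680x1050_60.00"  147.14  1680 1784 1968 2256  1050 1051 1054 1087  -HSync +Vsync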

You may want to adjust the parameters and add to 50-monitors.conf accordingly.

Next edit 50-screen.conf, adding these lines just before the closing EndSection of the existing Screen section:

        DefaultDepth 24
        Option       "NoVirtualSizeCheck"
EndSection
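
For context, the complete section might then look something like this (the Identifier/Device/Monitor names here are assumptions - match them to whatever your file already contains):

Section "Screen"
        Identifier   "Default Screen"
        Device       "Default Device"
        Monitor      "CRT-0"
        DefaultDepth 24
        Option       "NoVirtualSizeCheck"
EndSection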

When done, restart the X-server with CTRL-ALT-BACKSPACE twice. See how that goes.

BTW, some useful references:

http://www.linuxjournal.com/content/guerrilla-tactics-force-screen-mode-ubuntu

http://wiki.archlinux.org/index.php/NVIDIA

I have no xorg.conf.d - isn’t that coming with 11.3? (I’m running 11.2.) Therefore I couldn’t find the 50-monitors.conf either.

I tried the Modeline with the new parameters, restarted X and tried to set the resolution using the nvidia-settings GUI.

The following error message came up:

“Failed to set MetaMode (1) ‘CRT-0: 1680x1050_60 @1680x1050 +0+0’ (Mode 1680x1050, id: 85) on X screen 0
Would you like to remove this MetaMode?”

The Xorg.0.log seems mainly the same, except for three new lines at the bottom:

(II) Aug 11 00:00:28 NVIDIA(0): Setting mode "1400x1050"
(EE) Aug 11 00:00:28 NVIDIA(0): Failed to allocate primary buffer: out of memory.
(EE) NVIDIA(0): *** Aborting ***

BTW: this was the way I tweaked the xorg.conf before: adding a Modeline to xorg.conf. But I had slightly different parameters, which I found in another thread regarding the same monitor. I didn’t know of the gtf utility - thanks for that!

I have no xorg.conf.d - isn’t that coming with 11.3? (I’m running 11.2.) Therefore I couldn’t find the 50-monitors.conf either.

I tried the Modeline with the new parameters, restarted X and tried to set the resolution using the nvidia-settings GUI.

Sorry, my mistake here. The same lines can be added to the appropriate xorg.conf sections, of course. I wonder if adding a Virtual line to the Screen section might also help (just a guess here):

Virtual 1680 1050
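
Strictly speaking, Virtual lives inside the Display subsection of the Screen section, so the placement would be along these lines (the Depth value is just the 24-bit depth used elsewhere in this thread):

SubSection "Display"
        Depth   24
        Virtual 1680 1050
EndSubSection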

BTW: this was the way I tweaked the xorg.conf before: adding a Modeline to xorg.conf. But I had slightly different parameters, which I found in another thread regarding the same monitor.

Can you locate that info and try again?

Another idea regarding nvidia-xconfig

nvidia-xconfig(1) - Linux man page

It has options to ignore EDID frequencies, etc.:

--use-edid-freqs, --no-use-edid-freqs
Enable or disable use of the HorizSync and VertRefresh ranges given in a display device’s EDID, if any. EDID provided range information will override the HorizSync and VertRefresh ranges specified in the Monitor section. This option defaults to TRUE (the NVIDIA X driver will use frequency information from the EDID, when available).

Might be worth a try; then edit xorg.conf manually and add the custom modeline etc. (as explained previously).
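
A minimal sketch of that combination (the sync ranges here are placeholders - take the real ones from the monitor’s spec sheet):

nvidia-xconfig --no-use-edid-freqs   # regenerate xorg.conf, ignoring the EDID's frequency ranges

and then, in the Monitor section of /etc/X11/xorg.conf:

        HorizSync    30-80
        VertRefresh  48-75

plus the custom modeline from gtf, as explained previously.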

I’ve tried both ignoring only the EDID frequencies and the alternative of ignoring the whole EDID.
Also added the line Virtual 1680 1050 to the “Screen” section.
The problem is still there.

I found the other Modeline parameters in a couple of places (and I’ve tried those values several times, too).
One was in a thread here on the openSUSE forum; another place, with some good elaboration on the same issue, I found here:
https://bugs.launchpad.net/ubuntu/+source/xorg-server/+bug/212018

I have also tried the xrandr command (always with the option --dryrun) but haven’t managed to fix it that way, either.
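
For reference, this is the kind of sequence I was trying (a sketch only - the modeline numbers come from gtf, and CRT-0 is an assumption for the output name, which should be taken from xrandr -q; the proprietary driver’s limited RandR support may refuse it anyway):

xrandr --newmode "1680x1050_60" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 -HSync +VSync
xrandr --addmode CRT-0 "1680x1050_60"
xrandr --output CRT-0 --mode "1680x1050_60" --dryrun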

BTW: running the command xrandr --verbose I get a list of display modes, the last one on the list being 1680x1050, but with horizontal & vertical sync frequencies outside the range of the monitor.
Does anyone know if I could change them?

One more thing: Does anyone have a valid EDID binary file for this monitor? (LG Flatron L222WS)

BTW: running the command xrandr --verbose I get a list of display modes, the last one on the list being 1680x1050, but with horizontal & vertical sync frequencies outside the range of the monitor.
Does anyone know if I could change them?

Did you try explicitly adjusting them in your xorg.conf monitor section?


        HorizSync    30-80
        VertRefresh  48-75
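
Slotted into the Monitor section alongside a gtf modeline, that would look something like this (the range values should be checked against the L222WS spec sheet):

Section "Monitor"
        Identifier   "CRT-0"
        HorizSync    30-80
        VertRefresh  48-75
        Modeline     "1680x1050_60" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 -HSync +Vsync
        Option       "PreferredMode" "1680x1050_60"
EndSection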

FWIW, I found these threads concerning similar LG displays:

[[SOLVED] DVI-D Signal problem (7600gs & lg l204wt) [Archive] - Ubuntu Forums](http://ubuntuforums.org/archive/index.php/t-565750.html)

How To Run An LG L204WT FLATRON LCD Monitor On X Windows - Linux Forums

I reckon this edid.bin file would be OK (despite the warning), because the native resolution (1680x1050@60Hz) is the same:

nV News Forums - View Single Post - 100.14.19 and XFX 6800XT

THANKS deano_ferrari!

The EDID-file for L204WT finally did it!

Here goes:

  1. Downloaded and extracted the edid file for the L204WT (edid.bin) and stored it in /etc/X11
  2. Booted up to RL 3 and logged in as root
  3. Ran nvidia-xconfig with the following options:

nvidia-xconfig -o xorg.conf.tst --custom-edid=CRT-0:/etc/X11/edid.bin

As you can see, I actually started from my existing xorg.conf. I wrote a new file (.tst) just to be able to compare the differences exactly, as I was curious.

  4. Ran the gtf command to get the correct modeline parameters:

gtf 1680 1050 60

  5. Manually added the following two lines to xorg.conf.tst, section "Monitor":

Modeline "1680x1050_60" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 -HSync +Vsync
Option "PreferredMode" "1680x1050_60"

  6. Copied xorg.conf.tst to /etc/X11/xorg.conf

  7. Started X - and now X even defaults to 1680x1050 @ 24-bit depth!
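
A quick sanity check I used to confirm the mode really took (nothing official, just grep and xrandr):

grep -i "1680x1050" /var/log/Xorg.0.log   # the driver should now log "Setting mode" with it
xrandr -q | head                          # the current resolution as X reports it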

Of course the nvidia-settings GUI now says the monitor is an "L204WT", which is incorrect, and since I am vain I’ll try to change that sometime later on.
(Need to find & recalculate the checksum bytes.)
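
In case it helps anyone else: the checksum rule itself is simple - each 128-byte EDID block must sum to 0 mod 256, with byte 127 acting as the balancing checksum. A rough one-liner to recompute it for an edited file (assuming a standard single-block 128-byte edid.bin and GNU od/awk):

od -An -tu1 -N127 edid.bin | awk '{ for (i = 1; i <= NF; i++) s += $i } END { printf "checksum byte: 0x%02X\n", (256 - s % 256) % 256 }'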

But now it’s up and running as it should.

Great thanks again!

That’s a good result! I figured (from searching) that it would offer a quick, practical solution to your problem, rather than a likely fruitless search for EDID info for your monitor. (Now you can hack it at your leisure if you must edit the monitor model.) lol!

Quite often monitors come with CDs/DVDs containing edid.bin files - but who ever keeps those??!! I’ve been thinking recently that it would be nice to find/create a utility that could generate edid.bin files based on a set of desired resolutions. This would help others using monitors with buggy EDID in their firmware.

A possible openFate submission? (Although it may be too Linux-general and too hardware-specific/difficult??)

A possible openFate submission? (Although it may be too Linux-general and too hardware-specific/difficult??)

Good suggestion, oldcpu. I’ll give it a go.

Well, after some digging and hex-editing I managed to put together a "synthetic" EDID file for the LG Electronics Flatron L222WS.
…complete with my monitor’s serial number, mfg date, screen dimensions… :P
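
If anyone wants to repeat this: the edited file can be sanity-checked before use with parse-edid from the read-edid package (assuming it is installed):

parse-edid < /etc/X11/edid.bin   # decodes the binary and prints the modes it advertises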

Try checking in YaST under Hardware -> Hardware Information -> Monitor and see what it says. But ultimately using SaX2 fixes the problem: insert the driver disk from LG and it should automatically detect the monitor when you select "have utility disk".