11.3 and Monitor Detection

I opted to blow away my system (no, I did not save xorg.conf) and do a clean install of 11.3. Not a big deal to blow it away; nothing was lost.

Dell Dimension C521, 3GB RAM, Athlon DualCore
OpenSuse 11.3 x64
Nvidia GeForce 6150LE (on mainboard)
Dell 2048FP 24" Monitor, connected via VGA

During the install, everything appeared to go well. I added the “nomodeset” boot option after the first boot to avoid the graphics issues.

KDE comes up in 800x600 mode at 50 Hz, and that is the best resolution it will offer; all other options in the Configure Desktop applet are lower. This monitor was driven at 1920x1200 on the same hardware with 11.2.

I loaded the latest nVidia native driver (and the previous one as a test), and when they are loaded, the resolution jumps to 1024x768…better, but not good.

Running xrandr shows:
Screen 0: minimum 320 x 240, current 1024 x 768, maximum 1024 x 768
default connected 1024x768+0+0 0mm x 0mm
1024x768 50.0*
800x600 51.0 52.0 53.0
640x480 54.0
512x384 55.0
400x300 56.0 57.0 58.0
320x240 59.0

In the nVidia-Settings applet (run as root), it shows the display as “CRT-0”. 11.2 showed it as a Dell…

Any ideas on how to make it detect the monitor properly?

Thx in advance

openSUSE Graphic Card Practical Theory Guide for Users

Try following the instructions from this link:

SDB:Configuring graphics cards - openSUSE

I loaded the latest nVidia native driver (and the previous one as a test), and when they are loaded, the resolution jumps to 1024x768…better, but not good.

Hi mhibist.

See if this thread helps. Specifically, post #25 (by whych) may be helpful to your situation. The idea is to generate a basic xorg.conf including your desired display mode(s), along with the relevant modeline(s). This legacy approach may work for you too. If you need more help with this, post again. You will need to create the file with an editor run with superuser privileges:

kdesu kwrite /etc/X11/xorg.conf

Running ‘nvidia-xconfig’ from a terminal (as root), then restarting the X-server may also work for you.
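For reference, a bare-bones xorg.conf for the nvidia driver looks something like the sketch below. The identifiers and the sync ranges are illustrative only — substitute the values from your monitor’s spec sheet, and add your generated modeline(s) to the Monitor section:

```
Section "Monitor"
    Identifier  "Monitor0"
    # Illustrative sync ranges -- use the values from your monitor's spec sheet
    HorizSync   30.0 - 81.0
    VertRefresh 56.0 - 76.0
    # Add your generated modeline(s) here
EndSection

Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"
    Monitor      "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "nvidia-auto-select"
    EndSubSection
EndSection
```

Once you have a modeline in the Monitor section, replace "nvidia-auto-select" in the Modes line with that modeline’s identifier and restart the X server.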

Thanks for the insight… I looked some of that over and I can’t make it work. However, I did find this in Xorg.0.log:

[    17.114] (**) NVIDIA(0): Enabling RENDER acceleration
[    17.114] (II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
[    17.114] (II) NVIDIA(0):     enabled
[    17.613] (WW) NVIDIA(GPU-0): Unable to read EDID for display device CRT-0
[    17.614] (II) NVIDIA(0): NVIDIA GPU GeForce 6150 LE (C51) at PCI:0:5:0 (GPU-0)
[    17.614] (--) NVIDIA(0): Memory: 524288 kBytes
[    17.614] (--) NVIDIA(0): VideoBIOS: 05.51.28.42.00
[    17.614] (--) NVIDIA(0): Interlaced video modes are supported on this GPU
[    17.614] (--) NVIDIA(0): Connected display device(s) on GeForce 6150 LE at PCI:0:5:0:
[    17.614] (--) NVIDIA(0):     CRT-0
[    17.614] (--) NVIDIA(0): CRT-0: 350.0 MHz maximum pixel clock
[    17.615] (II) NVIDIA(0): Assigned Display Device: CRT-0
[    17.615] (==) NVIDIA(0): 
[    17.615] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
[    17.615] (==) NVIDIA(0):     will be used as the requested mode.

The EDID is not being read, which I suspect is the root of the problem. Unfortunately, I do not know the EDID info to generate a custom one with.

The other odd thing is that it’s reporting 512MB of memory… the BIOS has it configured at 128MB (the max).

Thx again

Do you have a spec sheet for your monitor? I’m assuming it may be capable of 1920x1200 (based on similar models with this display resolution).

You can generate a modeline for a given display mode with the gtf utility. For example:

dean@linux-8dcq:~> gtf 1920 1200 60

  # 1920x1200 @ 60.00 Hz (GTF) hsync: 74.52 kHz; pclk: 193.16 MHz
  Modeline "1920x1200_60.00"  193.16  1920 2048 2256 2592  1200 1201 1204 1242  -HSync +Vsync

Note the identifier “1920x1200_60.00”. (I would change that to “1920x1200”; it’s then easier to reference in the Screen section.) Then add that to xorg.conf along with the other changes.
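Putting that together, the relevant xorg.conf fragments would look something like this (a sketch only — the section identifiers are examples, and the modeline is the gtf output from above with the shortened identifier):

```
Section "Monitor"
    Identifier "Monitor0"
    # gtf output, with the identifier shortened to "1920x1200"
    Modeline "1920x1200"  193.16  1920 2048 2256 2592  1200 1201 1204 1242  -HSync +VSync
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Monitor    "Monitor0"
    SubSection "Display"
        Modes "1920x1200"
    EndSubSection
EndSection
```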

Deano,

Thanks for the info…I finally threw up my hands and installed a GeForce 8400GS based card ($35), and WHAM! 1920x1200 resolution. The EDID was read just fine through the DVI port. My guess is an issue between Suse and the GeForce 6150LE controller.

Now I have another issue…memory leak, but that is another thread…lol

Thanks again!

My guess is an issue between Suse and the GeForce 6150LE controller.

More likely an issue between the driver and the gfx card.

Good luck with the other issue.