I’m having to use my flat-screen TV as my monitor for the time being, as I’ve lent my main monitor to a friend. Not a problem, it all looks nice and smart on the desk. However, the nvidia (or nouveau) driver cannot obtain an EDID for it, as I can see from this snippet of the /var/log/Xorg.0.log file:
(WW) NVIDIA(GPU-0): The EDID read for display device CRT-1 is invalid:
(WW) NVIDIA(GPU-0): unrecognized EDID Header.
As such I am stuck with the default 1024x768 resolution. Better than nothing, yes, but really hard on my eyes. I have consulted the TV’s manual, and the highest (and recommended) resolution is a respectable 1280x768 @ 60Hz. I have tried to play around with xrandr, but this seems to result in an error no matter what I try.
This is a relatively fresh install of OpenSUSE 11.4 coming back after trying 12.1 M1. The nvidia drivers were installed via the suggested one-click method.
Excuse my waffling on, any help with this will be greatly appreciated, I will provide any other logs etc.
Thanks for the update Tom. I’m sure this will help others with nvidia hardware and EDID issues. There is development of SaX3 currently, which may provide a graphical means of doing the same thing. (At least I hope it will).
Thanks deano_ferrai for your support in Graphical resolution issues in MANY threads in addition to this one. MOST appreciated.
I also have hopes for SaX3, but I am concerned it may lack features that would not be too difficult to implement, simply because the coder (of SaX3) may not know of all the solutions users like yourself understand. Unfortunately I know of no way to feed such suggestions to the coder. Nor do I know if they are even interested in learning of the resources we have of users who may be able to help with ideas/possible solution methodology (that may or may not be codeable).
I left a message, which I hope Manu Gupta will read. I know that there are many tricks/tips concerning X.Org configuration, some of which are driver-specific. However, I’m sure that whatever stage this code gets to, it will provide a useful framework on which to build if necessary.
Nor do I know if they are even interested in learning of the resources we have of users who may be able to help with ideas/possible solution methodology (that may or may not be codeable).
Most of the config tweaks are fairly easy to code; however, because of the ever-changing development of the graphics drivers, the config solutions can vary accordingly.
Great news, and I wish you all the best in your efforts. Deano_ferrai and also please_try_again are pretty good at this sort of tuning, so monitoring the threads where they provide help will likely save you the most time.
Again, good luck in your efforts. Many of our users are excited about this.
I’m glad to see that this thread is still getting attention, as I have another problem, ha ha. Okay, so I have sorted the resolution problem by using the cvt utility to create a modeline and then manually adding it to the xorg.conf I kindly asked nvidia-settings to generate for me, as suggested by deano. However, even though I am using the recommended refresh rate of 60Hz, it is incredibly fuzzy and seriously strains my eyes whilst reading. If I change the refresh rate to a lower 50Hz, all is beautiful, apart from the missing section on the left-hand side. My question is this: are there any tools that enable me to compensate for this, other than nvidia’s overscan, which only seems to hurt more?
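For anyone following along, the cvt step looks roughly like this (the output shown is illustrative of what cvt typically prints for this mode; trust the numbers from your own run):

```shell
# Generate a CVT modeline for 1280x768 at 60 Hz
cvt 1280 768 60
# Typical output (illustrative):
#   # 1280x768 59.87 Hz (CVT) hsync: 47.78 kHz; pclk: 79.50 MHz
#   Modeline "1280x768_60.00"  79.50  1280 1344 1472 1664  768 771 778 798 -hsync +vsync
```

The generated Modeline line is then pasted into the Monitor section of xorg.conf, with its quoted name referenced from the Screen section.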
I’d like to say that I myself am looking forward to SaX3. I loved SaX2, as it was a major part of my getting used to Linux back when I was 14/15. I thought it was marvelous that openSUSE 10.3, particularly the SLICK variant, came with everything, including a tool to get your screen looking nice and tidy.
However, even though I am using the recommended refresh rate of 60Hz, it is incredibly fuzzy and seriously strains my eyes whilst reading. However, if I change the refresh rate to a lower 50Hz, all is beautiful, apart from the missing section on the left-hand side.
I don’t really understand this, as refresh rate will not contribute to ‘fuzzy’ viewing. However, maybe the TV does some virtual scaling (effective display mode change), when not driven at the correct refresh rate. The display mode (resolution) is usually of prime importance here. For LCD/LED/Plasma technologies, operating at the native resolution is key to a sharp image. Dropping to lower resolutions can cause aliasing, which results in a less sharp image. Generally, any modeline will only be used if it is valid for both graphics card and monitor, and if you have the modeline reference name in the screen section to match.
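As a sketch, the relevant hand-edited sections might look like this (identifiers and modeline values are examples, not taken from the poster’s actual config):

```
Section "Monitor"
    Identifier "Monitor0"
    # Modeline generated with: cvt 1280 768 60
    Modeline "1280x768_60.00"  79.50  1280 1344 1472 1664  768 771 778 798 -hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "Monitor0"
    SubSection "Display"
        # Must match the quoted Modeline name above
        Modes "1280x768_60.00"
    EndSubSection
EndSection
```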
My question is this: are there any tools that enable me to compensate for this, other than nvidia’s overscan, which only seems to hurt more?
For the proprietary nvidia driver, it is best to use nvidia-xconfig and/or nvidia-settings utilities to make any config changes. The hand-editing of config files is usually only used as a last resort.
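For example (assuming a standard install; both utilities ship with the proprietary driver):

```shell
# Generate or update /etc/X11/xorg.conf from the current configuration
sudo nvidia-xconfig
# Or make changes interactively and save them from the GUI
nvidia-settings
```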
Thanks again for the help. What I’m basically saying is that the recommended settings from the manual for resolution and refresh rate are 1280x768 @ 60Hz. However, running the monitor at 60Hz results in a VERY fuzzy image, and I can’t find any settings at all on the monitor to change anything to do with the PC input. When run at the same resolution at a lower 50Hz, everything is sharp and crisp, but the image moves over to the left and cuts off some of the left-hand side of the screen.
Any further help with this is greatly appreciated.
The offset you’re describing at 50Hz will be down to the hardware (TV) concerned. I’m not sure that a simple modeline adjustment will help here. In the old days with CRT monitors, one could use ‘xvidtune’ to adjust the timings to move the image left or right for example. From that one could tweak the modeline to provide the best image. (For LCD and plasma screens, this is not really applicable AFAIU). I’m not sure why the 60Hz refresh rate does not offer a clear image either. Sometimes, examining /var/log/Xorg.0.log is worthwhile to see what modes are detected and any errors reported during the detection.
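A quick way to pull the relevant lines out of the log (a sketch; adjust the pattern to taste):

```shell
# Show mode validation results, EDID messages, and warnings/errors from the X log
grep -iE 'modeline|edid|\(WW\)|\(EE\)' /var/log/Xorg.0.log
```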
Have a look at the output of ‘xrandr’ to see what other modes (if any) are available. Are you sure that 1280x768 is the native resolution for your TV, and not something higher?
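If a usable mode is missing from that list, the usual xrandr sequence for testing one looks like this (the output name CRT-1 is taken from the log snippet earlier in the thread; the timing numbers are illustrative cvt output). Be aware that the proprietary nvidia driver of this era may simply ignore modes added this way, in which case the xorg.conf route is needed:

```shell
xrandr                                  # list outputs and available modes
# Define a new mode from a cvt-generated modeline (name and numbers illustrative)
xrandr --newmode "1280x768_60" 79.50 1280 1344 1472 1664 768 771 778 798 -hsync +vsync
xrandr --addmode CRT-1 "1280x768_60"    # attach it to the output
xrandr --output CRT-1 --mode "1280x768_60"   # switch to it
```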
Thanks for the info and for the further help. I am a hundred percent sure that this is the resolution, unless Daewoo have their documentation wrong, which wouldn’t surprise me at all. It is clearly listed in their manual as 1280x768 at a refresh rate of 60Hz. It’s most strange, considering the TV runs my Xbox 360 at a nice 720p resolution beautifully.
I’m going to risk a 1366x768 push, but I doubt it will work.
It is clearly listed in their manual as 1280x768 at a refresh rate of 60Hz
That makes sense. For that, you could try two different cvt-generated modelines. Using the ‘-r’ switch (at 60Hz), you can get a ‘reduced blanking’ modeline, more suitable for modern flat-panel displays.
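Concretely, the two variants are generated like this (note that cvt only produces reduced-blanking timings at 60Hz):

```shell
# Standard CVT timing for 1280x768 @ 60 Hz
cvt 1280 768 60
# Reduced-blanking variant: lower pixel clock and shorter blanking intervals,
# usually a better fit for fixed-pixel flat panels
cvt -r 1280 768 60
```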
Sorry for my late reply, Call of Duty was calling me ha ha. Thanks very much for the reduced blanking mode, something I will try, I think it should help as it is unfortunately an old beast, but the colour definition is amazing.
Thank you for your help, I will post back with results.
I tried the reduced-blanking modeline as you suggested; however, just as you said, it really didn’t play nice. Fonts and the overall picture were excellent, but for some reason it didn’t fit the screen properly. Most strange.
In the manual provided by the manufacturer, the horizontal frequency is listed as 47.70 kHz. Is there any way to set this manually, as it may help? I really need to get a new TV, but until I have the cash I may as well make the best of what I’ve got.
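One way to pin the sync ranges is in the Monitor section of xorg.conf. This is only a sketch: the HorizSync range is built around the 47.70 kHz figure from the manual, and the VertRefresh range is a guess wide enough to cover the 50Hz and 60Hz modes tried above:

```
Section "Monitor"
    Identifier  "Monitor0"
    # Horizontal frequency from the Daewoo manual: 47.70 kHz
    HorizSync   47.0 - 48.0
    # Guessed range covering both the 50 Hz and 60 Hz modes
    VertRefresh 49.0 - 61.0
EndSection
```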
Thank you so much for your continued efforts, deano. I appreciate them greatly.
I don’t know if having the HorizSync and VertRefresh rates specified here will help, but it seems to work for my monitor, which I spent quite a while getting sorted. nvidia-settings is clueless and virtually useless.
Try getting those rates from cvt and entering them, worth a try anyways!