How to correct font size display for a multi-monitor configuration?

I am trying out a “TwinView” configuration which was created with the tool “nvidia-settings”. The current setting works to some degree on my openSUSE 13.1/Tumbleweed system.

Unfortunately, I stumble on another annoyance when I would like to start a different desktop session. The log-in dialogue is presented by the software “KDM 4.11.10-4.6”, but the contained text is displayed in such a small font size that it is almost unreadable. The font size stays just as small in other desktop environments like “XFCE 4.10.1” and “LXDE 0.5.5”, to which I can switch from the running session “KDE 4.13.2”.
How should this display situation be improved?

You should ask Tumbleweed questions in the Tumbleweed sub-forum.

Apparently the selected DPI value is not correct.
Normally this is calculated from the screen size and the resolution.

You can tell the driver to take it only from one specific display, or you can set a specific value in xorg.conf (96 x 96 should be a good default).

See also here:
http://http.download.nvidia.com/XFree86/Linux-x86/304.121/README/dpi.html
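
For example, a minimal sketch of the relevant xorg.conf options from that README (untested here; the identifiers and the display name “DFP-0” are placeholders that depend on your hardware):

    Section "Screen"
        Identifier "Screen0"
        Device     "Device0"
        # Take the DPI from the EDID of one specific display only ...
        Option     "UseEdidDpi" "DFP-0"
        # ... or ignore the EDID data and force a fixed value instead:
        # Option "UseEdidDpi" "FALSE"
        # Option "DPI" "96 x 96"
    EndSection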

I suspect that the involved software versions are similarly affected, regardless of whether they were installed from a Tumbleweed repository or not.

Tumbleweed is different enough from the standard release that the problems may be different, no matter what app.

I would appreciate it if the appropriate settings around the technical detail “dots per inch” could be determined more reliably.

I would not know, because I do not use Tumbleweed and thus do not feel qualified to answer.
Your Tumbleweed friends are in the Tumbleweed forum, but your thread may go unnoticed by them.

gogalthorp’s remark is meant to point you to the place where you may get better answers. He is trying to help you.

Well, read the README.
It explains how the nvidia driver calculates the DPI, and what you can do to influence that.

The driver does what it does, not what you would appreciate maybe.

There are more software components involved. The current display result might also not match my expectations because of a challenging combination of technical device properties. I am curious whether the support for such a mixture can be improved.

No, there are not.

The font size in pixels is calculated from the configured font size in points and the DPI value, nothing else.
So if the DPI value is too large, your fonts will be too big, and if the DPI value is too low, the fonts are too small.
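
For illustration: the pixel size works out to points × DPI / 72, so a 10-point font at 96 DPI is 10 × 96 / 72 ≈ 13 pixels tall, while at a wrongly detected 48 DPI it shrinks to about 7 pixels.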

The current display result might also not match my expectations because of a challenging combination of technical device properties. I am curious whether the support for such a mixture can be improved.

Well, the nvidia README file explains it.
You can tell the driver to only respect one particular display, or you can explicitly set a specific DPI value.

OTOH, I’m not really sure I understand completely what you mean by your sentences.
What “challenging combination of technical device properties” or “mixture” that “can be improved” are you talking about?

  • kernel modules
  • display managers
  • X servers
  • corresponding software libraries

The font size in pixels is calculated from the configured font size in points and the DPI value, nothing else.

I imagine that a few more variables might matter for the appropriate handling of the property “dots per inch”.

What “challenging combination of technical device properties” or “mixture” that “can be improved” are you talking about?

My monitors are really different. I am not trying to combine identical models for an extended desktop at the moment.

Those are not relevant for your font size.
Of course, you have configured a font size in points (for the Login Screen and the Desktop environment). The actual font size in pixels (i.e. how large/small the font really is on the screen) is calculated by using the DPI value.
Period.

I imagine that a few more variables might matter for the appropriate handling of the property “dots per inch”.

No.
The screen has one DPI value.
This is used to calculate the font size in pixels.
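
One quick way to check which DPI value your X server currently advertises (assuming the xdpyinfo tool is installed):

    # Prints the screen dimensions and the advertised DPI, e.g.
    # "resolution:    96x96 dots per inch"
    xdpyinfo | grep -B1 resolution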

My monitors are really different. I am not trying to combine identical models for an extended desktop at the moment.

Again, that README file should tell you everything that’s possible.

If you have one X display spanning over two screens, there is no way to have two different DPI values on each of the screens AFAIK.

Apparently one of your displays (the one that the driver chooses for the DPI calculation) reports a wrong size, and that’s why your fonts are too small.
Set a higher DPI manually, or tell the driver to use the other display, and the fonts should be ok.

If you don’t want to accept that, well, it’s your problem.
I cannot help you any further.

One quote from the nvidia README though, as you don’t seem to want to read it yourself:

The DPI of an X screen can be poorly defined when multiple display devices are enabled on the X screen: those display devices might have different actual DPIs, yet DPI is advertised from the X server to the X application with X screen granularity. Solutions for this include:
  • Use separate X screens, with one display device on each X screen; see Chapter 15, “Configuring Multiple X Screens on One Card” for details.
  • The RandR X extension version 1.2 and later reports the physical size of each RandR Output, so applications could possibly choose to render content at different sizes, depending on which portion of the X screen is displayed on which display devices. Client applications can also configure the reported per-RandR Output physical size. See, e.g., the xrandr(1) ‘--fbmm’ command line option.
  • Experiment with different DPI settings to find a DPI that is suitable for all display devices on the X screen.

We have different expectations for the handling of the “X screen granularity”. The RandR X extension (version >= 1.2) might offer another possibility.

If you don’t want to accept that, well, it’s your problem.

I can also look a bit more at the consequences of the configuration variant “multiple X screens on one graphics card”.

We?
I don’t have any expectations at all regarding this.

And actually this is not about expectations, but about technical possibilities, no?

The RandR X extension (version >= 1.2) might offer another possibility.

According to the nvidia README, yes.
So have a look at “man xrandr”.
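
For example (a sketch only; the millimetre values are an assumption matching a 1920x1080 framebuffer at roughly 96 DPI):

    # 1920 px / (508 mm / 25.4) = 96 DPI horizontally,
    # 1080 px / (285 mm / 25.4) ≈ 96 DPI vertically
    xrandr --fbmm 508x285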

I can also look a bit more at the consequences of the configuration variant “multiple X screens on one graphics card”.

Yes.
With two separate X screens it should be possible to have different DPI values.
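
A minimal sketch of what that could look like in xorg.conf (untested; the BusID, identifiers, and DPI values are placeholders, and Chapter 15 of the nvidia README is the authoritative reference):

    # Two X screens on one card, each with its own forced DPI value
    Section "Device"
        Identifier "Card0-0"
        Driver     "nvidia"
        BusID      "PCI:1:0:0"
        Screen     0
    EndSection

    Section "Device"
        Identifier "Card0-1"
        Driver     "nvidia"
        BusID      "PCI:1:0:0"
        Screen     1
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Device     "Card0-0"
        Option     "UseEdidDpi" "FALSE"
        Option     "DPI" "96 x 96"
    EndSection

    Section "Screen"
        Identifier "Screen1"
        Device     "Card0-1"
        Option     "UseEdidDpi" "FALSE"
        Option     "DPI" "144 x 144"
    EndSection

    Section "ServerLayout"
        Identifier "Layout0"
        Screen     0 "Screen0"
        Screen     1 "Screen1" RightOf "Screen0"
    EndSection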

But as I wrote already, I cannot help you any further, especially as you don’t seem to be satisfied with setting one specific DPI value for both monitors.
I only have (and have only ever had) single-monitor systems.

On 07/11/2014 12:56 PM, elfring pecked at the keyboard and wrote:
> gogalthorp;2653479 Wrote:
>> You should ask Tumbleweed questions in the Tumbleweed sub-forum.
> I suspect that the involved software versions are similarly affected,
> regardless of whether they were installed from a Tumbleweed repository
> or not.

You can always ask your question without mentioning Tumbleweed if you
truly believe the problem is NOT Tumbleweed-related; that way, all of the
people here who fear Tumbleweed won’t tell you to bugger off to the
Tumbleweed list.