"White dots" on screen

I’m having a very odd issue with openSUSE (well, Linux in general) on one of my laptops, and I’m hoping someone has an idea of how to resolve this or troubleshoot it further. I am running openSUSE on a Dell G7 7588 (i7-8750H, 8GB of RAM, and Nvidia GTX 1060/Intel UHD 630 Optimus graphics). I’m currently running Tumbleweed with KDE, but I have seen the issue on other distributions’ live systems (Ubuntu/Kubuntu, Manjaro Xfce).

The issue I am experiencing is “white dots” moving around my screen. I have not been able to take a video where they actually show up since they are rather dim, but essentially it looks like individual pixels are turning white for a fraction of a second. It’s similar to this video, but much, much less severe. It’s really only visible on a dark or black background and the pixels are not nearly as bright even with the backlight up all the way. The issue is not visible in grub, but as soon as the system starts loading (whether with plymouth enabled or disabled), even before X starts up, I can see the issue.

I’ve done some Googling that has led me down a couple of paths, but here’s what I have so far:

  • The system is not overheating - the GPU and CPU are both hovering around 50-55C with peaks around 65C, but the issue occurs consistently.
  • This occurs whether I’m using nouveau or nvidia, and whether I have suse-prime set to use the Intel or Nvidia graphics.
  • This seems to be a problem only with the internal display. I cannot capture it in a screenshot/video capture. I’ve connected an HDMI display, and even when mirroring the display I can see the problem on my laptop display but not on the external display.
  • The issue does not occur in Windows, so I do not think the cable or the internal display itself is failing.

Any suggestions as to what I can do to troubleshoot this further? Thank you!

Googling “white dots nvidia bug” (and similar) turns up lots of results, indicative of a possible hardware/firmware issue that, realistically, only Nvidia can resolve. That said, you mention that you’re using KDE. You could check whether it is apparent with another desktop environment, and/or perhaps try changing the KDE rendering backend.

Apart from what deano_ferrari says, have a look at the compositor settings and experiment with them. You may need OpenGL 3.x.

That’s also what I was referring to with respect to the rendering backend. :wink:

Thanks, both. As mentioned in my original post I have used Xfce and the issue still occurs there. It also occurs before X even loads, so it is not related to the compositor at all.

I dug through the first few results for “white dots Nvidia bug” and all the issues ended up being hardware related (bad cable or bad GPU). Given that the system still functions perfectly in Windows that is not the case here.

That leads me to a driver bug, but it’s occurring on both nouveau and nvidia (sure, that’s possible) and only affects the internal display and not an external display.

To further rule out a hardware fault (although that’s very much what I would have suspected had you not said it doesn’t exhibit the problem when running MS Windows): go into the BIOS setup. If it’s not hardware related, you shouldn’t see the “dots” there either. OTOH, if they are present… then I guess you are looking at a hardware problem: display panel, driver/multiplexer hardware, video RAM…

Yeah, no issues in the BIOS setup (or UEFI, whatever), and none in GRUB either.

Since this appears to be Optimus-based graphics: have you installed either suse-prime or bumblebee to manage it?

Optimus is a strange setup: the Intel GPU does all the output, but the NVIDIA GPU can optionally do the rendering and pass it through the Intel GPU. It definitely takes special care and feeding, because NVIDIA is stuck with an old architecture in which part of the X stack is replaced, which then interferes with the other brand’s GPU process.

Yeah, I mentioned in my post that I am using suse-prime. The glitch happens whether it is set to Nvidia or Intel.

I remember those glitches from way back when I tried to tune some of those CrazyDots/PixelWonder and Tseng ET-4000 videocards in my Atari Mega ST to the maximum, most flicker-free sync rates and pixel resolutions back in the 1990s, and later with Matrox-based Linux boxes (remember Matrox? Man, I’m old). Maybe with this problem, we have to go old-school again and analyse the video timings.

What are your numbers returned from xvidtune and the Xserver logs?

rig:~ ▶ **xvidtune -show**
"1920x1200"   154.00   1920 1968 2000 2080   1200 1203 1209 1235 +hsync -vsync

rig:~ ▶ **grep -i hz ~/.xsession-errors /var/log/Xorg.0.log**
/var/log/Xorg.0.log:     2.297] (--) NVIDIA(GPU-0): Eizo S2402W (DFP-0): 330.0 MHz maximum pixel clock
/var/log/Xorg.0.log:     2.297] (--) NVIDIA(GPU-0): DFP-1: 165.0 MHz maximum pixel clock
/var/log/Xorg.0.log:     2.297] (--) NVIDIA(GPU-0): DFP-2: 1440.0 MHz maximum pixel clock
/var/log/Xorg.0.log:     2.297] (--) NVIDIA(GPU-0): DFP-3: 165.0 MHz maximum pixel clock

That’s probably because GRUB re-uses a »safe« VESA timing provided by the computer’s BIOS — or one of the power-on default timings of the video hardware, usually some 80×25 character mode. Plymouth then tries, at the earliest point during boot, to set the standard resolution you specified in the Xorg config — or whatever YaST detected the safe and native resolution to be for your system. Maybe YaST detected something wrong, and now you have display noise. (Windows video drivers may play it safe and not drive the video hardware as close to its limit.)
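One way to test this early-boot-mode theory (an assumption on my part, not something from this thread) is to force a known-good mode via the kernel’s `video=` parameter, which the KMS drivers honour before X starts. The connector name `eDP-1` below is a guess; the actual names can be read from `/sys/class/drm/` or the `xrandr` output.

```
# /etc/default/grub : sketch only; the eDP-1 connector name is an assumption
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash video=eDP-1:1920x1080@60"

# afterwards, regenerate the GRUB config (openSUSE):
#   sudo grub2-mkconfig -o /boot/grub2/grub.cfg
```

If the dots disappear with a forced 60 Hz mode that early in boot, that would point at the auto-detected timing rather than the hardware.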

Maybe, though, your X server tries to use refresh rates of 75Hz (for CRT monitors) instead of the LCD-typical 60Hz. Let’s fire up xvidtune again, but without parameters this time:

What does it say on lower-right, how many Hertz on your system?

(You can also try to play around further with these xvidtune controls and vary the timings slightly; while I don’t think this can damage modern video hardware anymore, you might end up with flicker or a garbled/black or otherwise unreadable screen. Therefore, please save all work and issue a »sync« before experimenting with xvidtune.)


Sorry! I missed your reply. I rolled back to Windows to get everything working, but here’s what I’m seeing on the latest KDE Live disc:

xvidtune -show

"1920x1080"   138.70   1920 1968 2000 2080   1080 1083 1088 1111 +hsync -vsync
grep -i MHz Xorg.0.log 

   154.616] (II) modeset(0): clock: 138.7 MHz   Image Size:  344 x 194 mm
   154.616] (II) modeset(0): clock: 111.0 MHz   Image Size:  344 x 194 mm
   154.837] (**) NOUVEAU(G0):  Mode "1920x1080": 173.0 MHz (scaled from 0.0 MHz), 67.2 kHz, 60.0 Hz

I’m seeing:

Pixel Clock (MHz): 138.70
Horizontal Sync (kHz): 66.68
Vertical Sync (Hz): 60.02

So I’m not entirely sure what to do with that…
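For what it’s worth, those three numbers are consistent with each other: the horizontal sync rate is the pixel clock divided by the total horizontal pixels per line (the last horizontal value, 2080), and the vertical refresh is the pixel clock divided by the total pixels per frame (2080 × 1111). A quick sanity check with the values from the modeline above:

```shell
# Values taken from the xvidtune modeline above:
#   "1920x1080" 138.70  1920 1968 2000 2080  1080 1083 1088 1111
dotclock_khz=138700   # 138.70 MHz pixel clock
htotal=2080           # total pixels per scanline, including blanking
vtotal=1111           # total lines per frame, including blanking

hsync_khz=$(awk "BEGIN { printf \"%.2f\", $dotclock_khz / $htotal }")
vrefresh_hz=$(awk "BEGIN { printf \"%.2f\", $dotclock_khz * 1000 / ($htotal * $vtotal) }")
echo "hsync: $hsync_khz kHz, vrefresh: $vrefresh_hz Hz"
```

That prints 66.68 kHz and 60.02 Hz, exactly what xvidtune reports, so the mode itself is internally consistent and nothing exotic (no 75 Hz CRT timing) is in play.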

Neither am I, sorry; these timings seem well within the range of any graphics card and monitor sold during this decade. Safest and simplest options could be changing your monitor cable (HDMI for DVI, even VGA), replacing any video-cable adapters with ones from different vendors, if possible. If you use the Nouveau driver, try the original NVidia one instead (or vice versa) — see this wiki page on openSUSE.org.

If all else fails, you could try and do something that’s rarely needed nowadays: slightly adjust timings with xvidtune…

"1920x1080"   138.70   1920 1968 2000 2080   1080 1083 1088 1111 +hsync -vsync  *# your modeline*
"1920x1080"   148.5    1920 2008 2052 2200   1080 1084 1089 1125 +hsync +vsync  *# Others I found searching online for …*
"1920x1080"   173.00   1920 2048 2248 2576   1080 1083 1088 1120 -hsync +vsync  *# … "1080 modeline vesa"*

… until you see a clean image, and then copy the resulting modeline manually into xorg.conf.d, following countless descriptions online, or the technical description in the manual page:

man 5 xorg.conf.d
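For reference, a minimal sketch of such a drop-in file, reusing the modeline from this thread; the filename and the `eDP-1` identifier are assumptions (check the actual output name with `xrandr`), and depending on your setup you may additionally need an `Option "Monitor-eDP-1" "eDP-1"` line in the Device section to bind the Monitor section to the right output:

```
# /etc/X11/xorg.conf.d/50-monitor.conf   (hypothetical filename)
Section "Monitor"
    Identifier "eDP-1"    # internal panel; verify the name with xrandr
    Modeline "1920x1080_custom" 138.70  1920 1968 2000 2080  1080 1083 1088 1111 +hsync -vsync
    Option "PreferredMode" "1920x1080_custom"
EndSection
```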

Further down the rabbit hole, there are descriptions using xrandr I just found in the SUSE support database.

I remember tweaking modelines for some ancient cards and monitors back in the 1990s. Later, in order to facilitate these settings, SuSE developed a tool called »SaX2«, and nowadays things should just work (as they usually do with Windows and macOS). And when they don’t work like in your case, things regrettably can get complicated quickly.

I forget a lot more as I get older, but not Matrox:
1004453 – [mga] no Xorg with Matrox gfxcard when valid vga= mode included on kernel cmdline to produce desirable vttys
823658 – matrox_w1 module blocks use of mga driver

If you use the Nouveau driver, try the original NVidia one instead (or vice versa)
Or neither, if by “Nouveau” you mean xf86-video-nouveau that provides the nouveau DDX, rather than the nouveau kernel driver. Try upstream’s default: modesetting, which should be used automagically if neither NVidia tainting nor upstream’s optional xf86-video-nouveau are installed. The problem here is more likely Optimus than a simple choice of DDX(s).

How is the external display connected, VGA, or digital? Internal is digital. If external is VGA, aka analog, it might explain the difference between internal and external. Please run

inxi -GxxSza

in Konsole and paste input & output here in code tags.