How can I enable direct rendering on openSUSE 13.2 with my GTX 660?
Is the right kernel module loaded at all?
If not: how can I achieve that?
uname -a
Linux sonic 3.16.7-7-desktop #1 SMP PREEMPT Wed Dec 17 18:00:44 UTC 2014 (762f27a) x86_64 x86_64 x86_64 GNU/Linux
glxinfo | grep render:
direct rendering: No (If you want to find out why, try setting LIBGL_DEBUG=verbose)
OpenGL renderer string: GeForce GTX 660/PCIe/SSE2
GL_NVX_gpu_memory_info, GL_NV_blend_square, GL_NV_conditional_render,
GL_OES_fbo_render_mipmap, GL_OES_packed_depth_stencil, GL_OES_rgb8_rgba8,
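To answer the "is the right module loaded" part, a quick check of which kernel module is driving the card looks something like this (a sketch; each command is guarded so it just prints a note instead of failing where a tool or file is absent):

```shell
# Is the proprietary nvidia module (or nouveau) currently loaded?
lsmod 2>/dev/null | grep -E '^(nvidia|nouveau)' || echo "neither nvidia nor nouveau listed by lsmod"
# The proprietary driver also reports its version here when its module is loaded
cat /proc/driver/nvidia/version 2>/dev/null || echo "nvidia kernel module not loaded"
# lspci -k shows which kernel driver is actually bound to the GPU
lspci -k 2>/dev/null | grep -A 3 -i vga || echo "lspci not available"
```

If nouveau shows up instead of nvidia, the proprietary driver is not the one in use, which would explain the missing direct rendering.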
202.205]
X.Org X Server 1.16.1
Release Date: 2014-09-21
202.205] X Protocol Version 11, Revision 0
202.205] Build Operating System: openSUSE SUSE LINUX
202.205] Current Operating System: Linux sonic 3.16.7-7-desktop #1 SMP PREEMPT Wed Dec 17 18:00:44 UTC 2014 (762f27a)
x86_64
202.205] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.16.7-7-desktop root=UUID=319b7ad5-1436-454d-8833-fabda9e56b1
3 resume=/dev/disk/by-id/ata-ST3500630NS_5QG17EMC-part1 splash=silent quiet showopts
202.205] Build Date: 18 December 2014 02:06:21PM
202.205]
202.205] Current version of pixman: 0.32.6
202.205] Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
202.205] Markers: (--) probed, (**) from config file, (==) default setting,
(++) from command line, (!!) notice, (II) informational,
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
202.205] (==) Log file: "/var/log/Xorg.0.log", Time: Wed Feb 4 13:06:23 2015
202.209] (==) Using config directory: "/etc/X11/xorg.conf.d"
202.209] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
202.212] (==) No Layout section. Using the first Screen section.
202.212] (==) No screen section available. Using defaults.
202.212] (**) |-->Screen "Default Screen Section" (0)
202.212] (**) | |-->Monitor "<default monitor>"
202.213] (==) No monitor specified for screen "Default Screen Section".
Using a default monitor configuration.
202.213] (==) Automatically adding devices
202.213] (==) Automatically enabling devices
202.213] (==) Automatically adding GPU devices
202.222] (WW) The directory "/usr/share/fonts/misc/sgi" does not exist.
202.222] Entry deleted from font path.
202.223] (==) FontPath set to:
/usr/share/fonts/misc:unscaled,
/usr/share/fonts/Type1/,
/usr/share/fonts/100dpi:unscaled,
/usr/share/fonts/75dpi:unscaled,
/usr/share/fonts/ghostscript/,
/usr/share/fonts/cyrillic:unscaled,
/usr/share/fonts/truetype/,
built-ins
202.223] (==) ModulePath set to "/usr/lib64/xorg/modules"
202.223] (II) The server relies on udev to provide the list of input devices.
If no devices become available, reconfigure udev or disable AutoAddDevices.
202.223] (II) Loader magic: 0x80ec60
202.223] (II) Module ABI versions:
202.223] X.Org ANSI C Emulation: 0.4
202.223] X.Org Video Driver: 18.0
202.223] X.Org XInput driver : 21.0
202.223] X.Org Server Extension : 8.0
202.223] (II) xfree86: Adding drm device (/dev/dri/card0)
202.225] (--) PCI:*(0:1:0:0) 10de:11c0:1043:8422 rev 161, Mem @ 0xfb000000/16777216, 0xd0000000/134217728, 0xde000000/33554432, I/O @ 0x0000ef00/128, BIOS @ 0x????????/524288
202.225] (II) LoadModule: "glx"
202.227] (II) Loading /usr/lib64/xorg/modules/extensions/libglx.so
202.330] (II) Module glx: vendor="NVIDIA Corporation"
202.330] compiled for 4.0.2, module version = 1.0.0
202.330] Module class: X.Org Server Extension
202.330] (II) NVIDIA GLX Module 340.65 Tue Dec 2 09:10:06 PST 2014
202.331] (==) Matched nvidia as autoconfigured driver 0
202.331] (==) Matched nouveau as autoconfigured driver 1
202.331] (==) Matched nv as autoconfigured driver 2
202.331] (==) Matched nvidia as autoconfigured driver 3
202.331] (==) Matched nouveau as autoconfigured driver 4
202.331] (==) Matched nv as autoconfigured driver 5
202.331] (==) Matched modesetting as autoconfigured driver 6
202.331] (==) Matched fbdev as autoconfigured driver 7
202.331] (==) Matched vesa as autoconfigured driver 8
202.331] (==) Assigned the driver to the xf86ConfigLayout
202.331] (II) LoadModule: "nvidia"
202.331] (II) Loading /usr/lib64/xorg/modules/drivers/nvidia_drv.so
202.338] (II) Module nvidia: vendor="NVIDIA Corporation"
202.338] compiled for 4.0.2, module version = 1.0.0
202.338] Module class: X.Org Video Driver
202.339] (II) LoadModule: "nouveau"
202.339] (II) Loading /usr/lib64/xorg/modules/drivers/nouveau_drv.so
202.341] (II) Module nouveau: vendor="X.Org Foundation"
202.341] compiled for 1.16.1, module version = 1.0.11
202.341] Module class: X.Org Video Driver
202.341] ABI class: X.Org Video Driver, version 18.0
202.341] (II) LoadModule: "nv"
202.341] (II) Loading /usr/lib64/xorg/modules/drivers/nv_drv.so
202.342] (II) Module nv: vendor="X.Org Foundation"
202.342] compiled for 1.16.1, module version = 2.1.20
202.342] Module class: X.Org Video Driver
202.342] ABI class: X.Org Video Driver, version 18.0
202.342] (II) LoadModule: "modesetting"
202.342] (II) Loading /usr/lib64/xorg/modules/drivers/modesetting_drv.so
202.343] (II) Module modesetting: vendor="X.Org Foundation"
202.343] compiled for 1.16.1, module version = 0.9.0
202.343] Module class: X.Org Video Driver
202.343] ABI class: X.Org Video Driver, version 18.0
202.343] (II) LoadModule: "fbdev"
202.343] (II) Loading /usr/lib64/xorg/modules/drivers/fbdev_drv.so
202.343] (II) Module fbdev: vendor="X.Org Foundation"
202.343] compiled for 1.16.1, module version = 0.4.4
202.343] Module class: X.Org Video Driver
202.343] ABI class: X.Org Video Driver, version 18.0
202.343] (II) LoadModule: "vesa"
202.343] (II) Loading /usr/lib64/xorg/modules/drivers/vesa_drv.so
202.344] (II) Module vesa: vendor="X.Org Foundation"
202.344] compiled for 1.16.1, module version = 2.3.3
202.344] Module class: X.Org Video Driver
202.344] ABI class: X.Org Video Driver, version 18.0
202.344] (II) NVIDIA dlloader X Driver 340.65 Tue Dec 2 08:47:36 PST 2014
202.344] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
202.344] (II) NOUVEAU driver
202.344] (II) NOUVEAU driver for NVIDIA chipset families :
202.344] RIVA TNT (NV04)
... some graphic cards removed
202.345] (II) modesetting: Driver for Modesetting Kernel Drivers: kms
202.345] (II) FBDEV: driver for framebuffer: fbdev
202.345] (II) VESA: driver for VESA chipsets: vesa
202.345] (++) using VT number 7
202.346] (II) Loading sub module "fb"
202.346] (II) LoadModule: "fb"
202.346] (II) Loading /usr/lib64/xorg/modules/libfb.so
202.347] (II) Module fb: vendor="X.Org Foundation"
202.347] compiled for 1.16.1, module version = 1.0.0
202.347] ABI class: X.Org ANSI C Emulation, version 0.4
202.347] (WW) Unresolved symbol: fbGetGCPrivateKey
202.347] (II) Loading sub module "wfb"
202.347] (II) LoadModule: "wfb"
202.347] (II) Loading /usr/lib64/xorg/modules/libwfb.so
202.349] (II) Module wfb: vendor="X.Org Foundation"
202.349] compiled for 1.16.1, module version = 1.0.0
202.349] ABI class: X.Org ANSI C Emulation, version 0.4
202.349] (II) Loading sub module "ramdac"
202.349] (II) Module "ramdac" already built-in
202.350] (WW) Falling back to old probe method for modesetting
202.351] (WW) Falling back to old probe method for fbdev
202.351] (II) Loading sub module "fbdevhw"
202.351] (II) LoadModule: "fbdevhw"
202.351] (II) Loading /usr/lib64/xorg/modules/libfbdevhw.so
202.351] (II) Module fbdevhw: vendor="X.Org Foundation"
202.351] compiled for 1.16.1, module version = 0.0.2
202.351] ABI class: X.Org Video Driver, version 18.0
202.351] (WW) Falling back to old probe method for vesa
202.351] (II) NVIDIA(0): Creating default Display subsection in Screen section
"Default Screen Section" for depth/fbbpp 24/32
202.351] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
202.351] (==) NVIDIA(0): RGB weight 888
202.351] (==) NVIDIA(0): Default visual is TrueColor
202.351] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
202.352] (**) NVIDIA(0): Enabling 2D acceleration
203.043] (II) NVIDIA(0): Display (Idek Iiyama PLE511S (DFP-0)) does not support NVIDIA
203.043] (II) NVIDIA(0): 3D Vision stereo.
203.043] (II) NVIDIA(GPU-0): Found DRM driver nvidia-drm (20130102)
203.044] (II) NVIDIA(0): NVIDIA GPU GeForce GTX 660 (GK106) at PCI:1:0:0 (GPU-0)
203.044] (--) NVIDIA(0): Memory: 2097152 kBytes
203.044] (--) NVIDIA(0): VideoBIOS: 80.06.10.00.10
203.044] (II) NVIDIA(0): Detected PCI Express Link width: 16X
203.051] (--) NVIDIA(0): Valid display device(s) on GeForce GTX 660 at PCI:1:0:0
203.051] (--) NVIDIA(0): CRT-0
203.051] (--) NVIDIA(0): Idek Iiyama PLE511S (DFP-0) (boot, connected)
203.051] (--) NVIDIA(0): DFP-1
203.051] (--) NVIDIA(0): DFP-2
203.051] (--) NVIDIA(0): DFP-3
203.051] (--) NVIDIA(0): DFP-4
203.051] (--) NVIDIA(GPU-0): CRT-0: 400.0 MHz maximum pixel clock
203.051] (--) NVIDIA(0): Idek Iiyama PLE511S (DFP-0): Internal TMDS
203.051] (--) NVIDIA(GPU-0): Idek Iiyama PLE511S (DFP-0): 330.0 MHz maximum pixel clock
203.051] (--) NVIDIA(0): DFP-1: Internal TMDS
203.051] (--) NVIDIA(GPU-0): DFP-1: 165.0 MHz maximum pixel clock
203.051] (--) NVIDIA(0): DFP-2: Internal TMDS
203.051] (--) NVIDIA(GPU-0): DFP-2: 165.0 MHz maximum pixel clock
203.051] (--) NVIDIA(0): DFP-3: Internal TMDS
203.051] (--) NVIDIA(GPU-0): DFP-3: 330.0 MHz maximum pixel clock
203.051] (--) NVIDIA(0): DFP-4: Internal DisplayPort
203.051] (--) NVIDIA(GPU-0): DFP-4: 960.0 MHz maximum pixel clock
203.051] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
203.051] (**) NVIDIA(0): device Idek Iiyama PLE511S (DFP-0) (Using EDID frequencies
203.051] (**) NVIDIA(0): has been enabled on all display devices.)
203.054] (==) NVIDIA(0):
203.054] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
203.054] (==) NVIDIA(0): will be used as the requested mode.
203.054] (==) NVIDIA(0):
203.054] (II) NVIDIA(0): Validated MetaModes:
203.054] (II) NVIDIA(0): "DFP-0:nvidia-auto-select"
203.054] (II) NVIDIA(0): Virtual screen size determined to be 1280 x 1024
203.087] (--) NVIDIA(0): DPI set to (79, 83); computed from "UseEdidDpi" X config
203.087] (--) NVIDIA(0): option
203.087] (II) UnloadModule: "nouveau"
203.087] (II) Unloading nouveau
203.087] (II) UnloadModule: "nv"
203.087] (II) Unloading nv
203.087] (II) UnloadModule: "modesetting"
203.087] (II) Unloading modesetting
203.087] (II) UnloadModule: "fbdev"
203.087] (II) Unloading fbdev
203.087] (II) UnloadSubModule: "fbdevhw"
203.087] (II) Unloading fbdevhw
203.087] (II) UnloadModule: "vesa"
203.087] (II) Unloading vesa
203.087] (--) Depth 24 pixmap format is 32 bpp
203.087] (II) NVIDIA: Using 3072.00 MB of virtual memory for indirect memory
203.087] (II) NVIDIA: access.
203.090] (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
203.090] (II) NVIDIA(0): may not be running or the "AcpidSocketPath" X
203.090] (II) NVIDIA(0): configuration option may not be set correctly. When the
203.090] (II) NVIDIA(0): ACPI event daemon is available, the NVIDIA X driver will
203.090] (II) NVIDIA(0): try to use it to receive ACPI event notifications. For
203.090] (II) NVIDIA(0): details, please see the "ConnectToAcpid" and
203.090] (II) NVIDIA(0): "AcpidSocketPath" X configuration options in Appendix B: X
203.090] (II) NVIDIA(0): Config Options in the README.
203.096] (II) NVIDIA(0): Setting mode "DFP-0:nvidia-auto-select"
203.155] (==) NVIDIA(0): Disabling shared memory pixmaps
203.156] (==) NVIDIA(0): Backing store enabled
203.156] (==) NVIDIA(0): Silken mouse enabled
203.157] (==) NVIDIA(0): DPMS enabled
203.157] (II) Loading sub module "dri2"
203.157] (II) LoadModule: "dri2"
203.157] (II) Module "dri2" already built-in
203.157] (II) NVIDIA(0): [DRI2] Setup complete
203.157] (II) NVIDIA(0): [DRI2] VDPAU driver: nvidia
203.157] (--) RandR disabled
203.162] (II) Initializing extension GLX
203.253] (II) config/udev: Adding input device Power Button (/dev/input/event2)
... some evdev/kbd/button-stuff removed
203.290] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=3 (/dev/input/event12)
203.290] (II) No input driver specified, ignoring this device.
203.290] (II) This device may have been added with another device file.
203.290] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=7 (/dev/input/event13)
203.290] (II) No input driver specified, ignoring this device.
203.291] (II) This device may have been added with another device file.
203.291] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=8 (/dev/input/event14)
203.291] (II) No input driver specified, ignoring this device.
203.291] (II) This device may have been added with another device file.
203.292] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=9 (/dev/input/event15)
203.292] (II) No input driver specified, ignoring this device.
203.292] (II) This device may have been added with another device file.
203.292] (II) config/udev: Adding input device HDA ATI SB Line Out Side (/dev/input/event10)
203.293] (II) No input driver specified, ignoring this device.
... some HDA ATI sound stuff removed
203.294] (II) config/udev: Adding input device HDA ATI SB Line Out CLFE (/dev/input/event9)
203.294] (II) No input driver specified, ignoring this device.
203.294] (II) This device may have been added with another device file.
203.294] (II) config/udev: Adding input device Logitech Unifying Device. Wireless PID:101a (/dev/input/event16)
... some mouse/keyboard-stuff removed
205.261] (II) NVIDIA(GPU-0): Display (Idek Iiyama PLE511S (DFP-0)) does not support NVIDIA
205.261] (II) NVIDIA(GPU-0): 3D Vision stereo.
206.401] (II) NVIDIA(0): Setting mode "DVI-I-1: 1600x1200 @1600x1200 +0+0 {ViewPortIn=1600x1200, ViewPortOut=1600x1200+0+0}"
...
I executed glxinfo as a normal user in a terminal (Konsole) within KDE.
(same result with su root)
is the following relevant?
glxgears:
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
330 frames in 5.1 seconds = 64.893 FPS
320 frames in 5.3 seconds = 60.028 FPS
300 frames in 5.0 seconds = 59.987 FPS
300 frames in 5.0 seconds = 59.996 FPS
320 frames in 5.3 seconds = 59.989 FPS
320 frames in 5.8 seconds = 55.334 FPS
327 frames in 5.0 seconds = 65.369 FPS
^C
Again, you should not run it as root via su. At least in earlier versions this did not work at all.
is the following relevant? (same glxgears output as quoted above)
No.
This just means what it says. It’s running with vsync turned on, so the framerate is the same as the refresh rate of your monitor.
As vsync is turned ON by default, this is normal. You should be able to disable it in nvidia-settings I think.
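For a one-off test, vsync can also be disabled per-run through environment variables, without touching any permanent configuration (a sketch; `__GL_SYNC_TO_VBLANK` is the proprietary NVIDIA driver's knob, `vblank_mode` is Mesa's equivalent, and glxgears is just the example client):

```shell
# Disable vsync for this shell session only (no permanent config change)
export __GL_SYNC_TO_VBLANK=0   # honored by the proprietary NVIDIA driver
export vblank_mode=0           # honored by Mesa drivers
if command -v glxgears >/dev/null 2>&1; then
    # With vsync off, the reported FPS should be far above the refresh rate
    timeout 10 glxgears
else
    echo "glxgears not installed (it ships with the Mesa demos package)"
fi
```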
same result
Hm? It should print some debug output if you set LIBGL_DEBUG=verbose, like this:
wolfi@amiga:~> LIBGL_DEBUG=verbose glxinfo
name of display: :0
libGL: screen 0 does not appear to be DRI3 capable
libGL: pci id for fd 4: 1002:4150, driver r300
libGL: OpenDriver: trying /usr/lib64/dri/tls/r300_dri.so
libGL: OpenDriver: trying /usr/lib64/dri/r300_dri.so
libGL: Can't open configuration file /home/wolfi/.drirc: No such file or directory.
libGL: Can't open configuration file /home/wolfi/.drirc: No such file or directory.
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.4
...
Is that somehow relevant:
in nvidia-settings -> X Screen 0 -> OpenGL Settings, "Image Settings" is set to "Quality".
Is that correct? Would changing it to "High Performance" or "High Quality" make a difference?
Ok. I tried on an nvidia system now, and LIBGL_DEBUG does indeed not work. Apparently nvidia’s libGL doesn’t support that, sigh…
Is that somehow relevant: (the nvidia-settings "Image Settings" question quoted above)
No idea. I don’t use nvidia myself.
I would think not, but you can try it if you want.
NVIDIA/nvidia-drivers - Gentoo wiki would suggest that the problem can be caused by the kernel’s DRM.
I never heard of such a problem on openSUSE (which has CONFIG_DRM=m, i.e. DRM is built as loadable module), but try to blacklist it.
I think creating a file in /etc/modprobe.d/ with the following content should achieve that:
blacklist drm
No guarantees though that this will help or your system will even boot afterwards… (if not, just remove that file again)
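A sketch of what that could look like (the file name `50-blacklist-drm.conf` is made up, any `.conf` name in `/etc/modprobe.d/` works; writing there needs root, so this dry-run version falls back to a temp directory unless you point `TARGET_DIR` at the real path):

```shell
# Target directory; on a real system: TARGET_DIR=/etc/modprobe.d (as root)
target_dir="${TARGET_DIR:-$(mktemp -d)}"
conf="$target_dir/50-blacklist-drm.conf"
# A blacklist line prevents the module from being auto-loaded by alias
printf 'blacklist drm\n' > "$conf"
echo "wrote $conf"
# On openSUSE, rebuild the initrd afterwards so the blacklist also
# applies at boot:  sudo mkinitrd
# To undo: remove the file again and rebuild the initrd.
```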
Another thing: Do you have an intel GPU as well? I.e. maybe built into your CPU.
On some systems this apparently might cause problems with the nvidia driver. (and no, I’m not talking about Optimus here… )
And try to uninstall libvdpau_va_gl1, this can cause problems with VDPAU when using the nvidia driver. Although I don’t think that should have any effect on direct rendering.
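Both points can be checked with something along these lines (a sketch; rpm/zypper are openSUSE's tools, and the commands are guarded so they only print a note where a tool is missing):

```shell
# 1) Is there a second (e.g. Intel) GPU in the system?
lspci 2>/dev/null | grep -i -E 'vga|3d|display' || echo "lspci not available"
# 2) Is libvdpau_va_gl1 installed? If so, it could be removed with:
#      sudo zypper rm libvdpau_va_gl1
rpm -q libvdpau_va_gl1 2>/dev/null || echo "libvdpau_va_gl1 not installed (or rpm not available)"
```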
I have the same problem as OP, and nothing is working. I’ve tried everything but disabling drm (my blacklist attempts didn’t work, and the nvidia module depends on it anyways).
glxinfo stubbornly reports that dri is off, and one of my steam games just gives me a black screen (it works fine on my friend’s linux box).
Well, this is quite an old thread, and it is not at all sure that you have exactly the same problem.
And the OP never reported back if my last suggestions helped… :\
I’ve tried everything but disabling drm (my blacklist attempts didn’t work, and the nvidia module depends on it anyways).
What did you try to blacklist that nvidia depends on?
glxinfo stubbornly reports that dri is off, and one of my steam games just gives me a black screen (it works fine on my friend’s linux box).
Any new ideas? Anyone?
Can you post your output of glxinfo for a start, please?
And more information about your system would probably be helpful too.
Like what graphics chips/cards, what driver you use, and in case of nvidia what exact nvidia packages you have installed.
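Something like the following would gather all of that in one go (a sketch for openSUSE; each command is guarded so a missing tool just prints a note instead of aborting):

```shell
echo "--- direct rendering ---"
glxinfo 2>/dev/null | grep -i 'direct rendering' || echo "glxinfo missing or no X display"
echo "--- graphics hardware ---"
lspci 2>/dev/null | grep -i -E 'vga|3d' || echo "lspci not available"
echo "--- installed driver packages ---"
rpm -qa 2>/dev/null | grep -i -E 'nvidia|mesa' || echo "rpm not available or no matching packages"
```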