Which proprietary Nvidia driver package do I need for Quadro FX 1500?

I searched the forum but did not find any related and up-to-date posts.

My objective is to install the Nvidia proprietary driver for the NVIDIA Corporation G71GL [Quadro FX 1500] (rev a1) following SDB:NVIDIA drivers - openSUSE Wiki.

I installed the Nvidia repo using YaST, and am now at “Install”, SDB:NVIDIA drivers - openSUSE Wiki. But, it is not clear to me which driver package I need.

None of the naming convention mappings listed appear to directly relate to my device. Nvidia’s driver search engine yields a driver build file with a version of 304.137 [Linux x64 (AMD64/EM64T) Display Driver | 304.137 | Linux 64-bit | NVIDIA]. That page indicates the driver supports the GeForce 600 Series. So, I think maybe I should use the “G05” package, but it is version 470.199.02 and not 304.137:

zypper se -s x11-video-nvidiaG0* nvidia-video-G06*
Loading repository data...
Reading installed packages...

S | Name                      | Type    | Version               | Arch   | Repository
--+---------------------------+---------+-----------------------+--------+------------------------
  | nvidia-video-G06          | package | 535.86.05-lp154.10.1  | x86_64 | nVidia Graphics Drivers
  | nvidia-video-G06-32bit    | package | 535.86.05-lp154.10.1  | x86_64 | nVidia Graphics Drivers
  | x11-video-nvidiaG04       | package | 390.157-lp154.30.1    | x86_64 | nVidia Graphics Drivers
  | x11-video-nvidiaG04-32bit | package | 390.157-lp154.30.1    | x86_64 | nVidia Graphics Drivers
  | x11-video-nvidiaG05       | package | 470.199.02-lp154.54.1 | x86_64 | nVidia Graphics Drivers
  | x11-video-nvidiaG05-32bit | package | 470.199.02-lp154.54.1 | x86_64 | nVidia Graphics Drivers

And relying on YaST Software Management to recommend the appropriate package didn’t help either. The install tip described at 1195885 – nvidia driver no longer works due to simpledrm enablement in Kernel 5.16.8 does not work for me: after following the steps, no nvidia packages are preselected.

Can anyone confirm that I should use the ‘x11-video-nvidiaG05’ package (Wayland is installed, but I use X11)?

@tleedavidson you’re limited to nouveau; that card/driver went out of support a long time ago… You would need to get the 304.137 run file and patch it yourself, but I doubt it would work…
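To make the package/branch relationship concrete: the GXX packages track Nvidia’s driver branches, not GPU model numbers, and the branch that last supported Curie chips was retired before the oldest packaged branch. A rough sketch (the helper function is hypothetical; branch numbers are from Nvidia’s legacy driver pages):

```shell
# Hypothetical helper: map an Nvidia GPU architecture to the last driver
# branch that supported it, and the openSUSE package (if any) shipping it.
nvidia_branch_for_arch() {
  case "$1" in
    Curie)  echo "304.xx - EOL, no openSUSE package" ;;
    Tesla)  echo "340.xx - EOL, no openSUSE package" ;;
    Fermi)  echo "390.xx - x11-video-nvidiaG04" ;;
    Kepler) echo "470.xx - x11-video-nvidiaG05" ;;
    *)      echo "current branch - nvidia-video-G06 or newer" ;;
  esac
}

# The Quadro FX 1500 (G71) is Curie architecture:
nvidia_branch_for_arch Curie   # -> 304.xx - EOL, no openSUSE package
```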

Oh. Crap. That’s not exactly what I wanted to hear.

But, I appreciate your response and info considerably :slight_smile:

Notebook? Try to get a newer MXM graphics card.
Otherwise, Nvidia is no fun.
Desktop? Get a replacement card; that thing is so old that even a 4-year-old graphics card does wonders.
The card was made in 2006. Completely forget about it. Any modern integrated graphics is faster.
I just threw away my 2008 system. Way too big a difference.


ATI TeraScale 1 (Radeon HD 2000 series) from 2007 is OK for a DE or browsers because of the open drivers (OpenGL 3.3 support).
Nvidia chips, once their roughly 10 + 2 years of support are over, are rather useless.

It’s kind of funny that you should mention that, because I actually replaced a Radeon HD 2400 XT card with this one hoping to get, among other benefits, better performance. But, from purely subjective observation, I think the performance is indeed worse.


The G71 FX 1500 ought to be functional by default with FOSS. It’s the same 2006 vintage as this bargain-basement C61 6150SE IGP:

# inxi -GSaz --vs --zl --hostname
inxi 3.3.28-00 (2023-07-10)
System:
  Host: mcp61 Kernel: 5.14.21-150400.24.74-default arch: x86_64 bits: 64
    compiler: gcc v: 7.5.0 clocksource: tsc available: acpi_pm
    parameters: root=LABEL=<filter> ipv6.disable=1 net.ifnames=0 noresume
    consoleblank=0 preempt=full mitigations=off
  Desktop: Trinity v: R14.1.0 tk: Qt v: 3.5.0 info: kicker wm: Twin v: 3.0
    vt: 7 dm: 1: TDM 2: XDM Distro: openSUSE Leap 15.4
Graphics:
  Device-1: NVIDIA C61 [GeForce 6150SE nForce 430] vendor: Micro-Star MSI
    driver: nouveau v: kernel non-free: series: 304.xx status: legacy (EOL)
    last: release: 304.137 kernel: 4.13 xorg: 1.19 arch: Curie
    process: 90-130nm built: 2003-13 ports: active: VGA-1 empty: none
    bus-ID: 00:0d.0 chip-ID: 10de:03d0 class-ID: 0300
  Display: x11 server: X.Org v: 1.20.3 driver: X: loaded: modesetting
    dri: nouveau gpu: nouveau display-ID: :0 screens: 1
  Screen-1: 0 s-res: 1680x1050 s-dpi: 108 s-size: 395x246mm (15.55x9.69")
    s-diag: 465mm (18.32")
  Monitor-1: VGA-1 model: Dell P2213 serial: <filter> built: 2012
    res: 1680x1050 hz: 60 dpi: 90 gamma: 1.2 size: 473x296mm (18.62x11.65")
    diag: 558mm (22") ratio: 16:10 modes: max: 1680x1050 min: 720x400
  API: OpenGL v: 2.1 Mesa 21.2.4 renderer: NV4C direct-render: Yes

I have an FX 3400, also Curie, that bad RAM has made worthless.

Oh, it is functional. If I gave the impression that it wasn’t working, I apologize.

It just seems to be less performant than the Radeon HD 2400 XT card. I do have the Mesa-dri-nouveau package locked because IIRC, during an install early in my Leap 15 journey, the installer informed me that driver could cause problems.

user@linux-desktop:~> zypper if Mesa-dri-nouveau
Loading repository data...
Reading installed packages...


Information for package Mesa-dri-nouveau:
-----------------------------------------
Repository     : Packman Repository
Name           : Mesa-dri-nouveau
Version        : 22.3.5-150500.76.pm.3
Arch           : x86_64
Vendor         : http://packman.links2linux.de
Installed Size : 24.4 MiB
Installed      : No
Status         : not installed
Source package : Mesa-drivers-22.3.5-150500.76.pm.3.src
Upstream URL   : https://www.mesa3d.org
Summary        : Mesa DRI plug-in for 3D acceleration via Nouveau
Description    : 
    This package contains nouveau_dri.so, which is necessary for
    Nouveau's 3D acceleration to work. It is packaged separately
    since it is still experimental.

I’m not a gamer, and I can’t think of any other reason why 3D acceleration would be a priority for me. The next time I have a need to open the system unit, I’ll probably switch back to the Radeon card.
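For reference, the lock I mentioned can be inspected and managed with zypper’s lock subcommands (al = addlock, rl = removelock, ll = locks); guarded here so it is harmless on systems without zypper, and with the root-requiring lines commented out:

```shell
# zypper package locks (al = addlock, rl = removelock, ll = locks)
if command -v zypper >/dev/null 2>&1; then
  zypper ll                          # list current package locks
  # sudo zypper al Mesa-dri-nouveau  # add a lock
  # sudo zypper rl Mesa-dri-nouveau  # remove it again
else
  echo "zypper not found on this system"
fi
```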

I have a G84 Tesla, about 12 months newer technology than your Curie FX 1500, but about the same age as an HD 2400: a GeForce 8600 GT, installed in the following PC:

# inxi -CS
System:
  Host: big41 Kernel: 5.14.21-150400.24.81-default arch: x86_64 bits: 64
    Console: pty pts/0 Distro: openSUSE Leap 15.4
CPU:
  Info: dual core model: Intel Core2 Duo E7600 bits: 64 type: MCP cache:
    L2: 3 MiB
  Speed (MHz): avg: 1603 min/max: 1603/3066 cores: 1: 1603 2: 1603

It managed scores as follows running glmark2 on 15.4 with the Tesla G84:

564: 1920x1200 using nouveau DDX display driver
582: 1920x1200 using modesetting DIX display driver
***
# inxi -Gaz | grep -A2 API
API: OpenGL v: 3.3 Mesa 21.2.4 renderer: NV84 direct-render: Yes

Replaced with a Radeon HD 2400:

256: 1920x1200 using radeon DDX display driver
255: 1920x1200 using modesetting DIX display driver
***
# inxi -Gaz | grep -A2 API
  API: OpenGL v: 3.3 Mesa 21.2.4 renderer: AMD RV610 (DRM 2.50.0 /
    5.14.21-150400.24.81-default LLVM 11.0.1) compat-v: 3.0 direct-render: Yes

I’m skeptical that the difference between your FX 1500 and your HD 2400 would favor the Radeon. Maybe the difference in results between my two cards is that I’ve never been able to discover how to get Radeons to run without involving LLVM. There’s more to how X works than I understand. :stuck_out_tongue:
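For anyone puzzled by the DDX vs DIX terminology in the scores above: which path you get depends on the Driver line in the Xorg Device section (or its absence, in which case the generic modesetting driver is the usual default). A minimal sketch, with an assumed file name:

```
# /etc/X11/xorg.conf.d/50-device.conf  (file name is an assumption)
Section "Device"
  Identifier "DefaultDevice"
  Driver     "modesetting"   # generic path ("DIX" in the scores above);
                             # use "nouveau" or "radeon" here to select
                             # the hardware-specific DDX instead
EndSection
```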

Your benchmark numbers sparked my curiosity. So I installed ‘glmark2’ and ran the default benchmarks as described in its man page.

# inxi -CS
System:
  Host: linux-desktop Kernel: 5.14.21-150500.55.19-default arch: x86_64
    bits: 64 Desktop: KDE Plasma v: 5.27.4 Distro: openSUSE Leap 15.5
CPU:
  Info: 6-core model: AMD Phenom II X6 1045T bits: 64 type: MCP cache:
    L2: 3 MiB
  Speed (MHz): avg: 926 min/max: 800/2700 cores: 1: 800 2: 800 3: 1468
    4: 800 5: 893 6: 800
# inxi -Gaz
Graphics:
  Device-1: NVIDIA G71GL [Quadro FX 1500] driver: nouveau v: kernel non-free:
    series: 304.xx status: legacy (EOL) last: release: 304.137 kernel: 4.13
    xorg: 1.19 arch: Curie process: 90-130nm built: 2003-13 pcie: gen: 1
    speed: 2.5 GT/s lanes: 16 ports: active: DVI-I-2 empty: DVI-I-1,TV-1
    bus-ID: 04:00.0 chip-ID: 10de:029e class-ID: 0300 temp: 43.0 C
  Display: x11 server: X.Org v: 1.21.1.4 with: Xwayland v: 22.1.5
    compositor: kwin_x11 driver: X: loaded: nouveau
    unloaded: fbdev,modesetting,vesa alternate: nv,nvidia dri: nouveau
    gpu: nouveau display-ID: :0 screens: 1
  Screen-1: 0 s-res: 1280x1024 s-dpi: 96 s-size: 338x270mm (13.31x10.63")
    s-diag: 433mm (17.03")
  Monitor-1: DVI-I-2 model: ViewSonic VX900-2 serial: <filter> built: 2003
    res: 1280x1024 hz: 60 dpi: 86 gamma: 1.2 size: 376x301mm (14.8x11.85")
    diag: 482mm (19") ratio: 5:4 modes: max: 1280x1024 min: 720x400
  API: OpenGL v: 4.5 Mesa 22.3.5 renderer: llvmpipe (LLVM 15.0.7 128 bits)
    direct render: Yes
# glmark2
libGL error: MESA-LOADER: failed to open nouveau: /usr/lib64/dri/nouveau_dri.so: cannot open shared object file: No such file or directory (search paths /usr/lib64/dri, suffix _dri)
libGL error: failed to load driver: nouveau

I assume the above error is due to the fact that the ‘Mesa-dri-nouveau’ package is not installed.
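That assumption is easy to verify, since the MESA-LOADER error names the exact file it tried to open; a quick check along these lines (path taken from the error message itself):

```shell
# nouveau_dri.so is what the Mesa-dri-nouveau package ships; if it is
# absent, Mesa falls back to llvmpipe software rendering (hence the
# "renderer: llvmpipe" line in the inxi output above):
if [ -e /usr/lib64/dri/nouveau_dri.so ]; then
  echo "nouveau_dri.so present: hardware 3D via nouveau is possible"
else
  echo "nouveau_dri.so missing: expect llvmpipe software rendering"
fi
```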

I think every benchmark reported this error (possibly for the same reason?):
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.

glmark2 Score: 92

Based on your numbers, that is pathetic. I tried booting after installing ‘Mesa-dri-nouveau’ and had to remove it: the system locked up when logging in to the graphical desktop.

And so, I put the Radeon card back in.

# inxi -Gaz | grep -A2 API
  API: OpenGL v: 3.3 Mesa 22.3.5 renderer: AMD RV610 (DRM 2.50.0 /
    5.14.21-150500.55.19-default LLVM 15.0.7) compat-v: 3.0 direct render: Yes

glmark2 Score: 314

Apparently the CPU is playing no small role in glmark2 with these old Radeons. I reran glmark2 with the RV610 using a Dell 1280x1024 display instead of my customary NEC 1920x1200. Neither score came close to your 314:
With DDX: up from 256 to 265
With DIX: up from 255 to 264

So, I moved my RV610 to a nine-year-newer PC with 4 CPU cores and NVMe storage. With DIX the score was 559 at 1280x1024 and 558 at 1920x1200; with DDX at 1920x1200, also 558. For each run, Glmark2 repeated its warnings many times, with a number of “GLX does not support” messages as well.

Very interesting. Glmark2 reported those failure messages only with the Nvidia card. The Radeon card caused no such messages and, in contrast, seems to run as smooth as silk.

Thank you for taking the time to run those tests and sharing your results. It gives me a rough idea as to how my machine/card is running, comparatively.

Another Glmark2 comparison for broader context: a 2014 AMD A10-7850K with Radeon R7 APU graphics (not a discrete GPU):
DDX score: 2079
DIX score: 2077

No error messages.

2021 Intel i5-11400/Rocket Lake GT1 UHD 730 iGPU on 15.4 with DIX:
Score 4 weeks ago with 3 connected displays: 2702
Score today with 1 connected display: 3028
Score today with 1 connected display, after switch to Intel DDX: 33 (twice!)
Same PC 10 weeks ago using Mageia 9 with DIX: 2626
Same PC 10 weeks ago using Mageia 9 with Intel DDX: 5162
Same PC booted to TW20230730 today, with Intel DDX: 33
Same PC booted to TW20230730 today, with DIX: 3008
Same PC booted to Fedora 38 today, with Intel DDX: 5766

I’ve seen postings from NVidia users showing scores in 5 digits.

OT: Looks like there’s a bug with openSUSE’s Glmark2 when run with Intel DDX.