Thanks for your patience - and for the formatting tips. Let’s see whether I can get it right this time.
You were right about the mismatch between the xrandr --newmode and --addmode statements. I had to change the underscore character ("_"), which was rejected by xrandr --newmode. However, the outcome was the same.
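For anyone following along, this is the general shape of the commands I mean (a sketch only: the modeline values are just what cvt 2560 1440 60 produces, and DVI-I-1 is a placeholder output name - the real one comes from xrandr --query; the point is that the mode name must match exactly across all three commands):

# modeline from `cvt 2560 1440 60`; DVI-I-1 is a placeholder, use the name shown by `xrandr --query`
xrandr --newmode "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync
xrandr --addmode DVI-I-1 "2560x1440_60.00"
xrandr --output DVI-I-1 --mode "2560x1440_60.00"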
You may have a point about the cable, but it is exactly the same one I used with the previous GT9400 card. Do you think it might be that the GT1030 would not be happy with the cable that worked with the GT9400?
I’m sorry you have been getting 404 errors with susepaste. This is exactly what I was experiencing when trying to use the web interface. I have pasted the file on pastebin. I can’t see how to send you a link to it but the title is Xorg.0.log.
When did you get the 2560x1440 display? Could it be you were running it with the GT9400 in 1920x1080 mode? Try verifying which type of DVI cable you have against the Digital Visual Interface page on Wikipedia; single-link DVI tops out at around 1920x1200 at 60 Hz, so 2560x1440 needs dual link. If your cable is a dual link type, possibly the GT1030 is defective.
When you paste there it should present you a link to copy and paste for sharing here.
As follow-up to comment #4, and if your cable is a dual link type, try removing all xorg.con* files if you have not already, then saving something along the lines of the sketch below as /etc/X11/xorg.conf (adjust the modeline and connector name to your hardware):
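# a sketch only: the modeline is what `cvt 2560 1440 60` produces, DVI-I-1 is a
# placeholder connector name (see `xrandr --query`), and the Driver line assumes
# a FOSS DDX (modesetting or nouveau) rather than the proprietary driver
Section "Monitor"
    Identifier "DVIMonitor"
    Modeline "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync
    Option "PreferredMode" "2560x1440_60.00"
EndSection

Section "Device"
    Identifier "Card0"
    Driver "modesetting"
    Option "Monitor-DVI-I-1" "DVIMonitor"
EndSection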
I got the monitor, GT9400 card and cable in 2011 and have always run it with nvidia drivers until the recent kernel change forced me to use the nouveau driver. It has always been run at 2560 x 1440. I checked with the wiki page you referenced and also found my original order for a dual-link cable. I suppose there is a chance the new GT1030 card could be faulty but I would like to exhaust other possibilities first.
No link was evident, but when I went to “My Pastes” I found I could right-click and “Copy link location”, so here it is: Xorg.0.log - Pastebin.com
There was no existing xorg.conf file, so I added your lines and created one, but it made no difference to the problem. Thinking about it, when I made the swap of video cards I had been using the nouveau driver with the old one at 2560 x 1440 as usual. The computer booted up OK with the new card, but I’m fairly sure it was at 1680 x 1050 resolution despite no change to the driver. I must confess I paid little attention to that as I was intent on installing the nvidia driver, being aware of the warning about using the nouveau driver with KDE and the newer nvidia graphics adapters. I don’t know whether this is significant.
Perhaps I should try removing all the nvidia files and reverting to nouveau to see whether I can get the graphics operating at the correct resolution. If not, I suppose it does point the finger of suspicion at the GT1030 card.
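If I do go that way, I assume it would be roughly along these lines (a sketch; the exact package names depend on how the driver was installed, and the blacklist file name varies):

zypper se -i nvidia                                # list the installed nvidia packages
zypper rm <the driver packages that search lists>  # placeholder, not literal package names
# also check /etc/modprobe.d/ for any file blacklisting nouveau and remove or edit it
reboot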
(II) modesetting: Driver for Modesetting Kernel Drivers: kms
(II) UnloadModule: "modesetting"
(II) Unloading modesetting
This modesetting is the upstream default DDX, the only one I purposely use with my PCIe NVidia GPUs.
Yes, and when you do that, you can also give the modesetting DDX a try. It can be employed either by specifying it as the device driver in an xorg.con* ‘Section “Device”’, or by simply uninstalling xf86-video-nouveau. Both it and the nouveau DDX depend on the same nouveau kernel module that most NVidia drivers require be blacklisted.
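For the config route, something like this is all it takes (a sketch; the file name and Identifier are arbitrary), saved as /etc/X11/xorg.conf.d/50-device.conf:

Section "Device"
    Identifier "DefaultGPU"
    Driver "modesetting"
EndSection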
Do you have an HDMI cable to try? It could be that only the DVI port on the card is defective. With the NVidia drivers removed and nouveau un-blacklisted, you could also try putting the GT9400 back in to make sure the DVI cable didn't come loose at a connector when you made the card switch.
According to the Dell PDF, your monitor shipped with a dual link DVI cable.
This whole thread seems like déjà vu of another thread here less than a year ago that ultimately had a head-scratchingly simple solution.
Well, it’s been an interesting afternoon. Here’s the story.
1. Remove nvidia driver and reboot. Screen apparently at high resolution but unstable and barely readable.
2. Delete "2560x1440" from boot options - no difference.
3. <Ctrl><Alt>F1 and login as root. In YaST, remove kernel-firmware-nvidia. Reboot - no X.
4. <Ctrl><Alt>F1 and login as root. In YaST, install xf86-video-nouveau. Reboot - X window OK but resolution limited to 1680x1050 (Settings -> System Settings -> Display and Monitor).
5. In YaST, change Boot Loader -> Kernel Parameters -> Console Resolution from 2560x1440 to "Autodetect by grub2". Reboot - no change.
6. Uninstall xf86-video-nouveau. Reboot - as before, no X.
7. Try HDMI output - OK, with a max resolution of 1920x1200 possible via System Settings.
8. Swap back to DVI output - no X.
9. Exchange GT1030 card for old GT9400 - working fine at 2560x1440, xf86-video-nouveau still not installed.
From these tests, does it appear that the GT1030 card is faulty? The GT1030 was recommended by nvidia customer support as a replacement for the GT9400 and the specification is more than adequate.
I’ll certainly search for that thread - I like simple solutions.
You certainly seem to have acquired a defective GT1030. If you haven’t exhausted the free return period at the place of purchase, I would return it rather than exchange it for a non-defective one. That would give you time to evaluate whether the performance of the GT9400 using either of the FOSS drivers is adequate to your needs. Not needing the proprietary driver makes TW’s frequent kernel upgrades a non-issue, just as with most FOSS software, and gives you the same zypper dup experience AMD and Intel GPU users have. If performance falls short of your expectations or needs, you can buy another, or a different model. I would buy one with at least one DisplayPort output. They seem to give the least trouble and the broadest capability, and like HDMI, they provide audio to the display’s speakers through the video cable if desired.
Thank you for your diagnosis and for all the time you and others have so kindly put into this thread. If it’s any consolation, your suggestions have certainly been of great value to me as a learning exercise as well.
I have been using nvidia graphics cards for quite a long time. I would love to use a card which works with native drivers, but it’s very difficult to find decent ones. To be fair, up to now I have only had to do a couple of rollbacks and wait for a day or two for the drivers to catch up with a kernel upgrade, even on TW.
The old nvidia graphics card has served me well, but I have never been able to use FOSS drivers with it. A few years back the performance was so bad that if I dragged a window across the screen it moved in a series of jumps. More recently the performance with the nouveau driver has been almost acceptable, but when doing some streaming in HD I found the speed was not up to the job.
I don’t need a very powerful card as I only do 2-D stuff and no gaming, so my ideal spec is reasonable 2-D performance with as low power consumption as possible. I will do a bit more research as the GT1030 can still be rejected as not fit for purpose if it does turn out to be faulty.
Once again, many thanks to you all for your efforts.
Having done further investigation, I find that the one thing I did not check (because nvidia recommended this card as a replacement for the GT9400 after I had given them all my requirements) was that the card appears to be single-link DVI, despite having a dual-link connector. I am very annoyed and ashamed to have put you to all this trouble over something so obvious.
Please accept my apologies for this. I will now have to specify a different card and the way I feel, it’s unlikely to be nvidia.
I am really really surprised that NVidia would currently produce, or its support recommend, a card that doesn’t support 4k or even 2k. I doubt there’s been an AMD or Intel GPU shipped in the past 8-10 years that does not. IME, even all the AMD IGPs do 4k on at least one connector if not all, and I’m pretty sure the DisplayPort standard since at least 1.2 includes 4k as a minimum. That could well be why the GT1030 has no DisplayPort.
Before you spend money again, give the GT9400 with the modesetting DDX a good try. Once the NVidia driver is eradicated, simply zypper rm xf86-video-nouveau and restart X or reboot. From #9 in comment #28 it appears this is already the current state. If with the modesetting DDX you determine it’s still too slow, as it was with the nouveau DDX, then it will be time to shop for another card.
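If in doubt which DDX actually ended up loaded, the X log will say (assuming X runs as root so the log is /var/log/Xorg.0.log; a rootless session may write it to ~/.local/share/xorg/ instead):

grep -E 'LoadModule: "(modesetting|nouveau|nvidia)"' /var/log/Xorg.0.log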
Don’t be sorry for coming here for help. Most of us regulars are here to solve problems whatever the cause. We want openSUSE users to be happy users. For me, problem solved is thanks enough.
You guys are far too kind. The world will be grateful to hear I don’t get myself into trouble all that often, so I tread fairly lightly over the various forums as a rule. I am always very grateful for the time you people spend getting the likes of me sorted out, so I’m very annoyed with myself when I fail to check things properly. I have to say this forum is very active and I have never before experienced such a speedy and helpful response to a problem.
Your point about the old video card is well made and I will look afresh at the situation with modesetting DDX (which is indeed the current setting). It seemed to me that the odd bit of streaming I do was just slightly jittery. In the past, the performance of this card without the nvidia drivers was really bad, so that has probably coloured my attitude.
Sadly however, there is another reason I will need a new card. A couple of months ago, the fan started to get very noisy. I managed to inject enough oil into the bearing to quieten it again, but the card is 9 years old, so probably won’t last much longer. Like you, I was surprised that nvidia still produce and recommend a card which cannot support at least UHD, but it is now also getting much rarer to find one with DVI output. My monitor is the same age as the old video card and has no DisplayPort connector, so it would be a huge additional expense to replace that if I did not have DVI output.
The market is very heavily weighted toward gaming, but apart from the occasional streaming, the biggest demand on my graphics is editing photos, which I spend a lot of time doing. I therefore don’t need monster graphics cards; it’s not so much their cost as the enormous power they need. My computer is generally on for 16-17 hours per day and I’m trying to keep my carbon footprint to a reasonable level. Having had a look at what’s available, I am leaning toward the AMD Radeon RX550 which as far as I can see is supported with FOSS drivers. The nearest nvidia equivalent which does have dual link DVI output is the GTX 1050, which has much better performance (which I don’t need) but will present the same proprietary driver problem.
You have a very happy, if rather abashed openSUSE user here!
Thanks for the suggestion - I’ll check out replacement fans.
Does your PC or motherboard model have onboard video with DVI out?
Maybe with a 9-year-old PC it would be time to think about a whole system upgrade with integrated video, which would be a lot more energy efficient?
The PC does not have any on-board graphics, but I hesitate to consider a complete system upgrade. The CPU is a 4-core Intel i5 2.67GHz, although I have no idea which generation. It is provided with 16GB of DDR3 RAM and an SSD system disc. With openSUSE Tumbleweed KDE it does seem fast enough for my purposes. I have also changed the original power supply for a much quieter one. It’s hard to see the cost of replacing that lot, or even just the motherboard, CPU and RAM, would be worthwhile as things stand.
I have turned out to be wrong on a couple more things. Looking hard at the video performance while streaming, I believe the graphics system may be coping, although I don’t have anything to compare it with. The glxgears test, for what it’s worth, is fairly smooth at 60fps although it shows some slight line tearing at full screen. In addition I have just discovered that the monitor does have a DisplayPort connector after all, which takes the pressure off buying a new graphics card while they are still obtainable with DVI outputs.
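As an aside, glxgears normally syncs to the display refresh, so a steady 60fps mostly just confirms vsync; to get a rough uncapped number I could try something like this (assuming the Mesa drivers honour the vblank_mode variable):

vblank_mode=0 glxgears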
Exactly which model i5 do you have? If it provides a GPU, then simply a motherboard replacement, using your existing DDR3 and CPU, could be made for as little as half the price of a GFXcard:
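Either of these will show the exact model:

lscpu | grep 'Model name'
grep -m1 'model name' /proc/cpuinfo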
Probably more than the cost of a replacement fan, though!
In fact, on this side of the pond, the cost of a 2GB AMD Radeon RX550 is a little less than that of a suitable replacement motherboard, so it doesn’t seem worth the hassle. The RX550 has a power requirement of just under 50W when working flat out and 7W idling, which it would be doing most of the time.
I think if I were to consider replacing the MB I would be looking to do a serious upgrade to modern hardware, but it doesn’t seem necessary at the moment.
Thanks all the same, for your constructive suggestions.
Last time I bought a brand new PCIe graphics card its max power requirement was 19W. I try hard to avoid GPUs that need active cooling. IIRC, I have one with active cooling in service, and one on the shelf. The rest are all passive, or on the CPU die.
Your .sig doesn’t provide any means to know whether there is pond or mountain or anything else in between there and here.
Hi
The Zotac GT710s (passive cooling) I have run at 25W, probably less, since I only use the CUDA cores on two of them (the PCIe x1 versions); the other is in a PCIe x4 slot for GPU passthrough and doesn’t seem to have any power issues. I am looking at putting in a Zotac GT1030 (passive cooling) instead, for a few more cores.