I have a 32-bit laptop that I hadn’t used in several days and this evening was a little surprised to find that I had 1000+ Tumbleweed updates to install. (Yikes! Another TeX/LaTeX mass update?) Even more surprising, though, were the repeated segmentation faults that occurred as I ran “zypper dup”. Occasionally, zypper would simply hang when beginning the retrieval of the next package. (The cutesy status message would stick at “Starting” and the CPU would peg at 100% until I Ctrl-C’d back to a command line.) I was lucky to get 10-20 packages retrieved before zypper failed with a “Segmentation fault” error; most of the time it failed after grabbing only 3-4. I could restart the command and it would skip past all the packages that were already in the cache, but I never got them all retrieved. I was not keen on recalling that command line a few hundred times just to get the packages downloaded, let alone the number of times – and hours – it would have taken to get them all installed. I wound up downloading the 32-bit ISO for 20191119 and performing the update that way. Once that was complete, I found there were 220 or so additional updates that I was able to install successfully via “zypper dup”. The version of zypper installed from the ISO image was 1.14.32.

Were there known problems with the previous 32-bit version of zypper?
I have a 64-bit system that’s running zypper 1.14.32, and it’s due to update that to 1.14.32-1.1. I would normally interpret that as slightly newer code (a security/bug patch?) than what just got installed on the 32-bit laptop (assuming that the 32-bit and 64-bit code bases maintain the same version numbering).

Should I be worried about that system potentially having trouble retrieving and installing updates?
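For what it’s worth, rpm treats “1.14.32-1.1” as version 1.14.32 with package release 1.1, i.e. the same upstream code in a newer package build. zypper can compare the two strings for you (assuming zypper is in good enough shape to run at all):

```shell
# zypper's versioncmp (alias: vcmp) subcommand compares two RPM
# version strings and reports which one is considered newer.
zypper versioncmp 1.14.32 1.14.32-1.1
```

It should report the -1.1 build as the newer of the two.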
TIA…
How much memory is in the 32-bit machine? With that many updates, you could have just run out.
2GB of main memory plus 2GB of swap. Plenty of free space on the root partition, too (~60 GB). That was enough to handle an update session involving 2500+ packages a month or so ago. Surely zypper wasn’t designed to load everything into memory before installation.
It’s probably this bug: https://bugzilla.opensuse.org/show_bug.cgi?id=1156481
If it is, you’ll have to manually download the updated curl/libcurl4 packages and then use rpm -U to install them.
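A rough sketch of that workaround, in case it helps — the repository path and the package file names below are placeholders/assumptions; check the bug report and download.opensuse.org for the actual fixed versions for your architecture:

```shell
# Fetch the fixed packages with wget, since curl/libcurl (which zypper
# uses for downloads) are the broken parts. File names are placeholders.
wget https://download.opensuse.org/tumbleweed/repo/oss/i586/libcurl4-<fixed-version>.i586.rpm
wget https://download.opensuse.org/tumbleweed/repo/oss/i586/curl-<fixed-version>.i586.rpm

# Install both in one transaction so the library and the curl binary
# stay in sync, then retry "zypper dup".
sudo rpm -U libcurl4-<fixed-version>.i586.rpm curl-<fixed-version>.i586.rpm
```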
Mark
Interesting. I have a laptop with 16GB of RAM running Tumbleweed and I was also experiencing repeated segmentation faults running zypper with that same snapshot. I had plenty of free memory, swap and disk space. For me, however, it appeared to have something to do with the network, because it would segfault while trying to fetch packages. Once it FINALLY finished downloading all the packages, the update process completed without issue. I’m running Network Manager and wondered if it was causing some kind of issue that was tripping zypper up.
… so yeah, the curl bug in my case makes sense.
Quite possibly that’s the culprit. I saw that two curl-related packages were affected by one set of updates.
I’ll check that bugzilla submission more closely and see if my 64-bit system that’s due for some updates is affected by it.
Thanks for the pointer to that.
I thought the same thing for a time, because we thought we’d seen a short-term glitch that clobbered the streaming TV we were watching earlier in the evening. Nobody was experiencing any problems with internet access when my update retrieval problems were occurring, though. And I had no trouble at all pulling down the 20191119 ISO image. Of course, today Comcast went down hard for an hour or so. No way to tell if the two are related in any way, and trying to get information like that out of Comcast would almost certainly be a waste of time.