Unneeded packages

I know this has been asked in the past on various forums but I’m trying to find the most up-to-date answer!

When I used Ubuntu I used the command *apt autoremove* to remove all packages that had become redundant. As I understand it, apt considers them packages that were installed as a dependency of a program the user installed but has since removed.

I’m trying to find something similar with zypper.

I have read that *zypper packages --unneeded* will do something similar, but it seems to be missing some packages that I would have expected to be on the list.
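For anyone who wants to script around that report: the table below is a mocked-up sample of zypper's '|'-separated output (not captured from a real system), and the related `--orphaned` flag produces a table in the same shape, so the same extraction works for both:

```shell
# Mocked-up sample of `zypper packages --unneeded` output ('|'-separated table);
# `zypper packages --orphaned` prints the same layout.
sample='S | Repository | Name | Version | Arch
--+------------+------+---------+------
i | Main       | gcc  | 7.5.0   | x86_64
i | Main       | make | 4.2.1   | x86_64'

# Skip the two header lines and pull out just the Name column:
names=$(echo "$sample" | awk -F'|' 'NR>2 {gsub(/ /,"",$3); print $3}')
echo "$names"
```

The extracted names can then be reviewed, or fed to `zypper rm` once you trust the list.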

e.g. yesterday I installed phoronix-test-suite, then I installed the test iozone. iozone needed a bunch of things that I didn’t have already… the only two I can remember are gcc and make. I then uninstalled iozone and PTS, ran *zypper packages --unneeded*, and gcc and make are not on the list and are still installed. I know they were only installed for iozone, which I don’t have anymore. Surely they are “unneeded”.

Don’t get me wrong, I’m not offended by having gcc and make installed (I expected to have them already!) but with a 128GB SSD I would like to feel that I don’t have anything unnecessarily installed. It also makes me reluctant to try new software if I can’t remove everything should I uninstall.

Thanks for any help!

If I were a software management package and saw gcc installed, I would think that the system owner is a developer and would never classify it as “unneeded”. I am of course much more stupid than a software management package, but IMHO your definition of “unneeded software packages” is probably different from zypper’s definition.

@OS499846:

With YaST “Software”, take a look at the “Patterns” – “make” is pulled in by the “Enhanced Base System” pattern – the “Software Management” pattern, needed for patching and updating, pulls in “zypper”, which needs some “C”, “C++” and “gcc” libraries. I suspect that “GCC” is being pulled in by dependencies of those libraries, and that the Patch/Update processes occasionally need to make and compile code as part of the procedure …
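For anyone who wants to trace this from the command line rather than YaST, a few query commands can show where a package comes from. Shown as a transcript; the pattern name `enhanced_base` is my guess at how the “Enhanced Base System” pattern is identified, so list the real names with the first command before relying on it:

```shell
zypper search -t pattern                 # list available/installed patterns
zypper info -t pattern enhanced_base     # packages a given pattern pulls in
rpm -q --whatrequires make               # installed packages that require make
```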

Yes, I agree.

I’m not sure why it even matters. Today’s disk sizes are large enough that you don’t have to worry about every byte.

For me, the main thing important with the Ubuntu cleanup, was removing older kernels.
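On openSUSE that particular cleanup is built in: libzypp keeps only a configured set of kernels and the purge-kernels service removes the rest at boot. A sketch of the relevant settings, quoted from memory of a default /etc/zypp/zypp.conf, so verify against your own file:

```
## /etc/zypp/zypp.conf (excerpt)
multiversion = provides:multiversion(kernel)
multiversion.kernels = latest,latest-1,running
```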

I am not aware of any RPM manager that has a function similar to “apt-get autoremove”.
IMO, for something like that to work, a database (which might be as simple as an XML file) has to be created and populated on every installation, tracking dependencies so that when you delete a package you can be sure it’s not still needed by another installed package. AFAIK that doesn’t currently exist except with recent aptitude (AFAIK this command didn’t exist a few years ago).
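For the record, this is roughly how apt implements it: every package carries a “manual” vs “automatically installed” mark, and autoremove deletes auto-marked packages that nothing still depends on. A transcript sketch of the Debian/Ubuntu commands:

```shell
apt-mark showauto         # packages installed only as dependencies
apt-mark showmanual       # packages the user asked for explicitly
sudo apt-get autoremove   # remove auto-marked packages nothing still needs
```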

There are various zypper commands which supposedly identify orphaned packages, but IMO those are likely other things that don’t match up exactly with verifying whether a dependency still exists for a package. And my understanding and use of “clean” is to wipe the package cache so that it can be rebuilt, which again is something different.

This kind of command can be useful, lessening if not eliminating package crud which, over time, can slow a package search or cause unnecessary disk usage when disk space is limited (e.g. distributable solutions like container/cloud apps; in fact, this is one of my main criticisms of the Azure platform: you can’t avoid crud buildup from image modifications like common updates until you decide to refactor the image completely).

TSU

**@OS499846 **
Absolutely, you should be wary of trying things when you suspect that removal would leave untracked debris behind.

Some solutions

  • If you’re installed on Btrfs, you can roll back to before your experiment. But since snapshots are typically based on an entire partition, you’ll roll back *every* change to your system since the snapshot was made. Btrfs supports “undeleting” individual files, but that won’t help you in this case. Still, this solution can be considered if your experiment lasts only a few hours or a day at most, or if you’re willing to re-apply everything else that happened in the meantime, like system updates and other app installs/removals. Also, consider that by default a snapshot is made automatically before and after every libzypp operation (zypper or YaST) and at system shutdown and boot.

  • Experiment in a virtual machine by installing something like VirtualBox. Then your experiments are entirely disposable and do not affect your base system, which can be kept pristine.
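The rollback in the first option can be sketched as follows, assuming the default snapper-on-Btrfs setup; snapshot number 42 is a placeholder, so pick the appropriate “pre” snapshot from your own list:

```shell
snapper list              # numbered snapshots; zypper/YaST create "pre"/"post" pairs
sudo snapper rollback 42  # make snapshot 42 the new default root, then reboot
```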

TSU

As both a compiler (not just libraries) and “make” were required, I tend towards this not being an RPM install, but a tarball compile.


# zypper rm --clean-deps iozone
       Automatically removes unneeded dependencies.

This function is also available in YaST, but only when removing .rpm packages. I have in the past reinstalled a package in order to remove it with this option.

One can search for orphans, which identifies packages that do not exist in an enabled repository. Searching for packages that are not required by others lists a lot of really useful software.

Of course. Many of those that are at the “top level” of the dependency tree, and thus not needed by other packages, are needed by the user as applications. That is more or less by design.

That’s an interesting command, but it works only for a specified package.
So let’s say, for instance, that the package has already been removed, leaving “orphaned dependencies.”
The zypper command won’t support removing that crud… whereas the first time I ran “apt-get autoremove” on a system, my impression was that it did a fairly complete removal, even for packages that hadn’t existed on the system for quite a while (I’m hoping my “impression” wasn’t mistaken).

I wonder if people might prefer the behavior of “zypper rm --clean-deps package_name” over the behavior of the default “zypper rm package_name”.
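A rough emulation of autoremove can be scripted from zypper’s own report, with the big assumption that `zypper packages --unneeded` lists exactly what you want gone; eyeball the list before letting zypper act on it:

```shell
# Assumption: everything `--unneeded` reports really is disposable --
# review the list first, since zypper's idea of "unneeded" may not match yours.
zypper packages --unneeded \
  | awk -F'|' '/^i/ {gsub(/ /,"",$3); print $3}' \
  | xargs --no-run-if-empty sudo zypper rm --clean-deps
```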

TSU

Bandwidth wasted at update time. Time spent updating. Backup/restore time. Backup media size. Enough little pieces eventually add up to bloat that doesn’t fit the backup media.

Today’s disk sizes are large enough that you don’t have to worry about every byte.
It’s not an excuse to be wasteful either. Waste in one area here, another there, leads to a generally wasteful lifestyle & philosophy.

Hi
You can also look at the requires from:

https://build.opensuse.org/package/binary/benchmark/phoronix-test-suite/openSUSE_Leap_15.1/x86_64/phoronix-test-suite-7.8.0-lp151.30.1.noarch.rpm

https://build.opensuse.org/package/binary/benchmark/iozone/openSUSE_Leap_15.1/x86_64/iozone-3.483-lp151.18.3.x86_64.rpm

Compared to updating Tumbleweed, this seems minor.

Backup/restore time. Backup media size.

I don’t backup the root file system. I only backup “/home” and other data partitions. I rely on the ability to reinstall, in case of a failed system partition.

Waste in one area here another there leads to a generally wasteful lifestyle & philosophy.

Maybe that happens for you. I follow a relatively simple lifestyle. Not having to worry about removing unneeded packages helps keep it simple.

When the next Leap release comes out, I do a clean install (but keep “/home”). That does a good enough job of cleaning out the accumulated debris. I may also remove “.config”, “.local” and “.cache” to clean out settings debris.

Hi
These days I see little point in backing up the OS; I can roll out a new install and have it reconfigured in less than an hour. For Leap and SLE (x86_64 and aarch64) I download once and deploy to many via SuMA, Tumbleweed on the fly… I tend to use qemu (fewer physical systems, a bonus!) more these days for my test setups. At the end of the day, different strokes for different folks…