Dialup / Offline software updates

Pardon me if this is not the right forum. I thought there used to be one about software updates. So it was either this one or the Install one.

It’s happened again. I made a 30-mile round trip to the library (had other things to do, too) to get the repodata for the updates. I noticed the update date on the files was the 23rd and today was the 29th. I checked right before I left and the files were still from the 23rd. So I tried to be optimistic. By the time I got home and tried to use the files, they had updated them.

After finding that merely copying the files into the repodata directory did not work, what I normally do in YaST is run the Online Update and, as soon as it starts and creates the temporary update directory in /var/cache/zypp/raw/, click abort. This gives me 30 seconds to copy my files into the directory; the update, seeing that the server files and the directory files are the same, then moves them over and all is fine. When the files have been updated on the server in the meantime, I’m out of luck with the knowledge I have.

The reason I have to go to the library is that the repodata files, filelists and primary, are over 14MB! That is basically out of reach for me on dialup. Once, I thought I would try early in the morning, hoping they wouldn’t update them during that time. I verified they had just been updated the previous day and started my download manager. After several hours, I noticed it was having problems at 90% on the last one. I found out they had updated them during that time.

So the question is, am I out of luck? I realize I can’t get the current updates, but could I at least use the ones I just downloaded? It would work for a while. If one of the software files is updated and they remove the one in my older filelist, then I’m out of luck, but at least I might have a few days to get things updated. Is there some way to tell the update software to use the current files I put in the repodata directory?

I had turned off all the automatic updates I could find to prevent my internet use from basically being disabled, so I don’t know if that has an effect. I couldn’t manage to do that in 12.1, so if I remember, I load Software Management in the background to prevent packagekit from running.

Well, in your case I would suggest just uninstalling PackageKit and all its dependencies:

sudo zypper rm PackageKit

Edit:
And you may want to change this in /etc/zypp/zypp.conf:

##
## Amount of time in minutes that must pass before another refresh.
##
## Valid values: Integer
## Default value: 10
##
## If you have autorefresh enabled for a repository, it is checked for
## up-to-date metadata not more often than every <repo.refresh.delay>
## minutes. If an automatic request for refresh comes before <repo.refresh.delay>
## minutes passed since the last check, the request is ignored.
##
## A value of 0 means the repository will always be checked. To get the opposite
## effect, disable autorefresh for your repositories.
##
## This option has no effect for repositories with autorefresh disabled, nor for
## user-requested refresh.
##
# repo.refresh.delay = 10

On 2013-08-29 23:26, dt30 wrote:

> It’s happened again. I made a 30 mile round trip to the library (had
> other things to do, too) to get the repodata for the updates. I noticed
> the update date on the files was the 23rd and today was 29th. I checked
> right before I left and the files were still the 23rd. So I tried to be
> optimistic. By the time I got home and tried to use the files, they had
> updated them.

I understand the basic problem: you have a very slow connection, so you
go to the library to do the downloads. But I don’t understand the exact
solution you try to do.

Let me see…

You have to remove packagekit, that’s for sure.

If I had the resources, I would “simply” download and mirror the entire
update repo, and oss/nonoss repos, into a removable hard disk. A library
will probably complain if you download several gigabytes on their network.

Those 30 miles are a problem…

I can think of a solution needing two trips, unhurried.

First, you have to replicate the structure of the update repos on your
computer: for now empty, only the directories, populated only with the
repository metadata.

Then, reconfigure your system so that it tries to update from those
local repositories, no internet at all.
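A minimal sketch of that setup, assuming a made-up local path and repo alias (“local-update”); the guard around zypper is only so the sketch runs anywhere:

```shell
#!/bin/sh
# Hypothetical local mirror path; adjust to taste.
REPO="${REPO:-/tmp/local-update-repo}"
mkdir -p "$REPO/repodata"
# 1. Copy the repodata files brought back from the library into it, e.g.:
#      cp /media/usbstick/repodata/* "$REPO/repodata/"
# 2. Register the directory as a plain local repo (needs root, and only
#    makes sense on an openSUSE box, hence the guard). Autorefresh is off
#    by default for newly added repos, which is what we want here.
if command -v zypper >/dev/null 2>&1; then
    zypper ar "dir://$REPO" local-update
fi
```

The rpms fetched on a later trip would go into the same tree, so the package manager finds both metadata and packages locally.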

I think that zypper patch will work up to the moment it asks if the
list is ok and asks for permission to proceed. What I’m not sure of is
whether it will show the package list in sufficient detail, but the idea is
to get that list, make a second trip to the library, and obtain those
packages - it does not matter if they are a week old, because your
machine has the old metadata referring to them.

Then you put those rpms you downloaded in the library in the appropriate
places on your computer local repository… and tell zypper to continue
or start again.
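A sketch of capturing that list, using zypper’s standard list-updates subcommand; the output file name is arbitrary, and the guard is only so the sketch runs on any machine:

```shell
#!/bin/sh
# Write the names and versions of pending updates to a file to carry along.
OUT=/tmp/updates-needed.txt
if command -v zypper >/dev/null 2>&1; then
    zypper --no-refresh list-updates > "$OUT"
else
    : > "$OUT"   # zypper not present on this machine; leave an empty list
fi
```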


Cheers / Saludos,

Carlos E. R.
(from 12.3 x86_64 “Dartmouth” at Telcontar)

I see I may have not been very clear. I don’t take my computer to the library. It’s a desktop and would be a real pain. I take a flash drive and download the repodata files by clicking on each one and saying “save”. Too bad there aren’t delta repodata files. I can’t imagine the whole list being changed very often.

I’m not sure packagekit is the problem. It’s only a problem when I boot into 12.1 with it trying to update when I dial in. Don’t I need it to update the actual files? Especially since, if I can get the new list installed, I dial in and let the system download the delta rpms. It’s not so bad, but some deltas are several megabytes and I can’t find a way to copy the deltas on and have the system use them. When the list gets old, the deltas (and maybe the rpms) are no longer there, only the new updated ones. Therefore, my system is looking for the old ones in the list and can’t find them.

By the way, when I say “copy them on”, I mean copy them to a download directory that is set up in my repositories.

Now maybe I’m going about this all wrong. It makes sense to me, but maybe there’s a better way. What would be nice is something I saw on the website about custom distros for installation: it lets you configure a certain setup and then create an ISO. When I went to install 12.3, if my 12.1 system could instead have looked at itself and had the server create an ISO of everything for 12.3 that I had in 12.1, it would have been very convenient. The full DVD ISO is quite big when I don’t need it all. For updates, a similar thing would be nice: create an ISO of all updates. Then, a couple of times a year, or more, I could download the ISO to my flash drive, bring it home, and have my system update from it. Or, for updates, somehow gather all the files that my specific system needs and copy them all at once in one file.

I don’t think anything like that exists now, so manually copying the repodata files, hoping I can get home fast enough, and then letting the package system download and install the needed deltas is the only way I know. And when a delta (or rpm, if there is no delta) is taking way too long, I abort the process, write down the file name, and manually copy the rpm file the next time I go.

On Fri, 30 Aug 2013 17:36:02 +0000, dt30 wrote:

> I take a flash drive
> and download the repodata files by clicking on each one and saying
> “save”.

I would be inclined to make a bootable flash drive and use rsync
to sync the repository or repositories you want updates from.

I actually use rsync myself so I have a local repo of the updates
(because I have 3 machines that I update with openSUSE 12.2 updates - no
sense in downloading the updates three times over the 'net).

You just need to make sure you have a good-sized flash drive. If your
library won’t let you boot from a flash drive, you can also install cygwin
on the flash drive and use rsync from there.

Jim


Jim Henderson
openSUSE Forums Administrator
Forum Use Terms & Conditions at http://tinyurl.com/openSUSE-T-C

There used to be a project for openSUSE where all the packages needed to install codecs offline were put into a compressed archive. You could download the archive, unpack it to a directory, and finally add the directory as a repository.

Perhaps something similar could be done for offline users. I wish I had the bandwidth to host it; I would try to spin a DVD image of the upgrade repository.

I should note this was an unofficial project.

On 2013-08-30 19:36, dt30 wrote:
>
> I see I may have not been very clear. I don’t take my computer to the
> library. It’s a desktop and would be a real pain. I take a flash drive
> and download the repodata files by clicking on each one and saying
> “save”. Too bad there aren’t delta repodata files. I can’t imagine the
> whole list being changed very often.

That was clear, I knew you do not take the computer to the library.

>
> I’m not sure packagekit is the problem. It’s only a problem when I boot
> in 12.1 with it trying to update when I dial in. Don’t I need it to
> update the actual files?

No.

You were using zypper or yast. With zypper, you can tell it not to
refresh repodata. You do not need an application that is routinely
trying to automatically update the repodata on its own.
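For example, with zypper alone the refresh stays entirely under your control; both flags are standard zypper options, though “repo-update” is only the usual alias and may differ on your system:

```shell
#!/bin/sh
# Guarded so the sketch runs anywhere; on the real openSUSE box, run as root.
if command -v zypper >/dev/null 2>&1; then
    zypper mr --no-refresh repo-update   # turn autorefresh off for this repo
    zypper --no-refresh patch            # and skip refreshing for this run too
fi
echo done > /tmp/norefresh_sketch_ran    # sentinel, so the sketch is checkable
```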


Cheers / Saludos,

Carlos E. R.
(from 12.3 x86_64 “Dartmouth” at Telcontar)

On 2013-08-31 03:11, Jim Henderson wrote:

> I actually use rsync myself so I have a local repo of the updates
> (because I have 3 machines that I update with openSUSE 12.2 updates - no
> sense in downloading the updates three times over the 'net).

For that, I use a shared folder over NFS for /var/cache/zypp/packages/.
I have all repos configured to save rpms; the first computer downloads
them, the second reuses them. I just need to avoid updating two computers
at the same time.


Cheers / Saludos,

Carlos E. R.
(from 12.3 x86_64 “Dartmouth” at Telcontar)

On 2013-08-31 04:26, nightwishfan wrote:

> Perhaps something similar to that can be done for offline users. I wish
> I had the bandwidth to host it, I would try to spin a dvd image of the
> upgrade repository.

I don’t know if it would fit on a single DVD.

What these people need is a method to calculate a list of files to
download, so that another computer can download them.

Perhaps generate the list of currently installed packages, and with that
go to the other computer, connect to find the updates, generate the
list, and download the updates - which is much less than the entire repo.

If this code were JavaScript, it could run in the browser on any
computer, or at the library.


Cheers / Saludos,

Carlos E. R.
(from 12.3 x86_64 “Dartmouth” at Telcontar)

On Sat, 31 Aug 2013 11:03:10 +0000, Carlos E. R. wrote:

> On 2013-08-31 03:11, Jim Henderson wrote:
>
>> I actually use rsync myself so I have a local repo of the updates
>> (because I have 3 machines that I update with openSUSE 12.2 updates -
>> no sense in downloading the updates three times over the 'net).
>
> For that, I use a shared folder over NFS for /var/cache/zypp/packages/.
> I have all repos configured to save rpms; the first computer downloads
> them, the second reuses them. I just need to avoid updating two computers
> at the same time.

That’s an interesting idea - I share my local repos using NFS, so the
update still runs, but your way would only download the needed packages
rather than everything.

Then again, I’ve got a 2 TB drive, so having everything available isn’t a
problem (unless I sync up factory again - I actually ran out of disk
space when I was doing that).

Jim


Jim Henderson
openSUSE Forums Administrator
Forum Use Terms & Conditions at http://tinyurl.com/openSUSE-T-C

On 2013-08-31 19:45, Jim Henderson wrote:
> On Sat, 31 Aug 2013 11:03:10 +0000, Carlos E. R. wrote:

>> For that, I use a shared folder over NFS for /var/cache/zypp/packages/.
>> I have all repos configured to save rpms; the first computer downloads
>> them, the second reuses them. I just need to avoid updating two computers
>> at the same time.
>
> That’s an interesting idea - I share my local repos using NFS, so the
> update still runs, but your way would only download the needed packages
> rather than everything.

The exact procedure is a bit more complex.

fstab:


> server.name:/data//repositorios_zypp     /var/cache/zypp/nfs_packages    nfs4    noauto,nofail,_netdev 0 0

I mount manually running:


mount /var/cache/zypp/nfs_packages ; ls /var/cache/zypp/nfs_packages

The ‘ls’ verification is needed because of the ‘nofail’ option (if the
mount fails, it says nothing).

That directory contains one subdirectory for each release:


AmonLanc:~ # tree -d /var/cache/zypp/nfs_packages
/var/cache/zypp/nfs_packages
├── 11_2
│   └── packages
│       ├── EXT.P_NVidia
....
├── 12_3
│   ├── EXT_Packman
│   │   ├── Essentials
│   │   │   ├── i586
│   │   │   ├── noarch
│   │   │   └── x86_64
│   │   ├── Extra
│   │   │   ├── i586
│   │   │   ├── noarch
│   │   │   └── x86_64
│   │   └── Multimedia
│   │       ├── i586
│   │       ├── noarch
│   │       └── x86_64
│   ├── OBS_Games
│   │   ├── noarch
│   │   └── x86_64
│   ├── OBS_Gnome_Apps
│   │   ├── noarch
│   │   └── x86_64
│   ├── OBS_KDE3
│   │   ├── noarch
│   │   └── x86_64
....
│   ├── repo-non-oss
│   │   └── suse
│   │       ├── i586
│   │       ├── noarch
│   │       │   └── repo-non-oss
│   │       │       └── suse
│   │       │           └── noarch
│   │       └── x86_64
│   ├── repo-oss
│   │   └── suse
│   │       ├── i586
│   │       ├── noarch
│   │       └── x86_64
│   ├── repo-update
│   │   ├── i586
│   │   ├── noarch
│   │   └── x86_64
│   └── repo-update-non-oss
│       ├── i586
│       └── x86_64
└── LocalRPMs

The names match the ‘alias’ of each repo, not the ‘name’.

Finally, the ‘/var/cache/zypp/packages/’ directory contains symlinks to
those directories:


> AmonLanc:~ # l /var/cache/zypp/packages/
> total 20
> drwxr-xr-x 5 root root 4096 Aug 30 16:40 ./
> drwxr-xr-x 6 root root 4096 Jul 25 12:54 ../
> lrwxrwxrwx 1 root root   32 Aug 12 16:21 EXT_Packman -> ../nfs_packages/12_3/EXT_Packman/
> drwxr-xr-x 2 root root 4096 Mar  6 13:58 InstallationImage/
> drwxr-xr-x 3 root root 4096 Aug 30 16:47 TEST_JM/
> drwxr-xr-x 3 root root 4096 Aug 11 15:07 openSUSE-12.3-1.7/
> lrwxrwxrwx 1 root root   31 Aug 12 16:21 repo-debug -> ../nfs_packages/12_3/repo-debug/
> lrwxrwxrwx 1 root root   38 Aug 12 16:21 repo-debug-update -> ../nfs_packages/12_3/repo-debug-update/
> lrwxrwxrwx 1 root root   33 Aug 12 16:22 repo-non-oss -> ../nfs_packages/12_3/repo-non-oss/
> lrwxrwxrwx 1 root root   29 Aug 12 16:22 repo-oss -> ../nfs_packages/12_3/repo-oss/
> lrwxrwxrwx 1 root root   32 Aug 12 16:22 repo-update -> ../nfs_packages/12_3/repo-update/
> lrwxrwxrwx 1 root root   40 Aug 12 16:22 repo-update-non-oss -> ../nfs_packages/12_3/repo-update-non-oss/
> AmonLanc:~ #

I use symlinks instead of mounting the NFS directly there because, in
case I forget to mount the NFS (or it fails), zypper will fail with a
write error, instead of filling the root partition or failing to reuse a
package that is already on the NFS.

Unfortunately, YaST fails after downloading everything, with a useless
message.

Also a symlink allows for different names on different computers.

Last hint: I create each repo with this command line:


zypper ar -k -f -n "description" "url" "alias"

After the first run, the directory is created on the root partition, not
on the NFS; it has to be moved and symlinked. The name of the directory
matches the “alias” of the repo, not the name, so I use aliases with
characters suitable for a filename.
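The move-and-symlink step can be reproduced with throwaway paths (the real ones being /var/cache/zypp/packages/ and the NFS mount):

```shell
#!/bin/sh
# Toy stand-ins for the zypper package cache and the NFS share.
CACHE=/tmp/demo_zypp_packages
NFS=/tmp/demo_nfs_packages/12_3
mkdir -p "$CACHE/repo-update" "$NFS"
mv "$CACHE/repo-update" "$NFS/"               # move the cache dir onto the share
ln -s "$NFS/repo-update" "$CACHE/repo-update" # leave a symlink in its place
ls -l "$CACHE"
```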

>
> Then again, I’ve got a 2 TB drive, so having everything available isn’t a
> problem (unless I sync up factory again - I actually ran out of disk
> space when I was doing that).

It is far easier to do. I don’t, because of the space and the download
speed: I only have 1 Mbit/s. Which of course is more than what our
friend the OP has…


Cheers / Saludos,

Carlos E. R.
(from 12.3 x86_64 “Dartmouth” at Telcontar)

This sounds interesting. After doing a man rsync, it kind of reminds me of looking at the rpm documentation: page after page of options for a command I have no idea what it does! I could not tell from the description if it only does deltas on the whole directory or on pieces of files. If it can do pieces of files, for example …filelists.xml.gz, and only copy over and merge the changes, I would like to know more about how to do it. I imagine this could work with my dialup.

As far as booting to a flash drive at the library, that would be frowned upon. I don’t know much about cygwin. Seems like I recall some sort of emulator, but I never knew you could use it without installing it or booting it.

On 2013-09-02 00:46, dt30 wrote:
>
> hendersj;2582077 Wrote:

(rsync)

> This sounds interesting. After doing a man rsync, it kind of reminds me
> of looking at the rpm documentation. Pages after pages of options on a
> command I have no idea what it does! I could not tell from the
> description if it only does deltas on the whole directory or pieces of
> files. If it can do pieces of files, for example …filelists.xml.gz,
> and only copy over and merge the changes, I would like to know more
> information about how to do it. I imagine this could work with my
> dialup.

The idea of rsync in this context is that you tell it one day to
replicate the entire update server at opensuse.org, for instance. It
takes a lot of time. But then, you do it again the next day: it only
downloads what has changed compared to your local copy.

The second day it saves time and bandwidth.

But you have to be able to run that command at the library, that’s the snag.


Cheers / Saludos,

Carlos E. R.
(from 12.3 x86_64 “Dartmouth” at Telcontar)

Sorry, I thought maybe you thought I had a laptop.

> You were using zypper or yast. With zypper, you can tell it not to
> refresh repodata. You do not need an application that is routinely
> trying to automatically update the repodata on its own.

And unfortunately, zypper, yast, packagekit, and rpm are all things I just don’t really understand. I thought they were all the same, just that yast used zypper to do the rpm commands or something.

So based upon what you are saying, packagekit doesn’t do much in my case other than automatically update the files, which I don’t want? Evidenced by my having already turned off autorefresh in all my repositories? But isn’t doing that adequate? 12.1 seems to have a problem with that, but I haven’t yet noticed 12.3 trying to autoupdate. I clicked refresh for the update to generate the temp raw repodata folder. I hope there’s a better way.

Assuming the metadata is the repodata info about the files, I have the metadata. I copied it into /var/cache/zypp/raw/repo-update/repodata/, but the system does not see the new filelists. I remembered a file version which needed to be updated, copied the repodata over, and the file version was still the same when I went to YaST and Online Update. Isn’t there some way to tell it to look in the repodata and acknowledge the newly copied files?

Having not achieved success very often, maybe my method isn’t going to work out for long. But I think the online updates that I run manually might work, since they update the deltas, and any big ones I skip and do later. Would removing packagekit no longer allow that, or, if autoupdate is not running, do I even need to worry about removing it?

I know I’ve been shown how to get a list of installed packages, and I assume it could also be a list of packages that need updating? Either way, having a current system list, if the server could take that list, work out what the updates need to be, and then generate a custom download of one file (or a few) available for a limited time, that would be great! Think there’s any chance of that becoming a possibility?
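Generating the list itself is the easy half; rpm’s query format below is standard, while the output path is just an example:

```shell
#!/bin/sh
# Save the installed package list to carry to the library (or upload to a
# hypothetical list-matching service); guarded so the sketch runs anywhere.
OUT=/tmp/installed-packages.txt
if command -v rpm >/dev/null 2>&1; then
    rpm -qa --qf '%{NAME}-%{VERSION}-%{RELEASE}.%{ARCH}\n' | sort > "$OUT"
else
    : > "$OUT"   # rpm not present on this machine; empty placeholder
fi
```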

Could JavaScript download a bunch of files automatically to a local flash drive without being a security issue?

So it would not take something like the 16MB filelist file, see there’s only about a 100KB change in it, and only download that 100KB? If not, then the online update already provides that. Except that it’s not at the library.

Now if cygwin is something that could run like a dos command, I could probably get by with running that. Then a script could be made to use the list generated from my computer and download the necessary files?

In fact, some sort of DOS wget commands? I’m not positive I could execute a DOS shell there, but if 100+ wget-type commands went through, think that would generate any unusual activity report?
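For what it’s worth, wget can read the whole list from a file, so a single command covers all the downloads; the URL below is a made-up example of the update repo layout, not a real package:

```shell
#!/bin/sh
# Build a URL list at home, one rpm per line (this one is hypothetical)...
cat > /tmp/urls.txt <<'EOF'
http://download.opensuse.org/update/12.3/x86_64/example-1.0-1.1.x86_64.rpm
EOF
# ...then at the library (cygwin's wget works the same way on Windows):
#   wget --continue --input-file=/tmp/urls.txt --directory-prefix=/media/usbstick/
```

--continue lets an interrupted download resume instead of starting over, which matters on a shared library connection.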

Regarding symbolic links, I had put one in my user/Downloads/12.1 directory to user/Downloads/phpMyAdmin since that goes across multiple OS versions. However, Software Management could not see the symbolic links. Should that work? Or is it better to put the symbolic links in /var/cache/zypp/packages?

On 2013-09-02 01:56, dt30 wrote:

> So it would not take something like the 16MB filelist file, see there’s
> only about a 100KB change in it, and only download that 100KB? If not,
> then the online update already provides that. Except that it’s not at the
> library.

Er… I’m unsure.

Let me think.

On a large file, provided the server side has an rsync daemon running
(which is the case), or that you can remotely execute commands (which is
not), rsync can be used to download only the parts that are different.
This is one of the procedures used to repair a bad DVD ISO download.

But the server side can not be http: or ftp: - it has to be rsync.

I was not thinking of that case, and neither, I believe, was Jim. The
idea was to download the entire update repository, which could be a
thousand files at least. Rsync can be used in that case to download
only the files whose size or date differ under the same name, plus the
files with new names that appear (i.e., new files only).

It is the method used to keep a mirror of the repositories or any
download server, to download only new files, to keep both in sync.

I don’t know if you are understanding me. If not, let’s hope somebody
else can explain it better.

> Now if cygwin is something that could run like a dos command,

That part I do not know.


Cheers / Saludos,

Carlos E. R.
(from 12.3 x86_64 “Dartmouth” at Telcontar)

On 2013-09-02 01:16, dt30 wrote:
>
> robin_listas;2582090 Wrote:
>> That was clear, I knew you do not take the computer to the library.
> Sorry, I thought maybe you thought I had a laptop.
>
>> You were using zypper or yast. With zypper, you can tell it not to
>> refresh repodata. You do not need an application that is routinely
>> trying to automatically update the repodata on its own.
> And unfortunately, zypper, yast, packagekit, rpm, are all something I
> just don’t really understand. I thought they were all the same just
> yast used zypper to do the rpm commands or something.

Er… no. There are some common libraries shared by all of them (libzypp,
I think), and the rpm command. But all of them are independent programs.

> So based upon what you are saying, packagekit doesn’t do much for my
> case other than automatically update the files which I don’t want?

Right.

> Assuming the metadata is the repodata info about the files, I have the
> metadata. I copied it into /var/cache/zypp/raw/repo-update/repodata/
> But the system does not see the new filelists. I remembered a file
> version which needed to be updated, copied on the repodata and the
> fileversion was still the same when I went to Yast and Online Update.
> Isn’t there some way to tell it look in the repodata and acknowledge the
> newly copied files?

I don’t know which exact files contain the repository metadata, or
where to store them locally so that “things work”. This is something we
never do, because it happens automatically when you have cheap internet.
I can imagine the idea and what to do, but not the exact details needed
to tell you.

From the update repo the ‘metadata’ is:

file: openSUSE:12.3:Update.repo

directory: repodata/

And the ‘data’ would be the ‘noarch’, ‘nosrc’, ‘x86_64’ directories, at
least.

> Having not achieved success very often, maybe my method isn’t going to
> work out for long. But I think the online updates that I run manually
> might work since they update the deltas and any big ones I skip and do
> them later. Would removing packagekit no longer allow that, or if
> autoupdate is not running, do I need to worry about removing it?

I don’t think it would affect anything for the worse. It would let you
decide exactly when to update what, or nothing at all, even the metadata.

When I’m not at home with my laptop, I make sure that packagekit is
removed. My cellular connection is limited to 500MB/month; I can not
really afford automatic updates, not even checking for them. For that I
go to a friend’s home and use his connection. Or to a shopping mall,
perhaps (never succeeded there, I’m afraid). And I carry the laptop. You
can not. :-(

So far I have had no need to develop a method for going to a library with
only a disk. I know how to do it from one computer to another using a
disk, but with one near the other, not 30 miles apart. My method needs
two trips minimum.


Cheers / Saludos,

Carlos E. R.
(from 12.3 x86_64 “Dartmouth” at Telcontar)