Wget https repository support for offline computer

Hey guys. I'm trying to use apt-offline for my offline/cold computer. It's a real pain in the @@@. To make a long story short, my offline computer has a faulty BIOS which requires a minimal Ubuntu live USB install, so a few missing dependencies (one or two) are required to get apt-offline working on the offline computer. I figure a workaround for this issue is using wget, but wget defaults to http… are there any issues with using http to retrieve files from repos, and if so, should I recompile wget to use only https? Do repos even support that, or is it a waste of time?

One last question…when using apt or apt-get, is that a secure connection or not?

Nevermind the last question. I should think before I talk:

#sudo apt update
Hit:1 http://us.archive.ubuntu.com/ubuntu focal InRelease
Get:2 http://us.archive.ubuntu.com/ubuntu focal-updates InRelease [111 kB]
Get:3 http://us.archive.ubuntu.com/ubuntu focal-backports InRelease [98.3 kB]
Get:4 http://us.archive.ubuntu.com/ubuntu focal-security InRelease [107 kB]

Also, installing apps (which downloads them first) doesn't happen over a secure connection either:
e.g.:
#sudo apt install nethack-common

jt@dell:~$ sudo apt install nethack-common
Reading package lists… Done
Building dependency tree
Reading state information… Done
The following additional packages will be installed:
nethack-console
The following NEW packages will be installed:
nethack-common nethack-console
0 upgraded, 2 newly installed, 0 to remove and 0 not upgraded.
Need to get 1,747 kB of archives.
After this operation, 5,031 kB of additional disk space will be used.
Do you want to continue? [Y/n]
Get:1 http://us.archive.ubuntu.com/ubuntu focal/universe amd64 nethack-common amd64 3.6.1-1 [613 kB]
Get:2 http://us.archive.ubuntu.com/ubuntu focal/universe amd64 nethack-console amd64 3.6.1-1 [1,134 kB]

Solution number 1 - bash-shell solution (default CLI):

wget doesn’t like spaces in the URL, so if you try for example,

wget http://us.archive.ubuntu.com/ubuntu focal/universe amd64 nethack-common amd64 3.6.1-1

It won’t work - neither will truncating the URL or putting it in quotes (" or ').

Just do

apt-cache search <partial name of dependency or full name of dependency>

Once you find the packages you need, then do

apt download <package name 1> <package name 2> …
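As a concrete sketch of that flow on the online machine (apt-offline is the package from the question; the second name is just a stand-in for whatever dependency turns up missing):

apt-cache search apt-offline
apt download apt-offline <missing-dependency>
ls *.deb   # run without sudo and the .deb files simply land in the directory you're in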

Using 'sudo' before 'apt' will still download the package, but it gives you a warning: 'download' doesn't put the package in apt's usual location, it drops it into whatever directory you issue the command from, and the '_apt' user which handles package downloads doesn't like it when files end up owned by root in directories where they don't belong.
Once downloaded, you can transfer the packages to your offline machine via USB and install them with dpkg using sudo privileges:

sudo dpkg -i <packagename1> <packagename 2> …

This resolves the dependency issues for apt-offline on your offline machine, so you can finally install apt-offline itself:

sudo dpkg -i apt-offline_1.8.2-2_all.deb

Solution number 2 - GUI solution:

Install synaptic if you don’t already have it on your online machine:

sudo apt install synaptic

Run synaptic using ‘sudo’, otherwise the ‘apply’ button won’t work:

sudo synaptic

Use the search button to find the dependencies you need.

Once you find the package(s), click the empty box to the left of the package name, select "Mark for installation", then click the "Apply" button.

Clicking the Apply button opens a pop-up window called "Summary". At the bottom of the window, select "Download package files only" by checking the box next to it, then click Next.
You'll find the downloaded (but not installed) files in /var/cache/apt/archives.
Copy the files to your USB drive, then transfer them to the offline machine and repeat as in the first solution:

sudo dpkg -i <packagename1> <packagename 2> …

This resolves the dependency issues for apt-offline on your offline machine, so you can finally install apt-offline itself:

sudo dpkg -i apt-offline_1.8.2-2_all.deb
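For the copy-to-USB step in either solution, something along these lines works (the mount point under /media is an assumption - substitute wherever your USB drive actually mounts):

cp /var/cache/apt/archives/*.deb /media/$USER/USBDRIVE/    # solution 2 (synaptic cache)
cp ./*.deb /media/$USER/USBDRIVE/                          # solution 1 (apt download directory)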

I figure a workaround for this issue is using wget, but wget defaults to http… are there any issues with using http to retrieve files from repos, and if so should I recompile wget to use only https?

I found this:

APT verifies the signature of packages. So you do not need to have a form of transportation that provides data authentication.

(source)

Using wget does not verify package signatures so you’d want to make sure to confirm the checksums on anything downloaded.
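A minimal sketch of that, using the nethack-common package from the output above as the example (the exact pool path is an assumption - 'apt-cache show' tells you the real Filename and SHA256):

wget http://us.archive.ubuntu.com/ubuntu/pool/universe/n/nethack/nethack-common_3.6.1-1_amd64.deb
apt-cache show nethack-common | grep -E '^(Filename|SHA256):'
sha256sum nethack-common_3.6.1-1_amd64.deb   # should match the SHA256 line above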

Do repos even support that or is it a waste of time?

Many (most?) do, but it depends on where the repo is being hosted, by whom, etc. Either way, you'd still want to verify checksums.
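One quick way to check a particular mirror is to ask for its InRelease file over HTTPS and refuse any fallback to plain HTTP (whether this succeeds depends on the mirror; the US archive and focal are just taken from the output earlier in the thread):

wget --https-only -S -O /dev/null https://us.archive.ubuntu.com/ubuntu/dists/focal/InRelease
# exits non-zero if the mirror won't answer over HTTPS; -S prints the server response headers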

For getting missing packages on an offline machine:

I was able to recursively download a package (vim, for example) and all of its dependencies into the current directory with:

apt-get download $(apt-cache depends --recurse --no-recommends --no-suggests --no-conflicts --no-breaks --no-replaces --no-enhances vim | grep "^\w" | sort -u)

If there are multiple things you need, no problem, just do:

apt-get download $(apt-cache depends --recurse --no-recommends --no-suggests --no-conflicts --no-breaks --no-replaces --no-enhances vim tmux jq | grep "^\w" | sort -u)

Then, once you get them moved over to your destination machine, you can install everything sitting in your current directory with:

sudo dpkg -i *

or, for a specific package:

sudo dpkg -i <package-name>

Thanks again!! I found a way around that issue with the clamav files, for example, mostly. I have a computer running clamav-daemon, so I just use wget to download all the files (main.cvd, bytecode.cvd, and daily.cvd), then use the 'diff' command to make sure the files are identical. There's one little problem with that, though: the computer running clamav-daemon uses daily.cld, which is an uncompressed version of daily.cvd, so I can't use diff directly (or haven't figured out whether I can) to compare those two files. I guess I could figure out how to compress the .cld file back to .cvd and then compare them, but I've spent so much time setting up apt-offline that I'm sorta tired of spending more time on it. Truth is, though, I shouldn't leave any loose ends untied.
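One way around the .cld-vs-.cvd mismatch, rather than diffing the raw files, might be to compare the database versions ClamAV itself reports - a sketch, assuming sigtool (shipped with the clamav tools) is installed and that the paths below match where your copies actually live:

sigtool --info /var/lib/clamav/daily.cld | grep -i version
sigtool --info ~/Downloads/daily.cvd | grep -i version
# if the version numbers match, both files carry the same signature set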

I was able to use wget with https for the Ubuntu repositories, but just because the URL says 'https' doesn't mean I know wget is actually using it… my impression from the man pages is that current versions of wget can use https without having to recompile, but I need to look more into it.
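One way to check without recompiling anything is to look at the feature list wget prints in its version banner - a stock Ubuntu build is normally linked against a TLS library already (the exact tokens vary by build):

wget --version | head -n 4
# look for +https and a +ssl/... entry in the feature lines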
As for the offline computer with the faulty BIOS, that's exactly what I did - I just used apt download, moved those files over to the offline computer, and then installed them exactly like that: sudo dpkg -i * from the working directory containing the files I wanted to install. Sometimes there were dependency issues, or bash commands that fell under more general package names, so to find those more general packages I used apt-cache search. For other packages and commands, 'apt depends' and 'apt-rdepends' also helped, so I didn't have to go back and forth between the offline and online computer numerous times trying to find the packages I needed and all their dependencies.
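For reference, the dependency-mapping commands mentioned above look like this (apt-offline is just the package I was chasing; apt-rdepends comes from a package of the same name):

apt depends apt-offline     # direct dependencies only
apt-rdepends apt-offline    # full recursive dependency list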

Anyway, I took the easy way out - just copying the files from my other machine with clamav-daemon to the offline machine, hoping that clamav, not running as a daemon on the offline machine, doesn't care about the .cld extension when scanning files manually.

Also, I think clamav-daemon checks the integrity of the three files it downloads and updates periodically, so I don't have to do checksums manually, and those files are current.

I also have to figure out a way to get the rkhunter and chkrootkit database updates over to the offline machine. I don't think package upgrades would do the trick, because the package sizes might be too large for that to be a sensible default action when installing these kinds of security-related packages, which tend to carry large malware-detection databases.
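A hedged sketch of one way that might work for rkhunter - refresh its data files on the online machine and carry the data directory across (the /var/lib/rkhunter/db path is an assumption; check the DBDIR setting in /etc/rkhunter.conf on your install):

sudo rkhunter --update                                    # refresh the data files online
sudo cp -r /var/lib/rkhunter/db /media/$USER/USBDRIVE/rkhunter-db
# then drop that directory into the same DBDIR location on the offline machine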
Also, another issue is rkhunter setting off false positives, thinking scripts created by the Debian package manager are malicious and giving me a 'warning' when running it. I took care of lwp-request, which is a non-malicious script created by the Debian package manager, but I'm getting about six other warnings, and I don't know how many of them are false positives, so it will take me a while to sort that out. Fortunately that's on my online computer. My offline computer is entirely bug-free, at least to the extent that the malware-detection software I've installed is able to detect.
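For what it's worth, the usual way to quiet that particular class of warning is rkhunter's script whitelist in /etc/rkhunter.conf - a sketch using lwp-request as the example from above, adding one entry per script you've verified is the legitimate Debian-packaged version:

echo 'SCRIPTWHITELIST=/usr/bin/lwp-request' | sudo tee -a /etc/rkhunter.conf
# or just edit /etc/rkhunter.conf directly; rkhunter accepts multiple SCRIPTWHITELIST lines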