LinuxQuestions.org Ubuntu forum. This forum is for the discussion of Ubuntu Linux.
First off, I don't know if this post belongs here or somewhere else, but I am trying this on an Ubuntu 18.04 LTS machine and having difficulties.
I have an Ubuntu machine that cannot reach the internet (due to security restrictions). I downloaded the entire repository so I could bring it into the area with the "unattached" machine and update it.
I believe I have a good copy of the current repositories. When I take them to the other machine (on a USB-attached hard drive), I edit the sources.list file to point to: deb file:///my/file/location/ubuntu/us.archive.ubuntu.com/ubuntu/ bionic main restricted universe multiverse
I am also bringing in the backports, updates, and security suites. On the isolated machine I run (as root) apt update and get a lot of ignores, and apt list --upgradable returns nothing.
I know there are a LOT of updates this machine needs. Can someone help me figure out what the heck I'm doing wrong?
Unfortunately we cannot use Apache (or any web server, for that matter) and have to use file: URIs in the sources.list file. We've done it before with a different system and it worked fine then... I can't figure out what I'm doing wrong.
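For reference, a file-based sources.list for an apt-mirror copy usually looks something like the following. The paths here are examples built from the one in your post; note that apt-mirror, by default, puts the actual archive under a mirror/&lt;hostname&gt; subdirectory of its base path, which is easy to miss:

```
# /etc/apt/sources.list on the offline machine (example paths)
deb file:///my/file/location/ubuntu/mirror/us.archive.ubuntu.com/ubuntu/ bionic main restricted universe multiverse
deb file:///my/file/location/ubuntu/mirror/us.archive.ubuntu.com/ubuntu/ bionic-updates main restricted universe multiverse
deb file:///my/file/location/ubuntu/mirror/us.archive.ubuntu.com/ubuntu/ bionic-backports main restricted universe multiverse
deb file:///my/file/location/ubuntu/mirror/us.archive.ubuntu.com/ubuntu/ bionic-security main restricted universe multiverse
```

A quick sanity check is to confirm that a dists/bionic/Release file exists under whatever path the deb lines point at.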
Your best bet is to download apt-offline or apt-offline-gui from the Ubuntu repositories on a computer with internet access. You'll have to make sure its dependencies are installed, or download and install them as well.
If you do not have access to an online computer running some variant of Debian/Ubuntu, you might want to make an Ubuntu live USB key or DVD to use apt-offline on a computer with internet access.
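For completeness, the usual apt-offline round trip looks roughly like this; the signature and bundle file names are arbitrary, and the commands need root on the offline side:

```shell
# On the offline machine: record what apt would need to update/upgrade
apt-offline set offline.sig --update --upgrade

# On the online machine: download the metadata and packages into a bundle
apt-offline get offline.sig --bundle bundle.zip

# Back on the offline machine: feed the bundle to apt, then upgrade
apt-offline install bundle.zip
apt upgrade
```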
Using apt-offline was our preference; however, due to the nature of the actual offline computer, and the necessity to install apps at any given time, a decision was made to do things this way. Also, due to the nature of the computer and its functions, we are not able (by policy) to bring anything off of that computer to the "online" one, so that kinda puts the kibosh on using apt-offline.
Thank you,
Brian
Thing is:
You need to update the repo metadata, since packages change due to upgrades, additions, deprecations, security patches, etc. The files in /var/lib/apt/lists/ have to match the packages available in the repo. If I remember correctly, repositories are updated once a day, which means you need to do an "apt update" and capture all packages in the repo before the repo gets its daily update.
So there are two ways to do the "apt update" that refreshes the package lists: either through apt-offline, or by running the same Ubuntu live session on a computer with internet access, running apt update, and grabbing a copy of everything in /var/lib/apt/lists/ except the two folders and the lock file. Then delete everything except those two folders and the lock file in /var/lib/apt/lists/ on the offline computer, and replace it with what you got from the live session.
You do not need to set up a repo with the downloaded packages; you just need to symlink the packages to /var/cache/apt/archives. What I have done in the past is make a directory called 'archives' on my NTFS data partition, with the auxfiles and partial folders and the lock file within, and symlink it into /var/cache/apt/. I don't get permission issues keeping the cache on an NTFS-formatted partition.
EDIT: If you choose to keep the packages in the cache rather than create a local repo, make sure apt is configured not to clean the cache after a transaction.
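The list-swap described above can be sketched as a small POSIX sh helper. The directory arguments are parameters here so it can be tried outside /var/lib/apt; on the real machines you would run it as root with /var/lib/apt/lists as the destination:

```shell
# sync_lists SRC DST: replace the apt metadata files in DST with the ones
# from SRC, leaving DST's lock file and subdirectories (partial/, auxfiles/)
# untouched.
sync_lists() {
    src="$1"
    dst="$2"
    # Remove only top-level regular files, keeping 'lock' and any directories.
    find "$dst" -maxdepth 1 -type f ! -name lock -delete
    # Copy in the fresh list files (files only, no subdirectories).
    for f in "$src"/*; do
        if [ -f "$f" ]; then
            cp "$f" "$dst"/
        fi
    done
}
```

For example, sync_lists /mnt/usb/lists /var/lib/apt/lists after mounting the USB drive.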
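On 18.04's apt, the cache-keeping behaviour mentioned in the EDIT can be set with a drop-in file like the one below. The filename is an example; the option is the one the apt binary consults when deciding whether to delete downloaded .deb files after a successful transaction:

```
// /etc/apt/apt.conf.d/99keep-debs  (example filename)
Binary::apt::APT::Keep-Downloaded-Packages "true";
```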
OK... I'll have to try that. Question: to get the data from my online Ubuntu machine, I'm running apt-mirror, loading the result onto a USB drive, and taking it in. Does apt-mirror handle that or not?
Also... I forgot. Do you know the time of day the repository is updated?
Depends on which mirror; I usually set my sources to a local mirror, and they typically do it after midnight. As for apt-mirror updating the metadata on the offline machine... you'll have to check the creation dates of the list files on the offline machine, since I don't know; I've never set up a mirror. I would typically just upgrade my laptop, which is a mirror of the OS on my offline desktop, transfer all the lists and packages to the appropriate directories on the offline machine, and then not run apt update, just apt upgrade.
Thanks Brains... Unfortunately, I copied that directory from the "online" machine to the "offline" one, then did the apt update and got the same thing: lots of lines like
Get:1 blah, blah, blah
Ign:1 blah, blah, blah
Almost every line entry is read and then ignored. Does anyone know why this would happen?
I did set up a repo by editing the sources.list file on the offline computer. I did it that way in case I needed to install something from the repo. Do I still need to do something with the /var/cache/apt/archives directory?
So, what I did was run apt-mirror and gather the repo (about 330+ GB) onto a USB drive. I edited my sources.list file on the offline machine to point to the USB drive (when plugged in) and copied the /var/lib/apt/lists directory from the online machine (after an apt update) to that same USB drive. I then copied that directory from the USB drive into /var/lib/apt/lists on the offline machine and ran apt update there. That's where I keep getting the Get/Ign errors.
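For context, an apt-mirror run like this is driven by /etc/apt/mirror.list on the online machine. A minimal sketch, with example paths, might look like:

```
# /etc/apt/mirror.list (base_path defaults to /var/spool/apt-mirror)
set base_path /my/file/location/ubuntu
deb http://us.archive.ubuntu.com/ubuntu bionic main restricted universe multiverse
deb http://us.archive.ubuntu.com/ubuntu bionic-updates main restricted universe multiverse
deb http://us.archive.ubuntu.com/ubuntu bionic-backports main restricted universe multiverse
deb http://us.archive.ubuntu.com/ubuntu bionic-security main restricted universe multiverse
clean http://us.archive.ubuntu.com/ubuntu
```

One thing worth double-checking: apt-mirror writes the archive itself under base_path/mirror/us.archive.ubuntu.com/ubuntu, so the offline machine's deb file:// line has to include that mirror/... path component, or every index fetch will fail.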
Do I still need to do something with the /var/cache/apt/archives directory?
No, not if you are going to use your own repo.
Quote:
Originally Posted by Deadeye412
I edited my sources.list file on the offline machine to point to the USB drive (when plugged in) and copied the /var/lib/apt/lists directory from the online machine (after an apt update) to that same USB drive. I then copied that directory from the USB drive into /var/lib/apt/lists on the offline machine and ran apt update there. That's where I keep getting the Get/Ign errors.
Based on what I've read in this article, when using your own mirror the lists should be included, and you should not need to get the lists from an online computer after an update. I made that recommendation only for the case where you don't use apt-mirror, where you would do updates on the online computer and transfer files to the offline one to update it.
If you are going to use your own mirror, you should not need to transfer anything; once a proper link to your repo is created in /etc/apt/sources.list, things should be handled normally with the apt update and apt upgrade commands.
The Get/Ign errors are the result of entries in /etc/apt/sources.list with a broken link to the repo, either because an entry points to a web address and you have no internet, or because the link to your external USB repo is not properly configured. According to the article I read, your private repo still needs to be reached through a network: not necessarily hooked up to the internet, but through a network. That doesn't mean it's the only way, as it's the only article I read, and I have never used apt-mirror to create my own repo.
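One quick way to spot the broken-link case is to check that every file:// URI in sources.list actually points at a directory that exists. A small POSIX sh helper (a hypothetical script, not part of apt, and it only handles the file:// form of the URI) might look like:

```shell
# check_file_sources FILE: print OK or MISSING for every 'deb file://' entry,
# depending on whether the local path behind the URI exists.
check_file_sources() {
    grep -E '^deb +file:' "$1" | while read -r _deb uri _rest; do
        path="${uri#file://}"   # file:///my/repo -> /my/repo
        if [ -d "$path" ]; then
            echo "OK $uri"
        else
            echo "MISSING $uri"
        fi
    done
}
```

Run it as check_file_sources /etc/apt/sources.list on the offline machine; any MISSING line is a likely source of the Ign output.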
Sorry it took so long to get back to you (vacations and holidays got in the way). First off, thank you for all the help you've given me. You don't know how much I appreciate it.
I have a way of making the USB drive accessible over a network (instead of attaching it directly to the machine). I tried that and changed the sources.list file to point to it using http://. I still get the same errors: it appears to read a plethora of repository information and then ignore it all.
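If serving the mirror over HTTP is allowed on the isolated network, even a throwaway server is enough for apt to test against. A sketch, assuming the apt-mirror layout and with the port and paths as examples:

```shell
# On the machine hosting the drive, serve the mirrored hostname directory:
cd /my/file/location/ubuntu/mirror/us.archive.ubuntu.com
python3 -m http.server 8080
```

Then the offline machine's sources.list would use a line like deb http://&lt;server-address&gt;:8080/ubuntu bionic main restricted universe multiverse, where &lt;server-address&gt; is whatever host the server is reachable at. If apt still ignores everything, fetching http://&lt;server-address&gt;:8080/ubuntu/dists/bionic/Release in a browser or with wget will show quickly whether the path is right.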
Do you have any more tricks up your sleeve or something else I could try? Anything would be appreciated.