Full Version: Apt-get
Linuxhelp > Support > Technical Support
If you are using apt-get on more than one Linux box with the same distribution [for example Red Hat 9], this will help
you, so you don't have to download that 80-100 MB of updates every time.

NOTE : this applies only to apt-get

INFO : apt-get stores its downloaded files in /var/cache/apt/archives [and won't delete them after you update your Linux box]

So all you have to do is :
1.install apt-get [skip if you already have it installed]
2.copy the contents of /var/cache/apt/archives from the updated computer to THE SAME folder, /var/cache/apt/archives, on the computer you want to update [don't copy the lock file from the folder]
3.apt-get update
4.apt-get dist-upgrade
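Step 2 might look like this in practice; the sketch below uses throwaway directories to stand in for the two machines' /var/cache/apt/archives, and the package file name is made up:

```shell
# Placeholder directories standing in for the two machines' caches
SRC=$(mktemp -d)   # cache on the already-updated box
DST=$(mktemp -d)   # cache on the box you want to update
touch "$SRC/foo-1.0-1.i386.rpm" "$SRC/lock"

# Copy everything except the lock file
for f in "$SRC"/*; do
    [ "$(basename "$f")" = lock ] || cp "$f" "$DST/"
done

ls "$DST"   # the package is there, the lock file is not
```

After the copy you would run apt-get update and apt-get dist-upgrade on the target box as in steps 3 and 4.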

NOTE : apt-get will check its archives folder to see whether the files are there or not, and will ONLY download files that are newer, or missing from the archives.

Robert B
IF you don't have apt set to auto clean ....

After the last machine is updated, clean 'em out with apt-get clean!

And, if you really want to get fancy, you can either make an NFS directory on one server and update each one from it without copying the files ... or set up your own apt server and rsync from the other servers to keep it up to date (usually only worth maintaining your own server if you have more than 20-30 machines).
I've got 65 computers; currently only 10 run Linux.

Is there a howto somewhere on how to do that apt server thingie?

Robert B
Or is there a way, or some config file edit, that would make apt-get first check one of my file servers for the update files, and only after that check the net?

Robert B
Probably the easiest thing to do is create one place that becomes your new apt cache archives directory ... then mount that shared directory on all your like computers ... they can all use it as their apt cache archives.

You can use either samba or NFS to share directories between computers ... and since this is a Linux share, I recommend using NFS.

Pick a computer with a large hard drive (it will be holding all the apt cached packages) and set up an NFS export (of /var/cache/apt/archives) on that computer (this PC will be known as the NFS server for the rest of my post) ... we have several guides to help with the setup of NFS:

This one ... it talks about setting up your kernel ... not required on a stock RedHat kernel install ... other good info there about /etc/exports though.

Here is an NFS setup guide on our TLDP mirror.

Here is a thread talking about trying to use a firewall with NFS (a lot of other good info in that thread about NFS) .... there is a link in that thread to a site that discusses how to tie down NFS services to specific ports (a good idea in case you want to stand up iptables on your Linux machines).

So, you want to set up an NFS server that exports the /var/cache/apt/archives directory first, then connect to it from your other machines.

Actually, you can either export /var/cache/apt/archives on your NFS server ... or you can change where that directory lives with this section of the apt.conf file:
// Location of the cache dir
Cache "var/cache/apt/" {
   archives "archives/";
   srcpkgcache "srcpkgcache.bin";
   pkgcache "pkgcache.bin";
};

So once NFS is running on the NFS server and set to start up on reboot, you would export your /var/cache/apt/archives directory with an /etc/exports entry similar to this:

(if the IPs of your network are 192.168.0.x)
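For that network, the /etc/exports entry might look like this (the export options here are a guess at sensible defaults, not from the original post):

```
# /etc/exports on the NFS server
/var/cache/apt/archives  192.168.0.0/255.255.255.0(rw,sync,no_root_squash)
```

rw lets the clients write downloaded packages into the shared cache; no_root_squash keeps root on the clients from being remapped to nobody when apt (running as root) writes files.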

Your client(s) would just empty the current /var/cache/apt/archives directory (including everything under archives ... so archives exists and is empty) and issue the command:

mount nfs_server_name:/var/cache/apt/archives /var/cache/apt/archives

Once everything is working by hand, you can put the mount inside your /etc/fstab (on the clients) so that they mount /var/cache/apt/archives every time the computer starts ... see chapter 4 of the above NFS setup guide, bottom of the page. The NFS server would need to be on at all times that people might need to use apt.
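The corresponding /etc/fstab line on each client might be (the server name is the same placeholder as in the mount command above, and the options are an assumption, not from the post):

```
# /etc/fstab on each client
nfs_server_name:/var/cache/apt/archives  /var/cache/apt/archives  nfs  defaults  0 0
```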

You will probably need to make the /var/cache/apt/archives directory on the NFS server writable by everyone with the command:

chmod 777 /var/cache/apt/archives
Thank you very much!

I need your advice on this...

Is it a good idea to automate the update? I mean, is doing it this way good or bad? What is your opinion?

I mean, I create an NFS server where I install apt-get and have it check for updates every day at 1:00, and I make my clients check for updates every day at 3:00. Would this be good?
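Sketched as root crontab entries, that schedule might look like this (the exact apt-get flags are assumptions: -d downloads packages without installing them, -y answers yes to prompts):

```
# root crontab on the NFS server: fetch updates into the shared cache at 1:00
0 1 * * * apt-get update && apt-get -d -y dist-upgrade

# root crontab on each client: install from the shared cache at 3:00
0 3 * * * apt-get update && apt-get -y dist-upgrade
```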

I would like to hear what you think about this, is this a good Idea, or a really bad one?

Robert B
I personally like to run the updates on one machine ... test it for a couple days to make sure it's not broken, then update all the other machines later.

For example, the latest Fedora Core kernel causes one of my Dual Processor SMP machines to freeze ... so I had to remove the kernel. (the bug is in another post and is related to a firewire driver).

I am not one who thinks auto updates are good (I just like to watch the updates install and make sure the machine is still OK) ... but an NFS server so you only need to download once is very good, as is being able to install the updates via SSH.

You can use command line switches (apt-get -y, for example) to automate the updates, but I wouldn't do it on my machines.