Originally Posted by rob.rice
what would the wget command be to download the multilib files needed for a full multilib install,
without downloading the whole website?
The command below worked for me. It downloads recursively, accepting only files with the txz, txt, and asc extensions plus the GPG and README files, into a "multilib" directory under the current directory. The -nH and --cut-dirs options strip the host name and the remote path (mirrors/people/...); without them you would end up with a long local directory tree mirroring the server layout. Tweak it to suit your needs.
wget -r -np -nH -Pmultilib --cut-dirs=5 -A txz,txt,asc,"GPG*","READM*" http://taper.alienbase.nl/mirrors/people/alien/multilib/14.0/
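If you adapt this to another mirror, the --cut-dirs value is simply the number of directory components in the URL path. A quick sketch of how to derive it (assumes the URL ends with a trailing slash, as above, so the slash count equals the component count):

```shell
url=http://taper.alienbase.nl/mirrors/people/alien/multilib/14.0/

# Strip the scheme and host, leaving mirrors/people/alien/multilib/14.0/
path=${url#*//*/}

# Count the slashes; with the trailing slash this equals the
# number of path components, i.e. the value for --cut-dirs
depth=$(printf '%s' "$path" | tr -cd / | wc -c)
echo "$depth"
```

Here that prints 5, matching the --cut-dirs=5 in the command above.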