LinuxQuestions.org


rob.rice 10-18-2012 06:30 PM

a question about multilib
 
What would the wget command be to download the multilib files needed for a full install of multilib, without downloading the whole website?

My head starts to swim reading wget's man page long before I get it worked out.

Don't get me wrong, I'm grateful to have the man page; I just wish the most commonly used options came first.

T3slider 10-18-2012 09:40 PM

I don't know how to do it with wget off-hand, but if you're willing to use lftp, the following should work:
Code:

$ lftp -c "open http://taper.alienbase.nl/mirrors/people/alien/multilib/14.0/ ; mget *"
If you want the compat32 packages as well, use this instead:
Code:

$ lftp -c "open http://taper.alienbase.nl/mirrors/people/alien/multilib/14.0/ ; mirror -i ."
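
For what it's worth, mget only grabs the files in the top-level directory, while mirror recurses into the compat32 subdirectory as well. If you'd rather have the whole tree dropped into a named local directory, something like this should also work (the local directory name multilib-14.0 is just an example, untested here):
Code:

$ lftp -c "open http://taper.alienbase.nl/mirrors/people/alien/multilib/14.0/ ; mirror . multilib-14.0"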

kingbeowulf 10-19-2012 01:05 AM

I use wget for a number of tasks, but it was just easier to use lftp based on information from Alien Bob's wiki site:

Code:

#!/bin/sh
# Grab multilib packages from Alien Bob site

VERSION=${VERSION:-14.0}
TOPDIR="/data/slackware"

echo "Syncing version multilib packages for '$VERSION' ..."

if [ ! -d ${TOPDIR}/multilib ]; then
  echo "Target directory ${TOPDIR}/multilib does not exist!"
  exit 1
fi

cd ${TOPDIR}/multilib || exit 1
lftp -c "open http://connie.slackware.com/~alien/multilib/ ; mirror -e -n ${VERSION}"

echo ...done.

or "taper.alienbase.nl/mirrors/..." as appropriate. I think "mirror -e -n" is better than "mirror -i" for this case.

dr.s 10-20-2012 11:29 AM

Quote:

Originally Posted by rob.rice (Post 4809504)
What would the wget command be to download the multilib files needed for a full install of multilib, without downloading the whole website?

The syntax below worked for me. It downloads recursively, accepting txz, txt, and asc extensions plus the GPG and README files, into a parent directory ("multilib") under the local directory. The -nH and --cut-dirs options strip the host name and path (mirrors/people/...); otherwise you'd end up with a long local directory structure. You can tweak it to suit your needs.
Code:

wget -r -np -nH -Pmultilib --cut-dirs=5 -A txz,txt,asc,"GPG*","READM*" http://taper.alienbase.nl/mirrors/people/alien/multilib/14.0/
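
If you re-run it later to pick up rebuilt packages, adding wget's -N (timestamping) option should make it skip files that haven't changed; same command otherwise (untested on my end):
Code:

wget -r -np -nH -N -Pmultilib --cut-dirs=5 -A txz,txt,asc,"GPG*","READM*" http://taper.alienbase.nl/mirrors/people/alien/multilib/14.0/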

