Linux - Newbie: This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-tos, this is the place!
11-03-2004, 08:52 AM, #1
LQ Newbie
Registered: Oct 2004
Location: Melbourne, VIC, Australia
Distribution: Ubuntu, Debian
Posts: 20
wget command help
Hi, I've got a real newbie question here.
When using wget, what are the wildcards in the command? The help doesn't seem to say anything about wildcards.
For example, if I want to get pages foo1.html to foo19.html from http://www.randomsite.com/~jim/foo???.html, what wildcards do I put in place of the ??? part?
edit: removed extra question /edit
Thanks,
--munkeevegetable
Last edited by munkeevegetable; 11-03-2004 at 09:01 AM.
11-03-2004, 09:00 AM, #2
Moderator
Registered: Jun 2001
Location: UK
Distribution: Gentoo, RHEL, Fedora, CentOS
Posts: 43,417
wget can't do that... look at curl though, that'll do exactly what you're after.
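For what it's worth, a minimal sketch of the curl approach, reusing the URL from the question: curl's URL globbing expands a bracketed numeric range into one request per value, and -O saves each page under its remote file name.

# Fetch foo1.html through foo19.html; [1-19] is curl's numeric range glob.
# Quote the URL so the shell doesn't try to interpret the brackets itself.
curl -O "http://www.randomsite.com/~jim/foo[1-19].html"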
11-03-2004, 09:02 AM, #3
LQ Newbie
Registered: Oct 2004
Location: Melbourne, VIC, Australia
Distribution: Ubuntu, Debian
Posts: 20
Original Poster
Heh, I know it can do it, because my brother always uses wget to save pages that way... I just wish I could remember the wildcards he used.
11-03-2004, 09:06 AM, #4
LQ Newbie
Registered: Oct 2004
Location: Melbourne, VIC, Australia
Distribution: Ubuntu, Debian
Posts: 20
Original Poster
Quote:
Originally posted by munkeevegetable
Heh, I know it can do it, because my brother always uses wget to save pages that way... I just wish I could remember the wildcards he used.
Just remembered: it was maybe images he saved, not websites... not sure. It works for one website, and recursively for images.
Hmm...
11-03-2004, 09:08 AM, #5
Moderator
Registered: Jun 2001
Location: UK
Distribution: Gentoo, RHEL, Fedora, CentOS
Posts: 43,417
wget can do recursive gets using the -m option, but I really don't think it can actually handle patterns in the URL itself.
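That said, a possible middle ground, sketched here with the question's URL: wget's recursive mode can be combined with an accept list, so a shell-style pattern is applied to the filenames even though the URL can't contain wildcards.

# Mirror the directory but keep only files whose names match the pattern;
# -np (--no-parent) stays below ~jim/, -A is wget's accept list.
wget -m -np -A 'foo*.html' http://www.randomsite.com/~jim/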
11-03-2004, 09:08 AM, #6
LQ Newbie
Registered: Oct 2004
Location: Melbourne, VIC, Australia
Distribution: Ubuntu, Debian
Posts: 20
Original Poster
Okee doke, thanks for your help.
I'll have a go... if it doesn't work, I'll just go through them one by one, or have a look at curl.
--m.v
11-03-2004, 10:23 AM, #7
LQ Newbie
Registered: Nov 2004
Posts: 1
In fact....
you can use a shell script to do that. For example:

#!/bin/bash
# Fetch foo1.html through foo19.html one page at a time;
# -c resumes partial downloads, -t 0 retries without limit.
for (( i = 1; i <= 19; i++ ))
do
    wget -c -t 0 "http://www.randomsite.com/~jim/foo$i.html"
done
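A shorter equivalent, assuming a bash recent enough (3.0 or later) that brace expansion handles numeric ranges:

# The shell expands foo{1..19}.html into nineteen separate URLs before wget runs.
wget -c -t 0 http://www.randomsite.com/~jim/foo{1..19}.html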
11-03-2004, 02:15 PM, #8
Member
Registered: Aug 2004
Location: internet
Distribution: Slackware
Posts: 135
11-03-2004, 07:37 PM, #9
Member
Registered: Dec 2002
Distribution: Slackware
Posts: 927
But I believe that will pull everything 2 levels down from that URL, which might be much more than the specific files munkee was looking to grab... (please correct me if I'm wrong)