Use wget to download multiple files with wildcards
You want to download all the GIFs from a directory on an HTTP
server. You tried wget http://www.server.com/dir/*.gif, but that
didn't work because HTTP retrieval does not support globbing. In
that case, use:

        wget -r -l1 --no-parent -A.gif http://www.server.com/dir/
More verbose, but the effect is the same. -r -l1 means to retrieve
recursively, with maximum depth of 1. --no-parent means that
references to the parent directory are ignored, and -A.gif means to
download only the GIF files. -A "*.gif" would have worked too.
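The flags described in the excerpt combine into a single invocation; a sketch, using the example host from the excerpt and quoting the accept pattern so the shell does not expand it before wget sees it:

```shell
# -r -l1: recurse, but only one level deep
# --no-parent: never follow links up to the parent directory
# -A "*.gif": accept only GIF files (quoted so the shell passes the
#             pattern to wget literally instead of globbing it)
wget -r -l1 --no-parent -A "*.gif" http://www.server.com/dir/
```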
man pages are not exactly light reading, but they usually have the answer.
Brother, I have already tried what you mentioned. It works for certain sites; indeed, I use this for downloading entire sites.
Check my first post.
Here there is some permission problem: "403 Forbidden".
It doesn't make any difference here.
So can I expect a constructive reply?
OK, here is the output of the command:
Resolving www.mikeswanson.com... 18.104.22.168
Connecting to www.mikeswanson.com|22.214.171.124|:80... connected.
HTTP request sent, awaiting response... 403 Forbidden
22:24:41 ERROR 403: Forbidden.
Removing www.mikeswanson.com/wallpaper/images/index.html since it should be rejected.
unlink: No such file or directory
Downloaded: 0 bytes in 0 files
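One way to confirm that the 403 comes from the server itself rather than from anything wget is doing (a sketch, assuming the same URL as in the log above) is to request only the response headers:

```shell
# -I asks for headers only; a "403 Forbidden" status line here shows
# the server refuses directory access regardless of the client used.
curl -I http://www.mikeswanson.com/wallpaper/images/
```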
Last edited by Anant Khaitan; 11-30-2007 at 11:21 AM.
Oops. I guess we both screwed up a little then.
Damn font makers need to make a good default font in which you can tell the difference between |, l, and 1. -_-
But, again, note the 403 Forbidden error message you got. With the way mikeswanson.com has its folders set up, you can't view or download from a directory unless you know the exact filename you want to download. (Sometimes not even then.)
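Since the server only serves files whose exact names you already know, one workaround is to feed wget an explicit list of URLs. A sketch; the filename shown is a hypothetical example, not an actual file on mikeswanson.com:

```shell
# urls.txt contains one full URL per line, e.g.:
#   http://www.mikeswanson.com/wallpaper/images/some-wallpaper.jpg
# (hypothetical name - replace with filenames you actually know)
wget -i urls.txt
```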