Old 03-07-2005, 08:52 PM   #1
LQ Newbie
Registered: Mar 2005
Posts: 11

Rep: Reputation: 0
wget - retrieving one folder of website

I'm currently trying hard to do the following:
I need to copy the contents of a website's folder, e.g. http://website/folder1
The problem I'm getting is that wget tries to get all of the files on the server http://website, and also, if there is any href to an external site like amazon, then wget tries to retrieve amazon as well. What params should I run wget with to stop this behaviour? I'd just like to do something like the imaginary "cp -fr http://website/folder1 ." and just that: no parents retrieved, no external links.
So far, all I've found is
wget -t0 -m http://website/folder
but it doesn't work for my purposes.
Thanks a lot
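
A minimal sketch of what is being asked for, assuming http://website/folder1 stands in for the real location: wget's recursive mode stays on the starting host unless -H/--span-hosts is given, and --no-parent keeps it from climbing above the starting directory.

# -r recursive, -np don't ascend to the parent directory, -nH no host-name dir,
# --cut-dirs=1 drops the leading folder1/ path component (URL is a placeholder)
wget -r -np -nH --cut-dirs=1 http://website/folder1/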
Old 03-07-2005, 09:03 PM   #2
Senior Member
Registered: Dec 2003
Location: Paris
Distribution: Slackware forever.
Posts: 2,255

Rep: Reputation: 87
wget http://address/folder/*.* should work well
Old 03-07-2005, 09:06 PM   #3
LQ Newbie
Registered: Mar 2005
Posts: 11

Original Poster
Rep: Reputation: 0
Wildcards not allowed in http...
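
For context: wget only expands wildcards in FTP URLs; over HTTP the nearest equivalent is a recursive fetch filtered with -A/--accept. A rough sketch, with the host and paths below standing in for real ones:

# FTP URLs may contain shell-style globs (quote them from the shell)
wget 'ftp://ftp.example.com/pub/folder/*.iso'
# HTTP has no globbing; filter a recursive, no-parent fetch instead
wget -r -np -A '*.iso' http://website/folder1/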
Old 03-07-2005, 09:17 PM   #4
Senior Member
Registered: Oct 2003
Posts: 3,057

Rep: Reputation: 58
Quote: "no parents retrieved, no external links"
I never ran into the external-links problem, but I'm using ftp sites, which may be why. Anyway, I use --cut-dirs to avoid downloading the full path and -X (exclude) to skip some directories.
For example, if you want to exclude one or more directories from the download, you can use the -X option:
wget -cmq -nH --passive \
  -X /pub/linux/suse/suse/i386/8.2/suse/src \

Note the --cut-dirs option: it's used with -nH to avoid recreating the ftp site's directory hierarchy on disk.

wget -cm -nH --passive --cut-dirs=7 \
  -X /sites/ \

wget -cm -nH --passive --cut-dirs=9 \
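
The target URLs were trimmed off the examples above, so here is a self-contained sketch of the same idea; the host and paths (ftp.example.com, /pub/linux/suse/i386/9.0/suse) are made-up stand-ins:

# mirror one remote directory: -nH drops the host-name directory and
# --cut-dirs=6 drops the six leading path components, so files land in
# the current directory; -X skips the remote src/ subdirectory
wget -cm -nH --passive --cut-dirs=6 \
  -X /pub/linux/suse/i386/9.0/suse/src \
  ftp://ftp.example.com/pub/linux/suse/i386/9.0/suse/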
Old 03-07-2005, 09:20 PM   #5
LQ Newbie
Registered: Mar 2005
Posts: 11

Original Poster
Rep: Reputation: 0
thx a lot


