Linux - Newbie: This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-to's, this is the place!
Old 06-26-2014, 04:54 AM   #1
LQ Newbie
Registered: May 2014
Posts: 11

Rep: Reputation: Disabled
How to download multiple files

Hey all,

I feel this is a really silly question, but is there a way to download a whole directory from these types of sites?

For example, I want to download the Wine folder, but when I click on it, it opens another directory, which leads into yet another directory, and then I have to download each file individually; for the text files I even have to create a new text file, copy and paste the contents, and save it.

Is it possible to download the entire directory in one go? I feel like I am doing something wrong.

Old 06-26-2014, 05:30 AM   #2
Senior Member
Registered: Aug 2011
Location: Dublin
Distribution: Centos 5 / 6 / 7
Posts: 2,939

Rep: Reputation: 1211
Do a Google search for "wget -r" and you'll find plenty of examples.
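To save that search, here is a minimal annotated sketch of the wget -r approach. The URL is a hypothetical placeholder, and the extra flags (-np, -nH, -R) are common companions to -r for mirroring a single directory listing, not anything quoted from this thread:

```shell
# A minimal sketch of recursive download with wget; the URL below is a
# hypothetical placeholder for the directory you want to mirror.
#
#   -r                recurse into linked subdirectories
#   -np               (--no-parent) never climb above the starting directory
#   -nH               don't create an extra hostname directory locally
#   -R "index.html*"  skip the server-generated directory-listing pages
#
# The command is echoed rather than run so the sketch is safe to paste
# as-is; substitute the real URL and drop the echo to actually download.
set -- wget -r -np -nH -R "index.html*" "http://example.com/pub/Wine/"
echo "$@"
```

Without -np, -r will happily follow links up and out of the directory you started in and can end up pulling a large part of the site.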
Old 06-26-2014, 05:33 AM   #3
Registered: Mar 2013
Posts: 633

Rep: Reputation: 141
Install httrack and run the command:
httrack "http://" -O "/tmp/mysite" "+**" -v
It will download all the available files into /tmp, in a directory named mysite.
Old 06-26-2014, 05:33 AM   #4
LQ Newbie
Registered: May 2014
Posts: 11

Original Poster
Rep: Reputation: Disabled
Thanks a lot TenTenths and eklavya. I really appreciate the help.
Old 06-26-2014, 05:36 AM   #5
Registered: Aug 2003
Location: Wauconda, Illinois, USA
Distribution: Slackware, OpenSuse, Arch Linux on Pi
Posts: 96

Rep: Reputation: 42
There are two tools I know of: wget and curl.

This is from the man pages, and I think it is what you need.

Recursive Retrieval Options

-r
--recursive
    Turn on recursive retrieving. The default maximum depth is 5.

-l depth
--level=depth
    Specify recursion maximum depth level depth.

--delete-after
    This option tells Wget to delete every single file it downloads, after having done so. It is useful for pre-fetching popular pages through a proxy, e.g.:

            wget -r -nd --delete-after

    The -r option is to retrieve recursively, and -nd to not create directories.

    Note that --delete-after deletes files on the local machine. It does not issue the DELE command to remote FTP sites, for instance. Also note that when --delete-after is specified, --convert-links is ignored, so .orig files are simply not created in the first place.
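For the original question, where the point is to keep the files locally, --delete-after is exactly the option to leave out. A depth-limited sketch, again with a hypothetical placeholder URL:

```shell
# Depth-limited recursive retrieval that keeps the downloaded files
# (no --delete-after). The URL is a hypothetical placeholder.
#
#   -r      turn on recursive retrieving
#   -l 3    follow links at most 3 levels deep (the default is 5)
#   -np     don't ascend to the parent directory
#
# Echoed rather than executed so the sketch is safe to paste as-is;
# substitute the real URL and drop the echo to actually download.
set -- wget -r -l 3 -np "http://example.com/pub/Wine/"
echo "$@"
```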

