LinuxQuestions.org
Old 03-16-2007, 05:40 AM   #1
fc6_user
Member
 
Registered: Jan 2007
Location: Montpellier, France
Distribution: Fedora Core 6, Mandriva, Knoppix, Debian
Posts: 143

Rep: Reputation: 15
Incremental Downloading: wget http://www.mysite.org/pic[001-125].jpg


Let's say I wanted to download the following:

http://www.mysite.org/pic001.jpg
http://www.mysite.org/pic002.jpg
http://www.mysite.org/pic003.jpg
...
http://www.mysite.org/pic125.jpg

Is there a fast way to do this using wget, scripts or the command line; that is, without downloading special software?

I'd like to do the following, but, of course, it doesn't work:

wget http://www.mysite.org/pic[001-125].jpg

Many thanks!
 
Old 03-16-2007, 06:16 AM   #2
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
curl will allow you to do what you want:
Code:
curl 'http://www.mysite.org/pic[001-125].jpg' -o "pic#1.jpg"
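Since the original question asked about wget, the same range can also be generated in the shell and fed to wget. A sketch (seq -w zero-pads every number to the width of the largest; the URL is the example from the question, and echo stands in for the download so the loop runs offline):

```shell
# Generate 001..125 with seq -w (equal-width zero padding) and fetch each file.
# echo is used so this runs without a network; swap in the commented wget line.
for i in $(seq -w 1 125); do
  echo "http://www.mysite.org/pic$i.jpg"
  # wget "http://www.mysite.org/pic$i.jpg"
done
```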
 
Old 03-17-2007, 03:12 AM   #3
fc6_user
Original Poster
Thanks matthewg42! It works!! Brilliant!!!

I had a look at several sites for more information on the cURL command, but only found very long, comprehensive descriptions that don't necessarily tell me what I want to know. I thought you might have the answers, and that they could be of use to a great number of people out there, so I'll put my questions to you. If you can answer them, great! If not, well, nobody knows everything...

What does '-o' do?

How does the "pic#1.jpg" part work? The file name will begin with 'pic' and end with '.jpg', but I don't understand the middle: what does the '#1' do?

I noticed that the file is saved in the folder you are in when you do the command. Are there other ways of saving the file in a specific directory?

If you don't know how many jpg pictures there are, is there a way of telling cURL to stop when it encounters a file that doesn't exist? For instance, if there are 18 jpgs and '...19.jpg' doesn't exist.
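(For what it's worth, one way to get this behaviour: curl's -f option makes it exit with a non-zero status on an HTTP error such as 404, so a shell loop can stop at the first missing file. A sketch, using a local stand-in for the download so it runs offline; the commented curl line is the real call, with the example URL from above:)

```shell
# Fetch pic001.jpg, pic002.jpg, ... until the first missing file.
# fetch() is a local stand-in so the demo runs offline; the real call is commented.
dir=$(mktemp -d); touch "$dir"/pic001.jpg "$dir"/pic002.jpg "$dir"/pic003.jpg
fetch() { [ -e "$dir/$1" ]; }   # stand-in for: curl -f -s -O "http://www.mysite.org/$1"
i=1; got=0
while fetch "$(printf 'pic%03d.jpg' "$i")"; do
  got=$((got + 1))
  i=$((i + 1))
done
echo "downloaded $got files"
```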

Is there a similar command for DOS in Windows? Does cURL work in DOS?

Are there any useful sites for downloading web sites? For example, a Table of Contents and all of its associated links?

Another problem when saving chapters is that the numbering isn't necessarily consistent: you could have Chapter 1, Section 1.1, 1.2, 1.3, 1.3.1, 1.3.2, 1.3.2.1, 1.4, Chapter 2, 2.1, etc. It's great when all the links still work offline after saving the files, which can happen if you put everything in the same folder. Any advice here?

Any other useful features, options or functionality?

I must say, I've only started learning the command line a couple months ago, and I'm very impressed indeed!

Many thanks!!

Last edited by fc6_user; 03-17-2007 at 03:14 AM.
 
Old 03-17-2007, 03:37 AM   #4
jschiwal
LQ Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 682
The #1 is replaced with the number from the first range [001-125]. If you had a second range, it would be #2. For example:
Code:
curl 'http://www.mysite.org/collection[1-9]/pic[001-125].jpg' -o "collection#1_pic#2.jpg"
would save files named "collection1_pic001.jpg" through "collection9_pic125.jpg".
The -o option sets the output filename; otherwise curl writes to standard output. If you use -O, the file is saved under its remote name:
Code:
curl -O 'http://www.mysite.org/pic[001-125].jpg'

To clone a website, use wget.
The wget manpage has examples.
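A sketch of a typical invocation (flags from the wget manpage; the URL is hypothetical). The command is built into a string and echoed here so it can be shown without hitting the network:

```shell
# --mirror          recursive download with timestamping (shorthand for -r -N -l inf)
# --convert-links   rewrite links so the saved copy is browsable offline
# --page-requisites also fetch the images/CSS needed to render each page
# --no-parent       never ascend above the starting directory
cmd='wget --mirror --convert-links --page-requisites --no-parent http://www.mysite.org/book/'
echo "$cmd"    # run the command itself (instead of echoing it) to do the clone
```

--convert-links is also what makes the internal links of a saved site keep working when you're not connected to the Net.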
 
Old 03-20-2007, 05:03 AM   #5
fc6_user
Original Poster
Thanks jschiwal!

If anybody has answers to some of my other questions, I'd certainly appreciate it. What I'd like to be able to do is save an entire website, that is, all of the links in the table of contents of a web book, for example.

Many thanks.
 
Old 04-04-2007, 10:25 AM   #6
fc6_user
Original Poster
Command for converting filenames from upper- to lower-case

Another similar or somewhat related question...

I need to convert all of the file names in a directory to lowercase: DOGGY.PAS, KITTY.PAS, etc. should become doggy.pas, kitty.pas. Is there an easy way of doing this from the terminal? (They're a bunch of Pascal mini-programs/exercises whose file names are all in uppercase.) I couldn't find a solution on the web. I read something about IUCLC, but I don't think it applies to this sort of thing...
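(For what it's worth, one common shell idiom for this: loop over the files and pipe each name through tr. A sketch, demonstrated in a throwaway directory with the sample names from the question so nothing real is touched:)

```shell
# Rename every *.PAS file in a directory to lowercase using tr(1).
# Demonstrated in a temporary directory with sample files.
dir=$(mktemp -d)
cd "$dir"
touch DOGGY.PAS KITTY.PAS
for f in *.PAS; do
  mv -- "$f" "$(printf '%s' "$f" | tr 'A-Z' 'a-z')"
done
ls
```

Some distributions also ship a `rename` utility that can do this in one line, but its syntax differs between the Perl and util-linux versions, so check your local man page.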

Another general question: in the info and man pages, if I have a question about a topic but don't know the relevant commands, options and/or arguments, how do I go about finding them?
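(One standard answer to this: apropos, equivalently man -k, searches the one-line descriptions of all installed manual pages by keyword. A sketch; the fallback echo is only there so the snippet works on systems without a man database:)

```shell
# Search man page summaries for a keyword; prints "name (section) - description" lines.
out=$(apropos rename 2>/dev/null || echo "no man database available")
echo "$out"
```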

Many thanks.
 
  

