08-01-2009, 11:12 AM   #1
alan_daniel
LQ Newbie
 
Registered: Apr 2004
Posts: 26

Need help shaping an advanced wget command


Hey, I want to put together a wget command that automates downloading the archived audio podcasts of the NPR show Wait, Wait, Don't Tell Me. NPR just redesigned their website, and it now uses a predictable URL scheme for archived shows and their mp3 files. You can download each show in individual segments (around 5 or 6 per show) from the Wait Wait website, and the URLs those downloads come from reveal the server the files are hosted on. Here's how the show works:

- The show airs weekly, every Saturday of the year (so all show dates will be Saturdays).
- The URL seems to follow this general format:
Code:
http://public.npr.org/anon.npr-mp3/npr/waitwait/2007/01/20070113_waitwait_01.mp3
The example given here is the 1/13/07 show, so you can see how the date is formatted in the URL (YYYYMMDD). The "_01" at the end of the mp3's filename is the segment number within the show. Each show is split into a different number of segments, and the URL simply doesn't exist for segment numbers that are out of bounds for that particular show.

I would ideally like to run one wget command that downloads a month's worth (or, even better, a year's worth) of shows and sorts them into appropriate directories as they download. For instance (and it doesn't have to be exactly this): create a 2007 folder, then put all the January shows in one subfolder, all the February shows in another, etc.

I've tried reading my way through the wget man page, but I'm having a hard time coming up with a fully automated command for this. I know this is an unusual and probably difficult question, but I didn't know where else to go, and I've always been able to find experts here.
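
To make the goal concrete, here's roughly the shape of what I'm imagining, written as a shell sketch (untested; it assumes GNU date is available and that segments are always numbered sequentially from 01, so it just gives up on a show at the first missing file):

Code:
#!/bin/bash
# Rough sketch, not tested: fetch every Saturday's show for one year,
# filing each mp3 under YEAR/MONTH/.
YEAR=2007
BASE="http://public.npr.org/anon.npr-mp3/npr/waitwait"

# find the first Saturday of the year (%u: Monday=1 ... Saturday=6)
d="$YEAR-01-01"
while [ "$(date -d "$d" +%u)" != "6" ]; do
    d=$(date -d "$d + 1 day" +%Y-%m-%d)
done

# step one week at a time until we leave the year
while [ "$(date -d "$d" +%Y)" = "$YEAR" ]; do
    ymd=$(date -d "$d" +%Y%m%d)
    month=$(date -d "$d" +%m)
    mkdir -p "$YEAR/$month"
    for seg in 01 02 03 04 05 06 07 08; do
        wget -q -P "$YEAR/$month" \
            "$BASE/$YEAR/$month/${ymd}_waitwait_${seg}.mp3" || break
    done
    d=$(date -d "$d + 7 days" +%Y-%m-%d)
done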
 
08-01-2009, 01:09 PM   #2
acid_kewpie
Moderator
 
Registered: Jun 2001
Location: UK
Distribution: Gentoo, RHEL, Fedora, Centos
Posts: 43,417

curl is a better tool for this, as you can use pattern expansion to cover multiple URLs, e.g. curl -O "http://domain.com/my_file[01-20].dat" (square brackets give numeric ranges; quote the URL so the shell doesn't swallow them). Check the curl manpage for full usage details.
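
Untested, but something along these lines should drop one show's segments straight into a dated directory: #1 in the output name is replaced by whatever the [01-06] range matched, --create-dirs creates the local path, and -f makes curl skip segments that 404 instead of saving error pages.

Code:
# grab all segments of the 2007-01-13 show into 2007/01/
curl -f --create-dirs -o "2007/01/20070113_waitwait_#1.mp3" \
  "http://public.npr.org/anon.npr-mp3/npr/waitwait/2007/01/20070113_waitwait_[01-06].mp3"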
 
  


Reply

Tags
wget



Posting Rules
You may not post new threads
You may not post replies
You may not post attachments
You may not edit your posts

BB code is On
Smilies are On
[IMG] code is Off
HTML code is Off



Similar Threads
Thread Thread Starter Forum Replies Last Post
Help with advanced tar, ls, and egrep command... jademan83 Linux - Newbie 1 03-05-2009 12:28 PM
wget command needs your help please help! muasif80 Linux - Newbie 8 12-17-2008 04:49 AM
Need help with wget command learners Linux - Software 0 09-04-2007 07:28 AM
Advanced wget with google images Darek84CJ Linux - General 1 10-23-2006 09:19 AM
Advanced wget... michelbehr Linux - Newbie 3 05-25-2004 03:47 AM

LinuxQuestions.org > Forums > Linux Forums > Linux - Software

All times are GMT -5. The time now is 03:38 AM.

Main Menu
Advertisement
My LQ
Write for LQ
LinuxQuestions.org is looking for people interested in writing Editorials, Articles, Reviews, and more. If you'd like to contribute content, let us know.
Main Menu
Syndicate
RSS1  Latest Threads
RSS1  LQ News
Twitter: @linuxquestions
Open Source Consulting | Domain Registration