Old 08-01-2009, 11:12 AM   #1
Need help shaping an advanced wget command

Hey, I want to put together a wget command to automate downloading the archived audio podcasts of the NPR show Wait, Wait, Don't Tell Me. They just redesigned their website, and it now uses a conventional URL scheme for archived shows and their mp3 files. And yes, you can download them in individual segments (around 5 or 6 per show) from the Wait Wait website, and the URLs they come from reveal the server they're hosted on. Here's how the show works:

- The show is weekly, broadcast every Saturday of the year (so all the dates will be Saturdays).
- The URL seems to follow this general format:
The example given here is the 1/13/07 show, so you can see how the date is formatted in the URL. The "_01" at the end of the mp3's filename is the segment number within the show. Each show has a different number of segments, and naturally the URL doesn't exist for segment numbers beyond what that particular show has.

Ideally I'd like to run one wget command that downloads a month's worth (or, even better, a year's worth) of shows and sorts them into appropriate directories as they download. For instance (and it doesn't have to be exactly this): create a folder for the 2007 shows, then put all the January shows in one folder, all the February shows in another, and so on.

I've tried reading my way through the wget man page, but I'm having a hard time coming up with a fully automated command for this. I know this is an unusual and probably difficult question, but I didn't know where else to go, and I've always been able to find experts here.
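The layout described above can be sketched in shell (GNU date and seq assumed; the host and filename pattern are hypothetical stand-ins for whatever the real mp3 URLs turn out to be). It walks the Saturdays of a year and prints a `year/month` directory next to each candidate segment URL, ready to feed to wget:

```shell
#!/bin/sh
# Sketch: list "YEAR/MM  URL" pairs for every Saturday of a year.
# GNU date/seq assumed; the host and filename pattern are hypothetical.
list_show_urls() {
    year=$1
    segments=6                             # shows run about 5-6 segments
    d=$(date -u -d "$year-01-01" +%s)      # seconds since epoch, UTC
    # advance to the first Saturday (%u: Monday=1 ... Saturday=6)
    while [ "$(date -u -d "@$d" +%u)" -ne 6 ]; do
        d=$((d + 86400))
    done
    # step a week at a time until the year rolls over
    while [ "$(date -u -d "@$d" +%Y)" = "$year" ]; do
        ymd=$(date -u -d "@$d" +%Y%m%d)
        month=$(date -u -d "@$d" +%m)
        for seg in $(seq -f '%02g' 1 "$segments"); do
            printf '%s/%s http://example.org/waitwait/%s_waitwait_%s.mp3\n' \
                "$year" "$month" "$ymd" "$seg"
        done
        d=$((d + 7 * 86400))
    done
}

list_show_urls 2007 | head -n 3
```

From there, a loop over the output creates the directories and downloads each segment, e.g. `list_show_urls 2007 | while read dir url; do mkdir -p "$dir"; wget -nc -P "$dir" "$url"; done` — wget's `-nc` skips files already fetched, and a 404 on a segment just means that week's show had fewer pieces.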
Old 08-01-2009, 01:09 PM   #2
curl is a better tool for this, as it can expand URL patterns itself to cover multiple URLs in one command: a numeric range such as `[01-20]` in a URL expands to 01 through 20 (note that curl uses square brackets for ranges and curly braces, e.g. `{jan,feb}`, for lists of alternatives). Check the curl man page for full usage details.
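As a concrete shape for that suggestion (the host and filename pattern are hypothetical stand-ins for whatever the show pages reveal), the range glob covers one URL per segment:

```shell
# Print the URLs that curl's [01-06] numeric-range glob covers for the
# 1/13/07 show; host and path are hypothetical. The loop only prints the
# expansion; the commented curl line below would fetch the files for real.
for seg in $(seq -f '%02g' 1 6); do
    echo "http://example.org/waitwait/20070113_waitwait_${seg}.mp3"
done
# one-shot equivalent (-f skips missing segment numbers quietly):
#   curl -f -O "http://example.org/waitwait/20070113_waitwait_[01-06].mp3"
```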


