Programming
This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
04-19-2008, 07:45 PM | #1
LQ Newbie
Registered: Apr 2008
Posts: 2
Archive Downloader Using Curl Grep Cygwin
Hello,
The following bash script downloads comics from the free 30-day archive. How can the script download the comics from the membership section using a valid username and password? For testing purposes only, assume the username is user1234 and the password is pass1234. The revised script should be able to log in, download, and log out over a secure connection. Is this doable? Thanks in advance.
Code:
#!/bin/bash
# Download comics and put them in the user's web directory
# Depends: curl, wget, grep, lwp-request (HEAD)
function download {
SRC=$1
FILE=$2
# Skip the download if the file already exists or the URL serves HTML instead of an image
C_TYPE=$(HEAD "$SRC" | fgrep Content-Type: | cut -d' ' -f2)
[ ! -e "$FILE" ] && [ "$C_TYPE" != "text/html" ] && \
wget -q -t inf -A gif,jpg "$SRC" -O "$FILE"
# Optional GIF-to-PNG conversion (requires ImageMagick):
# [ "${FILE##*.}" = "gif" ] && \
# convert "$FILE" "${FILE%.*}.png" && \
# rm -f "$FILE"
echo "$FILE saved to $PWD ..."
}
function changedirectory {
COMICNAME=$1
COMICYEAR=$2
mkdir -p "$HOME/public_html/comics/$COMICNAME/$COMICYEAR"
cd "$HOME/public_html/comics/$COMICNAME/$COMICYEAR" || exit 1
}
if [ -z "$1" ]
then
DATE=$(date +%Y%m%d)
else
DATE=$1
fi
# Derive the year from the date so it is also set when a date is passed in
DATEYR=${DATE:0:4}
mkdir -p "$HOME/public_html/comics/"
cd "$HOME/public_html/comics/" || exit 1
# 9 ChickWeed Lane
changedirectory 9ChickWeedLane "$DATEYR"
IMG=$(curl -s http://www.comics.com/comics/chickweed/archive/chickweed-${DATE}.html \
| egrep -m 1 -o "/comics/chickweed/archive/images/chickweed[0-9]+\.(gif|jpg)")
download "http://www.comics.com$IMG" "9ChickWeedLane${DATE}.${IMG##*.}"
# Agnes
changedirectory Agnes "$DATEYR"
IMG=$(curl -s http://www.comics.com/creators/agnes/archive/agnes-${DATE}.html \
| egrep -m 1 -o "/creators/agnes/archive/images/agnes[0-9]+\.(gif|jpg)")
download "http://www.comics.com$IMG" "Agnes${DATE}.${IMG##*.}"
# Alley Oop
changedirectory AlleyOop "$DATEYR"
IMG=$(curl -s http://www.comics.com/comics/alleyoop/archive/alleyoop-${DATE}.html \
| egrep -m 1 -o "/comics/alleyoop/archive/images/alleyoop[0-9]+\.(gif|jpg)")
download "http://www.comics.com$IMG" "AlleyOop${DATE}.${IMG##*.}"
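For reference, here is the kind of flow I am picturing, assuming the members' login is a plain HTML form: POST the credentials once with curl, keep the session cookie in a jar, reuse it for the archive requests, then hit a logout URL. The login/logout URLs and the form field names (username, password) below are guesses on my part, so the real <form action=...> and <input name=...> values from the login page would need to be substituted.
Code:
#!/bin/bash
# Hypothetical login/download/logout flow with curl and a cookie jar.
# URLs and form field names are assumptions; inspect the real login
# page's form markup and adjust before use.
JAR=$(mktemp)   # session cookies are stored here
USERNAME=user1234
PASSWORD=pass1234
# 1. Log in over HTTPS; -c writes the session cookie into the jar
curl -s -L -c "$JAR" \
-d "username=$USERNAME" -d "password=$PASSWORD" \
"https://www.comics.com/login" > /dev/null
# 2. Fetch a member-only archive page, sending the cookie back with -b
curl -s -b "$JAR" \
"https://www.comics.com/comics/chickweed/archive/chickweed-${DATE}.html"
# 3. Log out and discard the cookie jar
curl -s -b "$JAR" "https://www.comics.com/logout" > /dev/null
rm -f "$JAR"
If the members' area uses HTTP basic authentication instead of a form, adding curl --user user1234:pass1234 to each request should be enough, with no cookie jar needed.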
04-19-2008, 07:53 PM | #2
LQ Guru
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Is the site login done using JavaScript? If so, I don't believe it is doable; most of the text-mode tools apparently can't interact with JavaScript.
Of course, it was a couple of years ago that I checked curl, wget, lynx, links, et al. Maybe someone else has an answer by now, but back then I even ran across one site that said not only did no such tool exist, but they weren't sure one could ever be made.
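One quick check: fetch the login page with curl and look at the form markup. If the page contains a normal <form> with named inputs (rather than a form that is built or submitted by JavaScript), a scripted login along the lines sketched above has a chance. A rough probe, with the login URL being a guess:
Code:
# Fetch the (assumed) login page and show any form/input tags;
# if the credentials travel through a plain <form>, curl can mimic it.
curl -s "http://www.comics.com/login" \
| egrep -io "<form[^>]*>|<input[^>]*>"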