Old 12-29-2010, 09:23 AM   #1
mattseanbachman
Member
 
Registered: Feb 2010
Posts: 40

Rep: Reputation: 15
Wget: Downloading Pictures from a website


I'm trying to have wget retrieve the pictures from a list of saved URLs. I have a list of Facebook profiles, and I need the main profile picture saved from each.

Here is what I was working on; I did not get very far with it:

Code:
wget -A .jpg,.jpeg -erobots=off --user-agent="Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6" -i urls.txt
The user agent is needed because without it, each page comes back with an error stating that my browser is incompatible.

This is one of many attempts, all basic permutations of the command above.

So far, the pages I retrieve are full HTML rather than what I want, which is simply the pictures.

The URLs I have are all of the form
Code:
http://www.domain.com/profile.php?id=xxxxxxxxxxx
When I pull such a page up in my browser I see everything just fine; however, when wget reads the URLs in from the file (or even when I manually specify a single page to download), what I receive is the HTML file with everything intact minus the main photo of the page (that page's user picture).


I believe I need the -A switch, but I also think that is what is causing the issue: because the page itself is not a .jpg, it is getting deleted.
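If I'm reading the man page right, the -A/-R accept/reject lists only take effect during recursive retrieval: HTML pages are still downloaded so their links can be followed, then deleted afterwards if they don't match the accept list. So my next guess (untested; the -l 1 depth is an assumption about how the picture is linked, and the URL is quoted so the shell doesn't touch the ?) would be to add recursion:
Code:
wget -r -l 1 -A .jpg,.jpeg -e robots=off \
     --user-agent="Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6" \
     "http://www.domain.com/profile.php?id=xxxxxxxxxxx"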

If someone has done the same thing as I am after, or has any helpful tips, I would greatly appreciate a nudge in the right direction.

Thanks.
 
Old 12-29-2010, 05:11 PM   #2
teckk
LQ Guru
 
Registered: Oct 2004
Distribution: Arch
Posts: 5,137
Blog Entries: 6

Rep: Reputation: 1826
I seem to recall that not using a space between -A and .jpg did the job.

Something like
Code:
wget -r -A.jpg http://www.blah/blah
which will recursively download all .jpg files from that site.

If you want to read the links from a .txt file that you have made, then:
Code:
wget -r -A.jpg -i file.txt
You can also specify the recursion depth with -l, no-clobber with -nc, etc.
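Putting it together with the user agent string from your first post, something like this untested sketch (you may need a larger -l value depending on how the picture is linked from each page):
Code:
# -r recurse, -l 1 follow links one level deep, -nc skip files already downloaded
wget -r -l 1 -nc -A.jpg,.jpeg -e robots=off \
     --user-agent="Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6" \
     -i urls.txt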
 
  

