WGET to download images
I have two CSV files. One contains a URL to an image and the other contains the name I want the image downloaded and saved as. They are two separate files because the data is 3.6 million rows long and I can't combine them without blowing up my computer.
Is there a way to use a wget command to download each link and rename the file? Both files can be viewed from the links below: https://www.dropbox.com/s/fis8srwdtm87y7i/url.csv?dl=0 https://www.dropbox.com/s/9k2f7zf11q...Names.csv?dl=0 Any help appreciated. Jodie |
Have you tried changing 'dl=0' to 'dl=1'?
|
Don't know if these are time-sensitive or not.
Code:
wget --spider "https://uc3af9f647b4bcb93f2c716f7d0f.dl.dropboxusercontent.com/cd/0/get/Aiqw78JPywxbsMVjYPY8kiJm-HTa_p5XddLQ6eZi4VhF2XoNQQqNzyvoj3KIQUyQ6Fe8ThO5Q-1H35arCeXumhsXtZB4FW5aW1PeMq638xExdA/file?_download_id=1407438504576448139996869513670874382657307170413355605886783133243&_notify_domain=www.dropbox.com&dl=1"
Code:
wget --spider "https://uccaa835e0962e78259d66cfbd81.dl.dropboxusercontent.com/cd/0/get/AipgxLQRcOaW9UGzsgtyyS20eSKfa36iegeVLzCJ-_8mAAp-1lB3WLh-UI8CQ8RvTDMQEJdh7NxJFqwFA2tI1lDk3-C30fnsXBfXl7r4FzhwFw/file?_download_id=63981500829626411254479516742057535279546930342219265882653689927&_notify_domain=www.dropbox.com&dl=1" |
Hi,
For combining the two files line by line you can use paste. My guess is that it won't blow up your computer: paste streams chunks of the two inputs and writes them as chunks of output, without filling up memory. |
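A minimal sketch of the paste approach, assuming the two files are named url.csv and Names.csv (as the Dropbox links suggest), each with one entry per line in matching order; the sample data here is made up for illustration:

```shell
# Create two tiny stand-ins for the real 3.6M-row files
printf 'http://example.com/a.jpg\nhttp://example.com/b.jpg\n' > url.csv
printf 'first.jpg\nsecond.jpg\n' > Names.csv

# paste joins the files line by line: column 1 = URL, column 2 = target name.
# It streams both inputs, so memory use stays flat even for millions of rows.
paste -d ' ' url.csv Names.csv > combined.txt
cat combined.txt
```

The resulting combined.txt then has everything needed for a download loop, one job per line.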
I used to use wget to download a list of URLs in a file.
Code:
$ wget -c -i file_of_urls.txt
Back when I was on dialup and the local rest areas had free wifi with much better connection speeds. Or the local library parking lot. Although back then the laptop's battery would only last an hour, so I had to get as much as I could as fast as I could before returning to slumming-it status (dialup at home).

You could script it to mv/rename the file after download, but you'd probably need three parameters per line in the file, and a variant with only the first parameter (the URL) for wget. The second and third would be the name the file actually downloaded as (without URL and path) and its new name.
Code:
$ cat FILE.txt | while read LINE; do
    OLDNAME=$(echo $LINE | awk '{ print $2; }')
    NEWNAME=$(echo $LINE | awk '{ print $3; }')
    echo "$OLDNAME --- $NEWNAME"
    mv "$OLDNAME" "$NEWNAME"
done
or something like that. You could add an awk for $1 (the URL) if you wanted to run wget once per line, to avoid a file that wget couldn't use directly. |
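The rename-after-download step can also be skipped entirely: wget's -O flag saves straight to a given name, so a two-column file of "URL name" pairs (like the paste output above) is enough. A dry-run sketch, with echo left in front of wget so the commands can be inspected before unleashing it on 3.6 million rows (the file name combined.txt is an assumption):

```shell
# One "URL target-name" pair per line
printf 'http://example.com/a.jpg first.jpg\n' > combined.txt

# read -r splits each line into the two fields; wget -c -O downloads the URL
# directly to the target name, so no mv/rename pass is needed afterwards.
# Remove the leading 'echo' to actually download.
while read -r URL NEWNAME; do
    echo wget -c -O "$NEWNAME" "$URL"
done < combined.txt
```

For a run this size it would also be worth adding a check that the target file doesn't already exist, so the loop can be restarted after an interruption.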