General: This forum is for non-technical general discussion, which can include both Linux and non-Linux topics. Have fun!
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
I have access to a pay-site hosting thousands of public domain images. Since the payment is only for access (after all, the images are PD), I should be able to download them and freely distribute them to all.
Now it's just a simple matter of putting that into practice. Normally, viewing an image opens a page that fetches the image through JavaScript and crafty document.write disguises. To save the actual file, you have to right-click the image and choose Save As (in Windows, of course).
So, as a first attempt, I determined how the page URLs were generated (all end with a number that increases by 1 for each consecutive page) - the image URLs each have unique hashes, so I couldn't touch them. Then I used a program (called urlgen) to generate the URLs for me. Then I used a regex editor to wrap HTML tags around the URLs. Then I tried using Firefox's DownThemAll extension to download each page, hoping it would also download each page's content. It didn't; it only downloaded the HTML of each page.
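As an aside, since the page URLs only differ by a trailing number, the whole list can be generated with a short shell loop (on Linux, or under Cygwin on Windows) instead of a separate urlgen step. The base URL below is a placeholder, not the real site:

```shell
#!/bin/sh
# Generate one page URL per line for pages 1 through 100.
# "http://example.com/gallery/page" is a placeholder base URL (assumption).
for n in $(seq 1 100); do
    echo "http://example.com/gallery/page${n}.html"
done > pages.txt
```

The resulting pages.txt can then be fed straight to a downloader; no HTML wrapping needed.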
I know this would be a helluva lot easier to do in Linux, but in Windows, are there any suggestions for accomplishing this? I'm thinking next I'll try one of those keyboard/mouse button combination recorders, but was hoping someone had a simpler (Windows-based) solution.
Thanks!
Last edited by brian0918; 09-09-2007 at 12:28 AM.
Reason: typo
Some paysites actually prefer to make a profit. Having people do mass downloads of whole sites consumes a lot of bandwidth, and that costs money. Guess why it isn't recommended...
As for you freely distributing the pictures,... hmm. Copyright and distribution rights come to mind.
Quote: "So, as a first attempt, I determined how the page URLs were generated (all end with a number that increases by 1 for each consecutive page) - the image URLs each have unique hashes, so I couldn't touch them. Then I used a program (called urlgen) to generate the URLs for me. Then I used a regex editor to wrap HTML tags around the URLs."
Did I understand correctly that you have all the URLs you need? Then make a list and feed it to wget.
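A minimal sketch of that, assuming the URLs sit one per line in a file called urls.txt and that the site checks a login cookie (cookies.txt here would be exported from the browser; both file names are assumptions):

```shell
# Download every URL listed in urls.txt (one per line) into images/.
# --load-cookies is only needed if the site requires the paid login session.
# --wait=1 pauses a second between requests to go easy on the server.
wget --load-cookies=cookies.txt --input-file=urls.txt \
     --directory-prefix=images/ --wait=1
```

A Windows build of wget exists as well, so this doesn't strictly require Linux.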
Quote: "Some paysites actually prefer to make a profit. Having people do mass downloads of whole sites consumes a lot of bandwidth, and that costs money. Guess why it isn't recommended... As for you freely distributing the pictures,... hmm. Copyright and distribution rights come to mind."
You do realize that these images are not copyrightable, right? They're from the 1800s. Please read up on the concept of "public domain". The pay-site is simply forcing you to pay for initial access to the content. Once you have access to the images, you can do whatever you want with them.
Even though I kinda condone (Or whatever you use to politely say "Don't do that, bra") that, I sometimes find myself in the same position.
Have you tried wget?
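If the image URL does appear somewhere in each page's HTML, wget's page-requisites mode can grab a page together with the files it embeds, which is exactly what DownThemAll was failing to do. Note it won't help if the image URL is only constructed by JavaScript at view time, since wget doesn't run scripts. The URL here is a placeholder:

```shell
# -p / --page-requisites: also fetch images, CSS, etc. that the page embeds
# -k / --convert-links:   rewrite links so the saved page works locally
# -w 2:                   wait 2 seconds between requests
wget -p -k -w 2 "http://example.com/gallery/page1.html"
```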
Hrmmm ... to condone is to agree with. To condemn is to say "naughty naughty". Besides, what are you doing scolding women's undergarments? What did they ever do to you? Weird ...