[SOLVED] WGET issues downloading recursively from webpage
Location: Console.WriteLine("My location is {0}",Location);
Distribution: Arch Linux 64bit --Current
Posts: 33
I have a website I want to wget, but the 'links' are actually options in drop-downs. Is there a way to force wget to recognize those as relative links and follow them all?
Btw, I'm trying to download the lessons for my online APCS class for offline access. Not being able to reach them is what keeps me from getting a lot of my work done when I'm moving around, traveling, etc.
Attached is the page I'm trying to start the wget from.
Please let us know the exact link you are passing to wget.
Also, the attached file looks like the source of a web page. Are you saying that when you ran wget it did not download the file, but showed you this page instead? If that is the case, let us know the full link.
Original Poster:
I forgot to mention that. I can give you the link, but it will return 403 Forbidden unless I also give you the cookie information needed to access it.
Since I have that information, I can connect and begin downloading fine. It just only downloads some folders from that page onward, because I'm not sure how to tell wget that those links are part of what it should be crawling.
The source file is attached to show the structure of the page, and what I was trying to say in the first post about the unusual structure of the links.
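For what it's worth, the kind of invocation I mean looks roughly like the following. It's echoed here rather than run, and the URL and cookie file name are placeholders, not the real course address:

```shell
# Build the command as a string just to show the flags. cookies.txt and the
# URL are placeholders; --load-cookies, --recursive, --no-parent,
# --page-requisites, and --convert-links are standard wget options.
cmd="wget --load-cookies cookies.txt --recursive --no-parent \
--page-requisites --convert-links http://example.com/course/"
echo "$cmd"
```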
Last edited by chrisportela; 06-22-2011 at 04:41 PM.
Reason: additional information
I'm not sure I've understood your question correctly, but what I gather from your post is that you are trying to download a folder and it only downloads partially.
Original Poster:
Well, the links aren't an issue, so I'm just going to post a couple to explain the problem further. What you suggested didn't work, not because wget doesn't work, but because I'm not sure how to configure it (or anything else, for that matter) to download what I want from this website:
http://learn.flvs.net/webdav/educato...e03/03_03a.htm
^^ That link points to a file that is likely forbidden for you, which is why it probably won't be much direct help with the problem, but it does show the folder layout. The first file I uploaded is at the root of educator_apcsa_v9, but because the 'links' (options in the drop-downs) are not real links on the home page, wget doesn't see them. The way you navigate that page is by selecting an option, and the JavaScript takes you there. Is there a way to tell wget to notice those links, or will I need some other hack to work around the fact that it can't see the options? Hopefully I don't need to specify each directory ...
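As a sketch of the kind of hack I mean: since the 'links' are option values, one workaround is to pull them out of a saved copy of the page and feed the list to wget with -i. The HTML below is a made-up stand-in for the real page, and the value="..." layout is an assumption based on the attached source, not verified:

```shell
# Made-up stand-in for the saved page source; the real markup may differ.
cat > page.html <<'EOF'
<select onchange="window.location=this.value">
  <option value="module03/03_03a.htm">Lesson 3.03a</option>
  <option value="module03/03_04.htm">Lesson 3.04</option>
</select>
EOF

# Pull the relative paths out of the value="..." attributes.
grep -o 'value="[^"]*"' page.html | sed 's/^value="//; s/"$//' > urls.txt
cat urls.txt

# The list could then go to wget, e.g. (placeholder base URL and cookie file):
# wget --load-cookies cookies.txt --base='http://example.com/course/' -i urls.txt
```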
Original Poster:
Never mind. I guess I was just being too lazy. All I need to do is select each directory for download. It doesn't take that long, and if I wanted to be really lazy, a shell script could do it for me. I'll mark this solved.
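The lazy version would be something like the loop below. The base URL and the module directory names are placeholders, not the actual course layout:

```shell
# Placeholder base URL and module names; substitute the real ones.
base='http://example.com/course'
for dir in module01 module02 module03; do
    printf '%s/%s/\n' "$base" "$dir"
done > fetch_list.txt
cat fetch_list.txt

# Each line could then be fetched with something like:
# wget --load-cookies cookies.txt --recursive --no-parent "$url"
```

Running one directory per wget invocation also keeps --no-parent scoped to that module, so the crawl can't wander up the tree.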