For the life of me, I can't get wget to work on my system. I am trying to run a mirror of an existing project hosted on another domain. I would love to be able to simply download all the files in a given folder. Here is an example of what I see:
I can successfully log in using this username and password any other time. I feel like a complete bonehead here... everything I read says "use wget, it's easy", but it seems useless to me. I'm sure I'm just misusing it. Tips are appreciated.
I tried adding the -r option and get the same error about access being forbidden. If I specify a page, e.g. www.mydomain.com/php/index.php, I get 4 files downloaded, all of which are linked to from my index.php. I have also tried several different options with recursion level, getting page requisites, etc., and the max number of files it has ever downloaded is 5. Surely there is some way to mirror a web site!
From my understanding, though, it will get all the files and directories from the whole domain. Check the man pages for more details; it's pretty straightforward.
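For reference, the usual mirroring invocation looks something like the following; the URL is a placeholder. Note also that wget obeys robots.txt by default, which is one possible cause of the "forbidden" errors described above:

# -m (--mirror) is shorthand for -r -N -l inf --no-remove-listing;
# -k rewrites links for local browsing, -p fetches page requisites.
wget -m -k -p http://www.example.com/

# To stay inside one directory and not ascend to the parent:
wget -r -np -k http://www.example.com/php/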
I am having the same issue with a site I use. The site belongs to a wholesale company, and I have full login access with a username and password. Is there a way to use wget in conjunction with a password?
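Assuming the site uses standard HTTP authentication (a guess; a form-based login would need cookie handling instead), something like this should work. The credentials and URLs below are placeholders:

# HTTP basic authentication:
wget --http-user=myname --http-password=mypass http://wholesale.example.com/pricelist.csv
# (very old wget versions spell the second option --http-passwd)

# For FTP, the credentials can go straight into the URL:
wget ftp://myname:mypass@ftp.example.com/pub/file.tar.gz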
If I try to download something as root using wget, wget gets stuck and then times out... but if I try to download with wget as an ordinary user, everything works just fine!
TruckStuff: maybe try downloading the site as an ordinary user, and then copy it to wherever you want?
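A minimal sketch of that workaround, with a placeholder username and file:

# Fetch as an unprivileged user, then copy the result as root:
su - someuser -c 'wget -P /tmp http://www.example.com/file.tar.gz'
cp /tmp/file.tar.gz /root/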