LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   General (https://www.linuxquestions.org/questions/general-10/)
-   -   Why can't I use wget? (https://www.linuxquestions.org/questions/general-10/why-cant-i-use-wget-60872/)

TruckStuff 05-20-2003 05:04 PM

Why can't I use wget?
 
For the life of me, I can't get wget to work on my system. I am trying to run a mirror of an existing project hosted on another domain. I would love to be able to simply download all the files in a given folder. Here is an example of what I see:

Quote:

[root@localhost php]# wget http://www.mydomain.com/php/
--17:04:55-- http://www.mydomain.com/php/
=> `index.html'
Resolving www.mydomain.com... done.
Connecting to www.mydomain.com[x.x.x.x]:80... connected.
HTTP request sent, awaiting response... 403 Access Forbidden
17:04:55 ERROR 403: Access Forbidden.
OK, so I use my uname and passwd:
Quote:

[root@localhost php]# wget --http-user=me --http-passwd=pass http://www.mydomain.com/php/
--17:07:03-- http://www.mydomain.com/php/
=> `index.html.2'
Resolving www.mydomain.com... done.
Connecting to www.mydomain.com[x.x.x.x]:80... connected.
HTTP request sent, awaiting response... 403 Access Forbidden
17:07:03 ERROR 403: Access Forbidden.
I can successfully log in using this user name and password any other time. I feel like a complete bonehead here... everything I read says "use wget, it's easy", but it seems useless to me. I'm sure I'm just misusing it. Tips are appreciated.
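For what it's worth, a 403 even with valid credentials sometimes means the server is rejecting wget's default User-Agent string, or that directory indexes are disabled for /php/. A dry-run sketch of a retry with credentials plus an explicit User-Agent (the domain, username, and password are the placeholders from the post above; `--http-passwd` is the older spelling, newer wget versions call it `--http-password`):

```shell
# Dry-run sketch: HTTP auth plus a browser-like User-Agent.
# www.mydomain.com and me/pass are placeholders from the thread.
URL="http://www.mydomain.com/php/"
UA="Mozilla/5.0 (X11; Linux i686)"
CMD="wget --http-user=me --http-passwd=pass --user-agent=\"$UA\" $URL"
echo "would run: $CMD"
# eval "$CMD"   # uncomment to actually fetch
```

If the 403 persists with a browser-like User-Agent, the restriction is on the server side and no wget option will get around it.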

serz 05-20-2003 09:17 PM

You're trying to download the contents of a directory? Because with a normal file that should have worked.

I think you have to use the -r option; it downloads recursively.

Hope that helps.
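As a sketch, a recursive fetch confined to that one directory might look like the following (the domain is the thread's placeholder; `-np`/`--no-parent` stops wget from climbing above /php/, and `-l` caps the recursion depth):

```shell
# Dry-run sketch of a recursive fetch limited to /php/.
# -r recurse, -l 5 max depth, -np never ascend to the parent dir.
OPTS="-r -l 5 -np"
URL="http://www.mydomain.com/php/"
echo "would run: wget $OPTS $URL"
# wget $OPTS "$URL"   # uncomment to actually fetch
```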

TruckStuff 05-21-2003 08:57 AM

I tried adding the -r option and get the same error about access being forbidden. If I specify a page, e.g. www.mydomain.com/php/index.php, I get 4 files downloaded, all of which are linked to from my index.php. I have also tried several different options with recursion level, getting page requisites, etc., and the max number of files it has ever downloaded is 5. Surely there is some way to mirror a web site!

trickykid 05-21-2003 10:45 AM

It's right in front of you in the man pages... or type wget --help for more options.

wget -m www.mydomain.com

-m = mirror

It will get all the files and directories from the whole domain, from my understanding. Check the man pages for more details; it's pretty straightforward.
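For reference, the wget man page describes `-m` as shorthand for `-r -N -l inf --no-remove-listing`; combining it with link conversion and page requisites gives a locally browsable copy. A dry-run sketch (domain is the thread's placeholder):

```shell
# Dry-run sketch: mirror with links rewritten for local viewing.
# -m = -r -N -l inf --no-remove-listing (recursion + timestamping),
# -k rewrites links for local browsing, -p grabs images/CSS per page,
# -np keeps wget inside the starting directory.
OPTS="-m -k -p -np"
URL="http://www.mydomain.com/"
echo "would run: wget $OPTS $URL"
# wget $OPTS "$URL"   # uncomment to actually fetch
```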

TruckStuff 05-21-2003 03:04 PM

Sorry, I forgot to list the -m option before as having tried that. Still get the same error. I did read the man pages. :)

fancypiper 05-21-2003 03:13 PM

Can the site you are trying to mirror give you permission for the directories that they have restricted?

caffiendo 01-26-2007 09:47 PM

I am having the same issue with a site I use: a wholesale company where I have full log-in access with a username and password. Is there a way to use wget in conjunction with a password?
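If the site uses an HTML login form rather than HTTP auth, wget's `--http-user`/`--http-passwd` options won't help; one common approach is to log in once, save the session cookie, and reuse it for the download. A dry-run sketch (the site, login URL, and form field names are all hypothetical; the real ones are in the site's login page source):

```shell
# Dry-run sketch: form login with saved cookies, then a mirror.
# example-wholesale.com, login.php, and the user/pass field names
# are hypothetical placeholders.
SITE="http://www.example-wholesale.com"
LOGIN="wget --save-cookies=cookies.txt --keep-session-cookies --post-data='user=me&pass=secret' $SITE/login.php -O -"
MIRROR="wget --load-cookies=cookies.txt -m -np $SITE/"
echo "step 1: $LOGIN"
echo "step 2: $MIRROR"
# eval "$LOGIN" && eval "$MIRROR"   # uncomment to actually run
```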

mikluha 02-24-2009 01:39 PM

The same issue
 
Hi guys, I have the same issue.

If I try to download something as root using wget, wget gets stuck and then times out...
But if I try to download using wget as an ordinary user, everything works
just fine!

TruckStuff: maybe try downloading the site as an ordinary user, and then
copy it wherever you want?
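The suggestion above can be sketched as running the fetch under an unprivileged account, e.g. via su (the username is hypothetical, and the domain is the thread's placeholder):

```shell
# Dry-run sketch: run the mirror as an ordinary user instead of root.
# "someuser" is a hypothetical unprivileged account.
CMD="su - someuser -c 'wget -m -np http://www.mydomain.com/php/'"
echo "would run: $CMD"
# eval "$CMD"   # uncomment to actually run
```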

