wget problems
Hi, not sure if this is the right place to ask this question, but here goes...
I'm currently having problems mirroring a web site with wget (or any other program, for that matter). I have used wget many times before without any problem, so I don't think it's anything I am doing wrong. All that happens is it only grabs the first page (index.html). The command I use in this case is: Code:
wget -r -m "name of site" |
If the name of the site is, say, "LinuxQuestions", it's going to fail; that's not a URL. You're not providing any details on what the problem actually is, or any errors, or, well, anything other than that it's not working, so it's pretty hard to help you....
|
Thanks for the quick reply.
I understand it's not exactly a Linux-related question, although I was hoping that I would get some help. I am not sure what other information to give other than the URL of the site (greenroomswirral.co.uk). If you need any other information, I am happy to provide it. |
What is needed is for you to copy/paste the exact command you entered, as well as any messages from wget.
|
Tried it from here --- it seems to be downloading many different pages (I stopped it before it finished).
|
Code:
wget -r -m www.greenroomswirral.co.uk
This is the output: Code:
lina@lina-laptop:~/Desktop/new$ wget -r -m www.greenroomswirral.co.uk
|
I did it without the "www" --- and got many, many pages.
WITH the "www", I get your result. I have NO CLUE why there's a difference. |
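A likely explanation (this is an assumption; the thread doesn't confirm it): if the server redirects www.greenroomswirral.co.uk to greenroomswirral.co.uk, wget follows the redirect for the initial page, but recursive retrieval stays on the originally requested host by default, so nothing beyond index.html gets fetched. If that's what is happening, a sketch of two possible workarounds:

```shell
# Start from the hostname the server actually serves (matches what
# worked in the thread):
wget -m http://greenroomswirral.co.uk/

# Or, if you must start from the www hostname, let recursion cross
# hosts (-H / --span-hosts), but restrict it to the two related
# hostnames with --domains so the mirror doesn't wander off-site:
wget -m -H --domains=greenroomswirral.co.uk,www.greenroomswirral.co.uk \
     http://www.greenroomswirral.co.uk/
```

Note that -m (--mirror) already implies -r, so the -r in the original command is redundant but harmless.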
That works perfectly, though I am not sure why... (I would be interested to know why, though).
Thanks very much for all your help and patience. |