LinuxQuestions.org
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!

Old 04-13-2010, 03:40 PM   #1
linf
LQ Newbie
 
Registered: Sep 2008
Posts: 5

Rep: Reputation: 0
wget problems


Hi, not sure if this is the right place to ask this question, but here goes...
I'm currently having problems mirroring a web site with wget (or any other program, for that matter). I have used wget many times before without any problem, so I don't think it's anything I am doing wrong. All that happens is it only grabs the first page (index.html).
The command I use in this case is:

Code:
wget -r -m "name of site"
Any help would be much appreciated.
 
Old 04-13-2010, 03:52 PM   #2
rweaver
Senior Member
 
Registered: Dec 2008
Location: Louisville, OH
Distribution: Debian, CentOS, Slackware, RHEL, Gentoo
Posts: 1,833

Rep: Reputation: 164
If the name of the site is, say, "LinuxQuestions", it's going to fail; that's not a URL. You're not providing any details on what the problem actually is, or any errors, or really anything other than that it's not working, so it's pretty hard to help you...
 
Old 04-13-2010, 04:06 PM   #3
linf
LQ Newbie
 
Registered: Sep 2008
Posts: 5

Original Poster
Rep: Reputation: 0
Thanks for the quick reply.
I understand it's not exactly a Linux-related question, but I was hoping I would get some help. I am not sure what other information to give besides the URL of the site (greenroomswirral.co.uk). If you need any other information, I am happy to provide it.
 
Old 04-13-2010, 04:09 PM   #4
harryhaller
Member
 
Registered: Sep 2004
Distribution: Slackware-14.0
Posts: 452

Rep: Reputation: Disabled
What is needed is for you to copy/paste the exact command you entered, as well as any messages from wget.
 
Old 04-13-2010, 04:20 PM   #5
pixellany
LQ Veteran
 
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Arch/XFCE
Posts: 17,802

Rep: Reputation: 738
Tried it from here: it seems to be downloading many different pages (I stopped it before it finished).
 
Old 04-13-2010, 04:44 PM   #6
linf
LQ Newbie
 
Registered: Sep 2008
Posts: 5

Original Poster
Rep: Reputation: 0
Code:
wget -r -m www.greenroomswirral.co.uk
wget gives no errors; it just stops at the first page (index.html). I have never seen it do this before with any other site.

this is the output

Code:
lina@lina-laptop:~/Desktop/new$ wget -r -m www.greenroomswirral.co.uk
--2010-04-13 21:19:43--  http://www.greenroomswirral.co.uk/
Resolving www.greenroomswirral.co.uk... 79.170.44.108
Connecting to www.greenroomswirral.co.uk|79.170.44.108|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Last-modified header missing -- time-stamps turned off.
--2010-04-13 21:19:43--  http://www.greenroomswirral.co.uk/
Connecting to www.greenroomswirral.co.uk|79.170.44.108|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `www.greenroomswirral.co.uk/index.html'

    [  <=>                                  ] 14,137      45.9K/s   in 0.3s    

2010-04-13 21:19:44 (45.9 KB/s) - `www.greenroomswirral.co.uk/index.html' saved [14137]

FINISHED --2010-04-13 21:19:44--
Downloaded: 1 files, 14K in 0.3s (45.9 KB/s)
lina@lina-laptop:~/Desktop/new$
Quote:
Tried it from here---It seems to be downloading many different pages (I stopped it before it finished)
I have tried this on a few different computers on different networks with the same result, so I would be interested in the commands you have used.

Last edited by linf; 04-13-2010 at 04:58 PM.
 
Old 04-13-2010, 06:10 PM   #7
pixellany
LQ Veteran
 
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Arch/XFCE
Posts: 17,802

Rep: Reputation: 738
I did it without the "www" and got many, many pages.

WITH the "www", I get your result.

I have NO CLUE why the difference.
 
1 member found this post helpful.
Old 04-13-2010, 07:09 PM   #8
linf
LQ Newbie
 
Registered: Sep 2008
Posts: 5

Original Poster
Rep: Reputation: 0
That works perfectly, though I am not sure why (I would be interested to know why, though).
Thanks very much for all your help and patience.
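A likely explanation (an assumption, not verified against this site): by default, wget's recursive mode only follows links whose hostname exactly matches the start URL's hostname. If the pages on this site link internally using the bare host `greenroomswirral.co.uk`, then starting from `www.greenroomswirral.co.uk` makes every internal link look like a foreign host, and wget stops after index.html. A minimal Python sketch of that same-host filter, using a hypothetical page snippet:

```python
# Sketch of wget's default same-host rule for recursive retrieval.
# The page content below is hypothetical, assumed for illustration only.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collect href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base, value))

def same_host_links(base, html):
    """Keep only links whose hostname matches the start URL's hostname,
    roughly what wget -r does when --span-hosts is not given."""
    parser = LinkCollector(base)
    parser.feed(html)
    start_host = urlparse(base).hostname
    return [u for u in parser.links if urlparse(u).hostname == start_host]

# Hypothetical internal link written with the bare (non-www) host:
page = '<a href="http://greenroomswirral.co.uk/about.html">About</a>'

# Starting from the www host, the bare-host link counts as foreign:
print(same_host_links("http://www.greenroomswirral.co.uk/", page))  # []
# Starting from the bare host, the link is eligible to be followed:
print(same_host_links("http://greenroomswirral.co.uk/", page))
```

If this is what's happening, wget's `--span-hosts` option combined with `--domains` (e.g. `wget -m -H -D greenroomswirral.co.uk http://www.greenroomswirral.co.uk/`) should accept links under both forms of the host, though starting from the host the links actually use, as done above, is the simpler fix.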
 
  

