How do I download web pages from a website using the wget command?
I want to store some pages of yahoo.com on my system and view them offline: the home page, the JPEG files on the home page, and the pages it links to, down to a depth of 5 levels.
I tried the following:
[amit@localhost temp]$ wget -l http://www.yahoo.com
wget: reclevel: Invalid specification `http://www.yahoo.com'.
wget: missing URL
Usage: wget [OPTION]... [URL]...
Try `wget --help' for more options.
[amit@localhost temp]$
but it fails with the errors shown above.
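From the error message, it looks like `-l` (`--level`) expects a numeric depth argument, so wget tried to parse the URL as that number. Based on the options listed in `wget --help`, something like the following sketch might do what I describe (the exact flag combination is my assumption, not a tested recipe):

```shell
# -r        : recursive download (follow hyperlinks)
# -l 5      : limit recursion to a depth of 5 levels
# -p        : also fetch page requisites (images such as JPEGs, CSS)
# -k        : convert links in the saved pages so they work offline
wget -r -l 5 -p -k http://www.yahoo.com
```

Is this the right set of options, or am I still missing something?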