wget know-how
Hey, I just downloaded wget 1.8 and it installed alright, but I'm not getting the logic of downloading files with it. Do I have to use it through the console itself? When I ran wget --help, it asked for a URL to type after wget. How do I do that? Doesn't it have any GUI? Is it not visible on the desktop, so that I can simply right-click on a download link (just as I used to do with a download manager when Windows was on my system)? Has anyone got any idea?
|
No, wget doesn't have a GUI (that I know of); it's used more for non-interactive downloads. So if you wanted to send a batch of queries to some online website and download the results, you would write a script that prepared your queries, sent them to the web page with wget, and saved the results somewhere. All this without you having to be there, or even be logged on.
There's tons more information under "man wget" and "locate wget | grep README". gftp is a good package for more straightforward file retrieval. john |
Hey jkobrien, thanks for that info. I'd better refer to man wget. I'll also try gftp.
|
I like wget. It is only usable from the command line; however, I have seen ncurses programs that use wget to grab files, etc.
An example of how to use it: $ wget http://www.ultrasoul.com/~jollyrogers/fireballmail2.mp3 |
wget is superb. You can even add all the URLs of the files to be downloaded to a file, say temp, then just give 'wget -i temp' and it will download everything. It is extremely useful if you are on a LAN, since it will continue downloading in the background even after you have logged out (so it is in no way comparable to gftp). If you are not working on a LAN, then download accelerators like prozilla or prozGUI will be more useful (see my sig). I don't know of any application that combines the two. (Why is nobody coupling them?)
--arun |
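To make the '-i' workflow above concrete, here is a minimal sketch: build a URL list in a file, then hand it to wget. The URLs are placeholders, not real files, and the actual wget call is left commented out so the sketch stands on its own without network access.

```shell
#!/bin/sh
# Put the URLs to fetch (one per line) into a file called "temp".
cat > temp <<'EOF'
http://example.com/file1.iso
http://example.com/file2.iso
EOF

# -b: detach into the background (keeps going after you log out),
# -i: read the URLs from the given file.
# Uncomment to actually download:
# wget -b -i temp
```

With -b, wget writes its progress to wget-log in the current directory, so you can check on it later with tail.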
Also, use the -b option with wget to run it in the background, -c to continue a partially downloaded file, --proxy=on/off to turn proxy use on or off, --directory-prefix=<DIR_NAME> to specify the destination directory, and --http-user=USER_NAME / --http-passwd=PASSWORD to specify the user name and password to log on with. For more, refer to the man page.
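Putting the flags mentioned above together might look like this. The host, user, and password are placeholders; the wget line is commented out so the sketch doesn't need a live server.

```shell
#!/bin/sh
# Make sure the destination directory exists first.
mkdir -p /tmp/wget-demo

# -b: run in the background, -c: resume a partial download,
# --directory-prefix: where the file lands,
# --http-user / --http-passwd: HTTP authentication credentials.
# Uncomment to actually download:
# wget -b -c --directory-prefix=/tmp/wget-demo \
#      --http-user=alice --http-passwd=secret \
#      http://example.com/big-file.iso
```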
|
Try this
If your access is via a proxy:
export http_proxy=<proxy ip or dns name>
Then:
wget --convert-links --proxy-user=youruser --proxy-passwd=yourpwd -r http://yoururl
Hope that helps... |
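The proxy settings above can also be made permanent in your wgetrc file, so you don't have to export http_proxy every session. A sketch, with placeholder values:

```
# ~/.wgetrc -- proxy settings (hostname, port, and credentials are examples)
http_proxy = http://proxy.example.com:8080/
ftp_proxy = http://proxy.example.com:8080/
proxy_user = youruser
proxy_passwd = yourpwd
```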
Hey arun shivanandan, thanks for that info. I'll refer to the man page of wget.
|
According to my man page (version 1.8.2):
Quote:
Just thought I'd help get him/her started. I don't use it much, but it has been useful when I often refer to something on a site. What I'll often do is use --no-parent when I don't want to ascend and get a bunch of stuff I don't need. I'm no master with wget, but it's a cool util. BR, G |
hi
wget is an exceptional command-line tool, mainly for doing ftp downloads. You can even write scripts using wget and add them to crond for automatic downloads. It has the capability to resume the connection if it gets disconnected. I hope this example gives an idea of how to use wget:
wget -c -nH -P /home/anyfolder -a /logs/LOG_FILE_NAME.log ftp://username:password@sitename/FLDR_NAME/*.[td][xo][tc] && echo "FLDR_NAME Successfully Downloaded"
-c will resume the connection (the ftp server should have the resume option enabled).
-nH will cut the remote hostname directory and take the rest. For example, if you try downloading files from ftp://ftp.sitename/download, the directory structure will also be created on the local machine; -nH will not create the folder named "ftp.sitename" and instead uses the -P option, which tells the prefix (destination directory) to be used.
-a will append the file transfer details to the given logfile.
*.[td][xo][tc] will download only 'txt' and 'doc' files, if any. (Note: no commas inside the brackets; a comma in a bracket expression would match a literal comma.)
If you are using a proxy server, add the proxy details for http_proxy and ftp_proxy in the 'wgetrc' file before you use the command. Also refer to the man page for more options. regards, |
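The ftp example above can be wrapped in a small script that cron runs for you. Everything here is a placeholder (host, credentials, folder names); the wget line is commented out so the sketch runs without an ftp server.

```shell
#!/bin/sh
# Nightly mirror sketch: fetch txt/doc files from a remote folder and log it.
LOG=/tmp/wget-demo-ftp.log
DEST=/tmp/wget-demo-ftp
mkdir -p "$DEST"

# -c: resume, -nH: don't create a hostname directory locally,
# -P: destination directory, -a: append transfer details to the log.
# Uncomment to actually download (placeholder host/credentials):
# wget -c -nH -P "$DEST" -a "$LOG" \
#   "ftp://user:pass@ftp.example.com/FLDR_NAME/*.[td][xo][tc]" \
#   && echo "FLDR_NAME successfully downloaded" >> "$LOG"

# Record each invocation so cron runs are visible in the log.
echo "run at $(date)" >> "$LOG"
```

To schedule it, add a line with crontab -e, e.g. 0 2 * * * /path/to/ftpget.sh to run it at 2am daily.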
Hey, that's a good idea actually. [Sorry, this is completely random] Anyone fancy writing a GUI for wget (mainly for batch downloads, for script newbies)?
|
If someone is going to work on wget, try applying the download accelerator concept to it (like prozilla). That is, the look should be something like that of gftp, and it should have all the properties of the present wget (like continuing downloads even after the user has logged out) plus those of a download accelerator. How nice would that be?
|
Great idea, if only I'd learned to script...
|