LinuxQuestions.org
Old 10-17-2003, 09:32 AM   #1
ksd
Member
 
Registered: Sep 2003
Location: lost in Eastern Kansas,USA
Distribution: FC3,Slackware ,ubuntu
Posts: 130

Rep: Reputation: 15
wget know-how


Hey, I just downloaded wget 1.8 and it installed fine, but I'm not getting the logic of downloading files with it. Do I have to use it through the console itself? When I ran wget --help, it asks for a URL to type after wget. How do I do that? Doesn't it have any GUI? Isn't there something visible on the desktop so that I can simply right-click on a download link (just as I used to do with a download manager when Windows was on my system)? Has anyone got any idea?
 
Old 10-17-2003, 09:42 AM   #2
jkobrien
Member
 
Registered: Jun 2003
Location: Dublin, Ireland
Distribution: Slackware, LFS, Ubuntu, RedHat, Slamd64
Posts: 507

Rep: Reputation: 30
No, wget doesn't have a GUI (that I know of); it's meant more for non-interactive downloads. So if you wanted to send a batch of queries to some website and download the results, you would write a script that prepared your queries, sent them to the web page with wget, and saved the results somewhere, all without you having to be there, or even logged on.
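For instance, here is a minimal sketch of that kind of batch job (the query URLs, file names, and directories are made up, so adapt them to the site you're hitting):

#!/bin/sh
# prepare the queries (two hypothetical search URLs)
cat > /tmp/queries.txt <<'EOF'
http://www.example.com/search?q=slackware
http://www.example.com/search?q=wget
EOF
# fetch them all non-interactively, saving the pages and a log under ~/results
mkdir -p "$HOME/results"
wget -i /tmp/queries.txt -P "$HOME/results" -o "$HOME/results/wget.log"

Run that from cron or an at job and the results are waiting for you next time you log in.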

There's tons more information under "man wget" and "locate wget | grep README".

gftp is a good package for more straightforward file retrieval.

john
 
Old 10-17-2003, 10:13 AM   #3
ksd
Member
 
Registered: Sep 2003
Location: lost in Eastern Kansas,USA
Distribution: FC3,Slackware ,ubuntu
Posts: 130

Original Poster
Rep: Reputation: 15
Hey jkobrien, thanks for that info. I'd better read man wget. I'll also try gftp.
 
Old 10-17-2003, 01:50 PM   #4
JollyRogers
Member
 
Registered: Sep 2003
Location: Va USA
Distribution: Slackware
Posts: 81

Rep: Reputation: 16
I like wget. It is only usable from the command line; however, I have seen ncurses programs that use wget to grab files, etc.

An example of how to use it:

$ wget http://www.ultrasoul.com/~jollyrogers/fireballmail2.mp3
 
Old 10-17-2003, 02:03 PM   #5
arunshivanandan
Member
 
Registered: May 2003
Location: Kerala,India
Distribution: RedHat,Mandrake,Debian
Posts: 643

Rep: Reputation: 30
wget is superb. You can even add all the URLs of the files to be downloaded to a file, say temp, and just give 'wget -i temp' and it will download everything. It is extremely useful if you are on a LAN, since it will continue downloading in the background even after you have logged out (so it is in no way comparable to gftp). If you are not on a LAN, then download accelerators like prozilla or prozGUI will be more useful (see my sig). I don't know of any application that combines the two. (Why has nobody coupled them?)
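A quick sketch of that (the URLs here are made up): put the links in a file and point -i at it, and add -b, mentioned in the next post, if you want wget to detach and keep going after you log out.

echo http://www.example.com/iso/slackware-9.1-d1.iso >> temp
echo http://www.example.com/iso/slackware-9.1-d2.iso >> temp
wget -b -i temp    # detaches; progress is written to wget-log in the current directory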
--arun

Last edited by arunshivanandan; 10-17-2003 at 02:30 PM.
 
Old 10-17-2003, 02:18 PM   #6
arunshivanandan
Member
 
Registered: May 2003
Location: Kerala,India
Distribution: RedHat,Mandrake,Debian
Posts: 643

Rep: Reputation: 30
Also, use the -b option with wget to run it in the background, -c to continue a partially downloaded file, --proxy=on/off to turn proxy use on or off, --directory-prefix=<DIR_NAME> to specify the destination directory, and --http-user=USER_NAME / --http-passwd=password to specify the user name and password to log on with. For more, refer to the man page.
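For instance, a sketch putting several of those together (the directory, user name, password, and URL below are placeholders):

wget -b -c --directory-prefix=/home/you/downloads \
     --http-user=yourname --http-passwd=yourpass \
     http://www.example.com/private/bigfile.tar.gz

-b drops it into the background (the log goes to wget-log), and -c resumes the transfer if a partial bigfile.tar.gz is already sitting in the destination directory.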
 
Old 10-17-2003, 02:45 PM   #7
gwp
Newbie
 
Registered: Oct 2003
Location: South Africa
Distribution: Redhat, Fedora, Ubuntu
Posts: 27

Rep: Reputation: 15
Try this

If your access is via a proxy:

export http_proxy=http://<proxy ip or dns name>:<port>/

Then

wget --convert-links --proxy-user=youruser --proxy-passwd=yourpwd -r http://yoururl

Hope that helps...
 
Old 10-17-2003, 04:52 PM   #8
fye
LQ Newbie
 
Registered: Sep 2003
Distribution: Slackware 9.1
Posts: 13

Rep: Reputation: 0
Quote:
Originally posted by arunshivanandan
--http-user=USER_NAME, --http-passwd=password
wget is great and I use it quite often, but I'd still prefer it to ask me for the password if I just entered the username. That way I wouldn't have to put my password on the command line, which is not very secure, IMO. Interestingly, I looked at the man page, but there was no mention of such a usage. Maybe I missed something, because it would be insane to make passing the password as an argument the only way to send a password.
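(One workaround, as far as I can tell, is to keep the credentials in ~/.wgetrc instead, so at least they don't show up in the process list or your shell history; the settings below are the wgetrc commands for this, with made-up values:)

# ~/.wgetrc -- keep it private: chmod 600 ~/.wgetrc
http_user = yourname
http_passwd = yoursecret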
 
Old 10-17-2003, 11:58 PM   #9
ksd
Member
 
Registered: Sep 2003
Location: lost in Eastern Kansas,USA
Distribution: FC3,Slackware ,ubuntu
Posts: 130

Original Poster
Rep: Reputation: 15
Hey arunshivanandan, thanks for that info. I'll refer to the man page of wget.
 
Old 10-18-2003, 02:48 AM   #10
arunshivanandan
Member
 
Registered: May 2003
Location: Kerala,India
Distribution: RedHat,Mandrake,Debian
Posts: 643

Rep: Reputation: 30
Quote:
Originally posted by fye
, I looked at the man page, but there was no mention of such a usage. Maybe I missed something, because it is insane to make passing the password as the argument the only way to send a passwd.
Quoting from the wget manual page:
Quote:
--http-user=user
--http-passwd=password
    Specify the username user and password password on an HTTP server. According to the type of the challenge, Wget will encode them using either the `basic' (insecure) or the `digest' authentication scheme. Another way to specify username and password is in the URL itself.
Is your manpage different from mine?? (Mine is wget 1.7.)
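(For the record, the in-URL form it mentions looks like this, with made-up credentials; it has the same shell-history and process-list caveat as the command-line options:)

wget http://yourname:yoursecret@www.example.com/private/file.tar.gz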
 
Old 10-18-2003, 05:01 AM   #11
gwp
Newbie
 
Registered: Oct 2003
Location: South Africa
Distribution: Redhat, Fedora, Ubuntu
Posts: 27

Rep: Reputation: 15
According to my man page (version 1.8.2):

Quote:
--proxy-user=user
--proxy-passwd=password
    Specify the username user and password password for authentication on a proxy server. Wget will encode them using the "basic" authentication scheme.

    Security considerations similar to those with --http-passwd pertain here as well.
Although I do concur that passwords on the command line are a big no-no.

Just thought I'd help get him/her started.

I don't use it much, but it has been useful when I need to keep referring to something on a site.

What I'll often do is use --no-parent when I don't want to ascend into the parent directory and grab a bunch of stuff I don't need; see the example below.
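For example (the site and path are made up), this grabs everything under manual/ without climbing back up into docs/ or the rest of the site:

wget -r --no-parent http://www.example.com/docs/manual/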

I'm no master with wget, but it's a cool util.

BR,

G
 
Old 10-18-2003, 08:13 AM   #12
senthilkumaran
LQ Newbie
 
Registered: Oct 2003
Location: Bangalore
Distribution: Slackware
Posts: 3

Rep: Reputation: 0
Hi,
wget is an exceptional command-line tool, mainly for doing FTP downloads. You can even write scripts using wget and add them to crond for automatic downloads. It has the capability to resume the connection if it gets disconnected.
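For instance, a rough sketch of the cron idea (the paths, URL list, and schedule below are made up); add a line like this with 'crontab -e':

# fetch everything listed in /home/you/urls.txt every night at 2:30 am
30 2 * * * wget -c -i /home/you/urls.txt -P /home/you/downloads -a /home/you/wget.log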

I hope the following example gives you an idea of how to use wget.

wget -c -nH -P /home/anyfolder -a /logs/LOG_FILE_NAME.log "ftp://username:password@sitename/FLDR_NAME/*.[td][xo][tc]" && echo "FLDR_NAME successfully downloaded"

-c will resume the connection (the FTP server should have resume support enabled).
-nH will cut the remote host directory name and take the rest.
For example, if you download files from ftp://ftp.sitename/download, the directory structure would normally be recreated on the local machine too; -nH stops it from creating a folder named "ftp.sitename", and the -P option tells it which prefix (destination directory) to use instead.

-a will append the file transfer details to the given logfile.

*.[td][xo][tc] will download only 'txt' and 'doc' files, if any (strictly speaking the pattern can also match other combinations such as .dot, so tighten it if that matters).

If you are using a proxy server,
add the http_proxy and ftp_proxy details to the 'wgetrc' file before you run the command.
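For example (the proxy host and port are hypothetical), in /etc/wgetrc or ~/.wgetrc:

http_proxy = http://proxy.example.com:3128/
ftp_proxy = http://proxy.example.com:3128/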

Also refer to the man page for more options.
Regards,
 
Old 10-18-2003, 02:14 PM   #13
andrewlkho
Member
 
Registered: Jul 2003
Location: London
Posts: 548

Rep: Reputation: 30
Hey, that's a good idea actually [sorry, this is completely random] - anyone fancy writing a GUI for wget (mainly for batch downloads, for script newbies)?
 
Old 10-18-2003, 02:43 PM   #14
arunshivanandan
Member
 
Registered: May 2003
Location: Kerala,India
Distribution: RedHat,Mandrake,Debian
Posts: 643

Rep: Reputation: 30
If someone is going to work on wget, try applying the concept of a download accelerator to wget (like prozilla). That is, the look should be something like that of gftp, and it should have all the properties of the present wget (like continuing to download even when the user has logged out) plus those of a download accelerator. How nice would that be?
 
Old 10-18-2003, 07:34 PM   #15
LinFreak!
Member
 
Registered: Jul 2003
Location: England
Distribution: slack9.1
Posts: 209

Rep: Reputation: 30
Great idea, if only I had learned to script.....
 
  

