Linux - Software: This forum is for software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
01-13-2009, 03:46 PM
#1
LQ Newbie
Registered: Oct 2008
Distribution: Mint
Posts: 22
Scripting: How to put wget in foreground?
Hi there.
I am using wget in a script to download files from SourceForge (over http).
It seems to start in the background, so the script just continues, which makes it fail.
How do I get wget into the foreground? Or is there another solution?
Somehow, downloads from an ftp server do work.
THX
01-13-2009, 03:58 PM
#2
Moderator
Registered: Jun 2001
Location: UK
Distribution: Gentoo, RHEL, Fedora, Centos
Posts: 43,417
Nothing will fork unless you tell it to... Maybe if you actually showed us your script you'd be able to get more help. With URLs from sf.net and the like, are there perhaps some &'s in the URL that you're not handling properly?
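To see why an unquoted & matters, here is a small illustration (the URL here is a made-up example, not the actual SourceForge link):

```shell
#!/bin/sh
# An unquoted & terminates the command and runs it in the background;
# the shell then tries to execute whatever follows as a new command.
# So an unquoted
#   wget http://host/file.tar.gz?modtime=123&big_mirror=0
# is parsed as two things:
#   wget http://host/file.tar.gz?modtime=123 &   (backgrounded, URL truncated)
#   big_mirror=0                                 (a separate, useless assignment)
# Quoting keeps the full URL intact as a single argument:
url='http://host/file.tar.gz?modtime=123&big_mirror=0'
printf '%s\n' "$url"
```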
01-13-2009, 04:00 PM
#3
Member
Registered: May 2006
Location: BE
Distribution: Debian/Gentoo
Posts: 412
Hi,
Try using the wait command. It should do the job.
I found the following link with an example of how to use it: link
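For reference, a minimal sketch of how wait is typically used (this assumes you deliberately started something in the background; sleep stands in for the real job):

```shell
#!/bin/sh
# Start a job in the background, then block until it finishes.
sleep 1 &      # & puts the job in the background
pid=$!         # $! holds the PID of the most recent background job
wait "$pid"    # block until that job exits; wait returns its status
status=$?
echo "background job exited with status $status"
```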
01-13-2009, 04:36 PM
#4
LQ Newbie
Registered: Oct 2008
Distribution: Mint
Posts: 22
Original Poster
Hi.
This is the function I use in the script.
Code:
download () {
    header "Download data"
    cd $SOXSRCDIR
    wget -nd -v http://downloads.sourceforge.net/sox/sox-14.2.0.tar.gz?modtime=1226179484&big_mirror=0 >>$LOG
    tar -xvvf $SOXPACK >>$LOG
}
I also set background = off in /etc/wgetrc - no change.
I'll check out the "wait" hint, even though I don't understand why it is not working.
THX so far.
Cheers
01-13-2009, 04:50 PM
#5
LQ Newbie
Registered: Oct 2008
Distribution: Mint
Posts: 22
Original Poster
"wait" did the trick.
THX a lot.
01-13-2009, 04:52 PM
#6
Moderator
Registered: Jun 2001
Location: UK
Distribution: Gentoo, RHEL, Fedora, Centos
Posts: 43,417
Well, wait isn't the right way to solve this at all... As I guessed, the & in the URL is causing the fork: the shell backgrounds everything before the &, and everything after it is treated as a new command, which fails with an error before the script moves on. If the URL still works with the & and everything past it removed, then just remove them.
01-13-2009, 05:35 PM
#7
Member
Registered: May 2006
Location: BE
Distribution: Debian/Gentoo
Posts: 412
... or put a single quote on either side of the URL.
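Putting both suggestions together, the function from post #4 would become something like the sketch below (header, SOXSRCDIR, SOXPACK and LOG are stand-ins here for the definitions that live elsewhere in the original script):

```shell
#!/bin/sh
# Stand-ins for definitions found elsewhere in the original script.
header () { printf '== %s ==\n' "$1"; }
SOXSRCDIR=.
SOXPACK=sox-14.2.0.tar.gz
LOG=build.log

download () {
    header "Download data"
    cd "$SOXSRCDIR" || return 1
    # Single quotes keep the & inside the URL, so the shell hands the
    # whole string to wget instead of forking at the &.
    wget -nd -v 'http://downloads.sourceforge.net/sox/sox-14.2.0.tar.gz?modtime=1226179484&big_mirror=0' >>"$LOG" 2>&1
    tar -xvvf "$SOXPACK" >>"$LOG" 2>&1
}
```

Quoting the other expansions ($SOXSRCDIR, $LOG, $SOXPACK) is good practice too, in case any of those paths ever contain spaces.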
01-13-2009, 05:53 PM
#8
LQ Newbie
Registered: Oct 2008
Distribution: Mint
Posts: 22
Original Poster
THX a lot, folks. A real beginner problem.
Looking at it now, it seems more than obvious.
Cheers