12-20-2007, 06:09 PM   #1
xmrkite
Member
How do I wget a URL with spaces?


Hello, I am trying to do two things: read URLs from a file, then wget (download) them all.

Here is my code:
Code:
url_file=/home/xmrkite/filestodownload.txt

cat $url_file | while read url
do
   # grab the PID of any running feh instance (unused in this snippet)
   VAR2="`ps | grep feh | nawk '{ print $1}'`"
   wget -O /home/xmrkite/$url http://www.website.com/ftp/$url
done
The problem is that when the filenames on the website have a space in them, wget gives me a 400 Bad Request error. How can I fix this?

--Thanks


PS. I also want it to download the files listed in the txt file in random order instead of going line by line... but since I won't know beforehand how long the file will be, how can I do that? The file has one filename per line, and all the files are in the same directory on the webserver.
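A sketch of one approach, assuming GNU shuf from coreutils is available: shuffle the list before looping over it, since shuf never needs to know the line count in advance:

Code:
url_file=/home/xmrkite/filestodownload.txt

# shuf emits the lines of the file in random order
shuf "$url_file" | while read url
do
   wget -O "/home/xmrkite/$url" "http://www.website.com/ftp/$url"
done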
 
12-20-2007, 06:12 PM   #2
gilead
Senior Member
Have you tried wrapping the URL strings in quotes?
Code:
wget -O "/home/xmrkite/$url" "http://www.website.com/ftp/$url"
 
12-20-2007, 06:28 PM   #3
xmrkite
Original Poster
I tried that, but it doesn't work. I'm really stumped.

I had tried using ', `, and " in various arrangements, but to no avail. Any other ideas?

I know the URL is good because I can copy it into Firefox and get the file. So something else must be going wrong.
 
12-20-2007, 06:48 PM   #4
gilead
Senior Member
The quotes work for me - did you get an error message when you tried it? Failing that, you may be able to convert the space characters to %20 (I haven't tried that here).
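Something like this might do the conversion (untested here, as noted):

Code:
# replace every space in the filename with its percent-encoded form
escaped=`echo "$url" | sed 's/ /%20/g'`
wget -O "/home/xmrkite/$url" "http://www.website.com/ftp/$escaped"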
 
12-20-2007, 07:00 PM   #5
chrism01
Guru
Try escaping the spaces with the \ character first.
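For example, on the command line:

Code:
wget http://www.website.com/ftp/file\ with\ spaces.txt

Backslash-escaping and double quotes are equivalent here: either way the shell hands wget a single argument containing the literal spaces.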
 
12-20-2007, 07:17 PM   #6
xmrkite
Original Poster
OK, it worked fine on my Ubuntu computer, but on my DSL Linux computer it didn't work. Why the heck would this be? I just don't get it. Any other ideas?

I tried just a simple

Code:
wget "http://www.website.com/file with spaces.txt"

and on DSL Linux it didn't work.

-Thanks
 
12-20-2007, 07:51 PM   #7
Brian1
Guru
You might run this command on each machine to get the version number:

Code:
wget --version

Now check the developer's site and see if there are any known issues with the version that does not work right, or maybe install the latest wget.

Brian
 
12-21-2007, 01:15 PM   #8
xmrkite
Original Poster
OK, I installed the gnu-utils package from the MyDSL extension loader, and that solved my problem. They probably just put in a trimmed-down version of the program to save space. DSL Linux is very small, after all.
 
  

