LinuxQuestions.org > Forums > Non-*NIX Forums > Programming
Old 03-11-2009, 03:23 AM   #1
vinayakm
LQ Newbie
 
Registered: Mar 2009
Posts: 4

Rep: Reputation: 0
Simplify this shell script


Hi Friends

I have written the shell script below to download files from the URLs listed in the file ur.txt and, after each download completes, move the file to a specified directory. Is there a shorter way to do this than the roundabout method I have used in the script?

One more problem I am facing: when I run this script through an SSH session using PuTTY and then close the session, the file currently being downloaded finishes, but no further files are downloaded after that. Can you please help me out?


Code:
i=0
b=0
c=""
while [ `find ur.txt -size +0` ]
do
    url=`head -n1 ur.txt`
    wget -c --http-user=xxx --http-password=abc "$url"
    for ((i = 1; i < ${#url}; i++))
    do
        c=${url:$i:1}
        if [ "$c" = "/" ]; then
            b=$i
        fi
    done
    b=$((b + 1))
    c=${url:$b}
    mv -v "$c" t
    sed -i 1d ur.txt
done
Thanks
Vinayakm
 
Old 03-11-2009, 04:19 AM   #2
eco
Member
 
Registered: May 2006
Location: BE
Distribution: Debian/Gentoo
Posts: 412

Rep: Reputation: 48
To avoid killing the download when the session closes, use screen. It lets you disconnect from and reconnect to a session at will, along with a few other nifty things.
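A typical screen workflow for this case looks like the following (a sketch; `download.sh` is a placeholder name for the script above):

```shell
screen -S downloads     # start a named screen session
./download.sh           # run the download script inside it
# press Ctrl-A then d to detach; the script keeps running on the server

# later, from a new PuTTY session:
screen -r downloads     # reattach to the running session
```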

As for the script, you could start by using meaningful variable names.

Last edited by eco; 03-11-2009 at 04:21 AM.
 
Old 03-11-2009, 04:29 AM   #3
weibullguy
ReliaFree Maintainer
 
Registered: Aug 2004
Location: Kalamazoo, Michigan
Distribution: Slackware-current, Cross Linux from Scratch, Gentoo
Posts: 2,705
Blog Entries: 1

Rep: Reputation: 220
Well, you could add the following to the wget command to eliminate the mv -v statement:
Code:
--directory-prefix=<DIRECTORY_TO_STORE_TARBALLS>
Instead of parsing the URL from the file, you could add the following to the wget command
Code:
-i ../ur.txt
If you do that, your script could be one line: the wget command.

And I agree with eco, your variables are meaningless.

Last edited by weibullguy; 03-11-2009 at 04:31 AM.
 
Old 03-12-2009, 02:57 AM   #4
vinayakm
LQ Newbie
 
Registered: Mar 2009
Posts: 4

Original Poster
Rep: Reputation: 0
Hi Friends

Thanks for the replies. @eco: suggestion taken regarding the variable names.

The reason I parse the URL from a file is that once a file has been downloaded completely, its URL is removed from the list, so at any point in time I have an up-to-date list of what remains to be downloaded. The reason I am not using --directory-prefix is that I sometimes terminate a download midway, and the partially downloaded file will not work. What I want is for each file to be moved to another directory only once it has downloaded completely, so that I can copy it to another machine from there.

What I would like to shorten in this script is the filename extraction: is there an easier way to do it than the for loop?
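On the filename question: shell parameter expansion (or `basename`) does in one step what the character-by-character loop does. A quick sketch with a made-up URL:

```shell
#!/bin/sh
# Hypothetical URL, just to demonstrate the expansion.
url="http://server.example/dir/subdir/file.tar.gz"

file=${url##*/}     # strip everything up to and including the last '/'
echo "$file"        # -> file.tar.gz

# basename does the same job:
basename "$url"     # -> file.tar.gz
```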

Thanks and Regards
Vinayakm
 
Old 03-12-2009, 05:46 PM   #5
weibullguy
ReliaFree Maintainer
 
Registered: Aug 2004
Location: Kalamazoo, Michigan
Distribution: Slackware-current, Cross Linux from Scratch, Gentoo
Posts: 2,705
Blog Entries: 1

Rep: Reputation: 220
You could use the --timestamping (-N) option with wget to download a file from your list only if it is newer on the server, and the --continue option to resume interrupted downloads. Using those options, you could do something like this
Code:
#!/bin/sh

# What directory is the final storage location 
# for the downloaded files?
permdir="/downloads/test"

# Get 'em, baby.
wget -i ~/downloads/url.txt --continue --timestamping --directory-prefix="$permdir"

exit 0
If you want to use looping, you could do something like this.
Code:
#!/bin/sh

# What directory is the final storage location 
# for the downloaded files?
permdir="/downloads/test"

# How many lines are in the file of URLs?
n=`awk '//{n++}; END {print n+0}' url.txt`

# Initialize the count.
i=0

# Do it, baby.
while [ $i -lt $n ]
do

	# Get the first line (i.e., the first file to download),
	# then download it using wget.
	url=`head -n1 url.txt`
	wget -c --http-user=xxx --http-password=abc "$url"

	# Find the name of the file that was just downloaded.
	file=`basename "$url"`

	# Move the recently downloaded file to another directory.
	mv -v "$file" "$permdir"

	# Remove the line for the file just downloaded.
	sed -i 1d url.txt

	# Increment the count.
	i=$((i + 1))

done
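The head/sed rewriting of url.txt can also be avoided entirely with a `while read` loop over the file. A sketch of the loop skeleton, with the thread's wget/mv commands left commented out so it runs standalone (the two URLs are made up):

```shell
#!/bin/sh
# Build a sample URL list (hypothetical URLs).
printf '%s\n' \
    'http://server.example/a/one.tar.gz' \
    'http://server.example/b/two.tar.gz' > ur.txt

while IFS= read -r url; do
    [ -n "$url" ] || continue      # skip blank lines
    file=${url##*/}                # filename, no parsing loop needed
    # wget -c --http-user=xxx --http-password=abc "$url"
    # mv -v "$file" "$permdir"
    echo "$file"
done < ur.txt
```

Note that this version leaves ur.txt intact, so it gives up the "file always lists what remains" property; a `sed -i 1d ur.txt` inside the loop would restore it.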
 
Old 03-12-2009, 11:24 PM   #6
vinayakm
LQ Newbie
 
Registered: Mar 2009
Posts: 4

Original Poster
Rep: Reputation: 0
Hi Friends

Thanks a lot for the help

Thanks and Regards
Vinayakm
 
  

