I hope someone may be able to help me resolve a problem I have been having.
I recently set up an automated shell script (bash on Ubuntu 8.10) to download new files from a server using ftp. Unfortunately, the other end of the link is not terribly stable (and there is nothing I can do about this), so the script sometimes hangs and is then kicked off again at the next time set in the crontab.
This has resulted in multiple hung sessions taking up all the system resources.
The offending section of code is given below.
for file in $filelist; do
    if [ -f "$file" ]; then
        echo "$file already exists: skipping"
    else
        echo "opening connection"
        # download to a hidden temp name, then rename once the transfer completes
        /usr/bin/ftp -ni $R_FTP <<-EOS 1>>tfr.log 2>>tfr.err
quote USER $USR
quote PASS $PSWD
get $file .$file
bye
EOS
        echo "$file retrieved"
        mv ".$file" "$file"
    fi
done
I'd like to know if there is a way I can force an exit if the connection hangs, or alternatively whether something like Perl's Net::FTP module can handle these sorts of errors internally.
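The sort of thing I had been toying with is a bash-level watchdog that kills the ftp process if it runs for too long, roughly along the lines of the sketch below. The 300-second limit is just a figure I picked for illustration, and I haven't tested this; I gather newer coreutils ship a timeout utility that would do the same job more neatly, but I don't believe 8.10 has it.

/usr/bin/ftp -ni $R_FTP <<EOS 1>>tfr.log 2>>tfr.err &
quote USER $USR
quote PASS $PSWD
get $file .$file
bye
EOS
ftp_pid=$!                                   # PID of the backgrounded ftp client
( sleep 300; kill $ftp_pid 2>/dev/null ) &   # watchdog: kill ftp after 300s if still running
watchdog_pid=$!
wait $ftp_pid                                # block until ftp exits (or is killed)
kill $watchdog_pid 2>/dev/null               # transfer finished; cancel the watchdog

I'm not sure this is robust enough (for one thing, the orphaned sleep keeps running out its 300 seconds after a successful transfer), which is part of why I'm asking.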
I really appreciate any advice!