LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   -   Automated Download Problem - shell ftp vs Perl (https://www.linuxquestions.org/questions/linux-software-2/automated-download-problem-shell-ftp-vs-perl-729836/)

emmalg 06-01-2009 07:42 AM

Automated Download Problem - shell ftp vs Perl
 
Hi All

I hope someone may be able to help me resolve a problem I have been having.

I recently set up an automated shell script (bash on Ubuntu 8.10) to download new files from a server using ftp. Unfortunately the other end of the link is not terribly stable (and there is nothing I can do about this) which has resulted in the script hanging sometimes and then being kicked off again at the time set in the crontab.

This has resulted in multiple hung sessions taking up all the system resources.

The offending section of code is given below.

Code:

for file in $filelist
do
  if [ -f "$file" ]
  then
    echo "$file already exists: skipping"
  else
    echo "opening connection"
    # download to a hidden temp name; rename only after the transfer completes
    /usr/bin/ftp -ni "$R_FTP" <<EOS 1>>tfr.log 2>>tfr.err
quote USER $USR
quote PASS $PSWD
binary
get $file .$file
close
bye
EOS
    echo "$file retrieved"
    mv ".$file" "$file"
  fi
done

I'd like to know if there is a way I can force an exit if the connection hangs, or alternatively whether something like Perl's Net::FTP can handle these sorts of errors internally.

I really appreciate any advice! :)

stevexyz 06-01-2009 08:14 AM

Net::FTP does have a read and write timeout, but you might like to try wget, which seems to do everything!
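
Roughly something like this, for instance (just a sketch reusing the $R_FTP, $USR, $PSWD and $filelist variables from your script; double-check the option names against your wget version):

Code:

for file in $filelist
do
  if [ -f "$file" ]
  then
    echo "$file already exists: skipping"
  else
    # wget gives up on its own instead of hanging: --timeout caps each
    # network operation and --tries limits how many times it retries
    wget --timeout=60 --tries=3 \
         --ftp-user="$USR" --ftp-password="$PSWD" \
         -O ".$file" "ftp://$R_FTP/$file" \
         1>>tfr.log 2>>tfr.err && mv ".$file" "$file"
  fi
done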

Steve

emmalg 06-01-2009 09:40 AM

Cheers Steve! Funnily enough, I've actually spent most of my time since posting this working on an alternative using wget. I'm glad to know (fingers crossed) that it should solve the timeout issues!

Emma

H_TeXMeX_H 06-01-2009 12:34 PM

I think lftp has a timeout setting as well.
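
For example, something along these lines might work as a drop-in for the ftp heredoc (only a sketch, reusing the same $USR, $PSWD, $R_FTP and $file variables; check man lftp for the exact setting names in your version):

Code:

# net:timeout and net:max-retries stop lftp from hanging on a dead link
lftp -c "
set net:timeout 60
set net:max-retries 3
open -u $USR,$PSWD $R_FTP
get $file -o .$file
bye
" 1>>tfr.log 2>>tfr.err && mv ".$file" "$file"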
