Automated Download Problem - shell ftp vs Perl
Hi All
I hope someone can help me resolve a problem I've been having. I recently set up an automated bash script (on Ubuntu 8.10) to download new files from a server via ftp. Unfortunately, the other end of the link is not terribly stable (and there is nothing I can do about that), so the script sometimes hangs and is then kicked off again at the time set in the crontab. This has resulted in multiple hung sessions taking up all the system resources. The offending section of code begins:
Code:
for file in $(echo $filelist)
I really appreciate any advice! :)
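The pile-up of hung sessions can be guarded against even before switching tools: wrap each transfer in coreutils `timeout(1)` and take a lock so overlapping cron runs refuse to start. A minimal sketch, assuming a hypothetical `get_file` helper that performs one ftp transfer and a `$filelist` variable holding the filenames (the lock path is also a placeholder):

```shell
#!/bin/bash
# Minimal sketch: get_file, $filelist and the lock path are placeholders.

LOCKDIR=/tmp/ftp_fetch.lock.d      # hypothetical lock path

get_file() {
    # Stand-in for the real ftp transfer of "$1".
    :
}

# mkdir is atomic: if a previous cron run is still alive, bail out
# instead of piling up another hung session.
if ! mkdir "$LOCKDIR" 2>/dev/null; then
    echo "previous run still active, exiting" >&2
    exit 1
fi
trap 'rmdir "$LOCKDIR"' EXIT

for file in $filelist; do
    # timeout(1) kills the transfer if it runs longer than 10 minutes;
    # exit status 124 means the time limit was hit.
    if ! timeout 600 get_file "$file"; then
        echo "transfer of $file failed or timed out" >&2
    fi
done
```

The lock-directory trick works because `mkdir` either creates the directory or fails in one atomic step, so two cron invocations can never both believe they hold the lock.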
Net::FTP does have a read and write timeout, but you might like to try wget, which seems to do everything!
Steve
Cheers, Steve! Funnily enough, I've actually spent most of my time since posting this working on an alternative using wget. I'm glad to know (fingers crossed) that it should solve the timeout issues!
Emma
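For anyone finding this later, the wget route can be sketched roughly as below; the host, remote path, and the 30-second timeout are placeholders to adjust:

```shell
#!/bin/bash
# Hedged sketch of a wget-based fetch; ftp.example.com and the path
# are placeholders.
fetch_new() {
    # --timeout sets the DNS, connect and read timeouts in one go;
    # --tries caps the retries; -N (timestamping) downloads a file
    # only if the remote copy is newer than the local one, so repeated
    # cron runs pick up just the new files.
    wget --timeout=30 --tries=3 -nv -N \
         "ftp://ftp.example.com/pub/incoming/*.dat"
}
# Call fetch_new from the cron job in place of the ftp loop.
```

Because stalled reads now hit `--timeout` and give up after `--tries` attempts, a flaky link makes the run fail cleanly instead of hanging forever.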
I think lftp has a timeout setting as well.
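For completeness, lftp's timeouts live in its settings (`net:timeout` caps each network operation, `net:max-retries` caps retries). A hedged one-shot invocation, with the host and remote path as placeholders:

```shell
#!/bin/bash
# Hedged sketch; ftp.example.com and /pub/incoming are placeholders.
fetch_with_lftp() {
    lftp -e "set net:timeout 30; set net:max-retries 2; \
             mget '/pub/incoming/*'; quit" ftp.example.com
}
# Invoke fetch_with_lftp from cron; shown here but not executed.
```

With `net:max-retries` set, lftp stops retrying a dead link instead of looping indefinitely, which addresses the original hung-session problem.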