LinuxQuestions.org
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!

Old 05-09-2011, 02:35 PM   #1
zaeem
Member
 
Registered: Jan 2010
Posts: 118

Rep: Reputation: 15
Shell script to FTP files


Dear All,

I need to write a shell script that can read the contents of a folder and upload the files to a remote FTP server. I need to make sure that a file already placed on the remote FTP server is not attempted a second time. The file names will be something like Records-2011-05-09, and the files are generated by MySQL every hour. Can anybody please let me know how I can do that?

Any help is appreciated.
 
Old 05-09-2011, 02:44 PM   #2
acid_kewpie
Moderator
 
Registered: Jun 2001
Location: UK
Distribution: Gentoo, RHEL, Fedora, Centos
Posts: 43,417

Rep: Reputation: 1974
If we are free to reinterpret what your requirements appear to be, you would probably want to look at something like rsync to keep two directories up to date. This works very well over ssh, and is probably a better option than using ftp if that's viable.
 
Old 05-09-2011, 02:54 PM   #3
zaeem
Member
 
Registered: Jan 2010
Posts: 118

Original Poster
Rep: Reputation: 15
Dear acid_kewpie,

Thanks for your reply. If I use rsync and delete some data in one directory, won't the same data be removed from the other directory as well? I'd prefer to use FTP, since I only have FTP access to the server where I need to upload the files, and I must make sure that files uploaded in the last attempt are not uploaded again.

Please help me achieve this.
 
Old 05-09-2011, 04:26 PM   #4
wpeckham
Senior Member
 
Registered: Apr 2010
Location: USA
Distribution: Debian, Ubuntu, Fedora, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, Vsido, tinycore, Q4OS
Posts: 1,657

Rep: Reputation: 584
rsync

rsync has interesting options, some of which control which files are transferred (only the ones missing from the target folder, which sounds like your usage) and when or whether to 'prune' files from the target folder that have been deleted from the source.

You need to examine the man page and select the behavior that conforms to your need.

If you were UPDATING files, rsync would have a clear advantage over any other utility. Since you are only transferring files, something like sftp or lftp with the proper parameters MAY serve as well.

The key here is to research and use a utility that you can tune to conform to your need, not to try to use a script to 'reinvent the wheel' (or the GNU)!
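To make the lftp suggestion concrete: its `mirror -R` mode uploads a local tree, and `--only-newer` skips files the remote side already has. The helper below only builds the command line (host, credentials, and paths are all placeholders), so it can be shown without any network access; an actual run would execute the printed command.

```shell
# Build (but do not run) an lftp invocation that uploads only files
# the remote side does not already have. All arguments are placeholders.
build_mirror_cmd() {
  local user=$1 pass=$2 local_dir=$3 remote_dir=$4 host=$5
  printf 'lftp -u %s,%s -e "mirror -R --only-newer %s %s; quit" %s\n' \
    "$user" "$pass" "$local_dir" "$remote_dir" "$host"
}

build_mirror_cmd test test123 /var/exports Records 192.168.0.10
```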
 
Old 05-09-2011, 07:34 PM   #5
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack,CentOS
Posts: 17,945

Rep: Reputation: 3693
Quote:
Originally Posted by zaeem View Post
Dear acid_kewpie,

Thanks for your reply. If I use rsync and delete some data in one directory, won't the same data be removed from the other directory as well? I'd prefer to use FTP, since I only have FTP access to the server where I need to upload the files, and I must make sure that files uploaded in the last attempt are not uploaded again.

Please help me achieve this.
Well, rsync is the way to go, really. But if you need a shell script, then by all means, write one. There are thousands of bash scripting tutorials, and many FTP scripts that are already written, and easily modified. If you need HELP, post what you've written, and where you're stuck. If you're expecting us to write it FOR YOU....no, sorry.

You can start by reading the man pages for expect, ftp and date. You could use scp, instead, and not only be more secure, but (with passwordless login/key exchange), make scripting FAR easier.
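One simple way to guarantee "never upload twice", whatever transfer tool is used, is a ledger file of names already sent. This is a sketch, not the poster's script: the scp line is commented out, and `user@host` and the remote path are hypothetical placeholders.

```shell
# Keep a ledger of file names already sent; skip any name found in it.
mkdir -p /tmp/out
touch /tmp/out/Records-2011-05-09.csv /tmp/out/Records-2011-05-10.csv

LEDGER=/tmp/uploaded.list
touch "$LEDGER"

for f in /tmp/out/*.csv; do
  name=$(basename "$f")
  # -q quiet, -x whole-line match, -F fixed string (no regex surprises)
  grep -qxF "$name" "$LEDGER" && continue   # already sent, skip it
  # scp "$f" user@host:Records/             # real transfer would go here
  echo "$name" >> "$LEDGER"                 # record only after success
done
```

Re-running the loop is then harmless: every name is already in the ledger, so nothing is sent again.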
 
Old 05-11-2011, 10:32 AM   #6
zaeem
Member
 
Registered: Jan 2010
Posts: 118

Original Poster
Rep: Reputation: 15
Dear TB0ne,

I have written the following script, which works fine if I run it as sh script.sh, but when I schedule it through cron it doesn't upload the files to the FTP server, even though /var/log/cron shows the script runs. Please help me resolve this issue.

#!/bin/bash
movedFile=`find . -name '*.csv' -cmin -60`

#echo "Uploading file $i ..."
HOST='192.168.0.10'
USER='test'
PASSWD='test123'

for i in $movedFile; do

/usr/bin/ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
binary
cd Records
put $i
quit
END_SCRIPT
mv $i /root/backup/
done
 
Old 05-11-2011, 11:42 AM   #7
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack,CentOS
Posts: 17,945

Rep: Reputation: 3693
Quote:
Originally Posted by zaeem View Post
Dear TB0ne,
I have written the following script, which works fine if I run it as sh script.sh, but when I schedule it through cron it doesn't upload the files to the FTP server, even though /var/log/cron shows the script runs. Please help me resolve this issue.

Code:
#!/bin/bash
movedFile=`find . -name '*.csv' -cmin -60`

#echo "Uploading file $i ..."
HOST='192.168.0.10'
USER='test'
PASSWD='test123'

for i in $movedFile; do
        
        /usr/bin/ftp -n $HOST <<END_SCRIPT
        quote USER $USER
        quote PASS $PASSWD
        binary
        cd Records
        put $i
        quit
END_SCRIPT
mv $i /root/backup/
done
What does the log SAY?? What error(s) do you get?? What messages?

Debug it...uncomment the echo statement, and you may also want to put a real path into the script, instead of just "." Specify it, so the script knows which directory to start from. If all your .csv files are in /home/some/path/xxxx, then start with /home/some/path instead of ".". Otherwise, when the script starts, it's only going to check the directory where the script IS.
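Along the same lines, a crontab entry that pins the working directory and captures the script's output makes this kind of failure visible (the paths and schedule below are hypothetical, not taken from the thread):

```shell
# Run hourly; cd to the export directory first, because cron does not
# start in the script's directory, then log stdout and stderr.
# m h dom mon dow  command
0 * * * *  cd /home/zaeem/exports && /bin/bash /home/zaeem/upload.sh >> /var/log/upload.log 2>&1
```

With the output logged, the actual ftp error messages show up in /var/log/upload.log instead of vanishing.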
 
Old 05-12-2011, 12:06 PM   #8
wpeckham
Senior Member
 
Registered: Apr 2010
Location: USA
Distribution: Debian, Ubuntu, Fedora, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, Vsido, tinycore, Q4OS
Posts: 1,657

Rep: Reputation: 584Reputation: 584Reputation: 584Reputation: 584Reputation: 584Reputation: 584
script

What does the crontab entry look like?

Have you made the script executable? What are its permissions?
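For reference, checking and setting that permission looks like this (a throwaway path is used as a stand-in for the real script):

```shell
# Create a stand-in script file, mark it executable, and confirm the mode.
touch /tmp/upload.sh
chmod 755 /tmp/upload.sh
test -x /tmp/upload.sh && echo "executable"
```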
 
  


