Hello, I recently made a shell script to connect to a remote server and download files to my computer and/or put files on the remote server. Does anyone know how I can make the script synchronize the two folders automatically?
To be honest, I'd probably just use rsync over SSH to do this. It would be easier and more secure than FTP.
Assuming you want to reinvent the wheel, though, you'd need your script to check the mtime of every file. Anything on the source modified after the corresponding file on the destination would be re-uploaded to keep the two in sync. Files that exist on the destination but not on the source would be deleted, and files that exist on the source but not on the destination would also have to be uploaded. You'd make a list of all the files and their mtimes on both source and destination (possibly painful over plain FTP), compare the two lists to build a list of uploads/deletions, and then have your script process that list.
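For reference, a minimal rsync-over-SSH sketch; the host and paths below are placeholders, not taken from your script:
Code:
# push local changes to the remote, deleting remote files that
# no longer exist locally (the trailing slashes matter)
rsync -avz --delete -e ssh /local/dir/ user@remote.example.com:/remote/dir/

# or pull in the other direction instead:
rsync -avz --delete -e ssh user@remote.example.com:/remote/dir/ /local/dir/
One run per direction; rsync only transfers files that have actually changed, which is exactly the mtime bookkeeping described above, done for you.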
Judging by your reply, I doubt you'll abandon FTP in favour of rsync.
Sticking with FTP, an easy solution would be to use wget on both machines. The downside is that you'd need an FTP daemon running on both machines.
wget can take last-modification times into account (the -N option), recurse through directory trees (-r, -l), continue when it was abruptly halted (-c), etc.
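A sketch of what that could look like; the host, credentials and path are placeholders. Here -m (mirror) is shorthand for -r -N -l inf plus a couple of FTP-listing options:
Code:
# mirror the remote tree, skipping files whose remote
# timestamps show they haven't changed since the last run
wget -m ftp://user:password@remote.example.com/pub/data/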
Instead of synchronizing, how would I go about downloading the files and then comparing the files I downloaded from the remote server to the files I have in a folder on my local drive? If the folders are not equal, it would then write to a file saying so. Any ideas?
You could use a command like "diff" (for text files) or "cmp" (for binary & text files) to compare files.
diff reports the differences directly, whereas cmp's result must be checked via its return code (the $? parameter in Bash, read immediately after the cmp call).
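For instance (the filenames here are made up):
Code:
# -s suppresses output; the exit status is 0 (identical),
# 1 (different) or 2 (trouble, e.g. a file is missing)
cmp -s downloaded/report.txt local/report.txt
if (( $? != 0 )); then
    echo "report.txt differs"
fi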
An alternative, often used in distribution of software packages, is the use of checksums. MD5 checksums are most popular here, I'd say. Check out the tool "md5sum".
Personally, I prefer checksums, especially in cases where the files you need to compare are on different machines. Sharing (or copying) the small checksum files between machines is easier and faster than copying around the (sometimes very big) files themselves.
Some tools limit their checks to, for instance, file size and last-modification dates, but for my applications neither is considered reliable enough.
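A small sketch of that workflow (filenames are made up):
Code:
# on the remote machine: record a checksum per file
md5sum *.tar.gz > checksums.md5

# copy only checksums.md5 to the local machine, then verify it
# from the directory holding the downloaded files; md5sum -c
# prints OK/FAILED per file and exits non-zero on any mismatch
md5sum -c checksums.md5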
I was actually looking into using diff. The files will all be on the same machine: the files from the remote host will be downloaded to a folder on the local machine and then compared to another folder on the local machine. I have my FTP script that connects and downloads the files. Should the script that compares the folders be a separate script? And can somebody get me started? I am fairly new to scripting.
OK, I seem to have the diff script working on its own, and it records the differences to a file. How would I go about combining my FTP script and my diff script? Because when I end the FTP session, it closes out the script entirely.
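One common way around that is to feed the ftp client its commands through a here-document, so that "quit" ends only the FTP session and the rest of the script keeps running. A sketch with made-up host, credentials and paths:
Code:
#!/bin/bash
# -n disables auto-login; the "user" command logs in instead.
# "prompt" toggles off interactive prompting so mget grabs everything.
ftp -n remote.example.com <<'END_FTP'
user myuser mypassword
binary
cd /remote/dir
lcd /tmp/download
prompt
mget *
quit
END_FTP

# control returns here once the ftp session ends
diff -r /tmp/download /home/me/localcopy > /tmp/diff_report.txt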
Now, if I wanted to get this to work using cmp, I would have to make a loop of some sort and compare each file individually, correct? How would I go about comparing the files as they are downloaded to a directory on the local system?
Now, if I wanted to get this to work using cmp, I would have to make a loop of some sort and compare each file individually, correct?
Indeed.
Assuming the files have the same names and the directory structures match, you could use "find dir -type f" to track down all the files (recursively) and then run cmp on each file found.
In other words, something like:
Code:
cd your_download_dir || exit 1
# walk the download dir recursively and compare each file with
# its counterpart under your_local_dir; reading find's output
# line by line copes with spaces inside filenames
find . -type f | while read -r file; do
    if cmp -s "$file" your_local_dir/"$file"; then
        echo "cmp says files $file are identical"
    else
        echo "cmp says files $file are different"
    fi
done
Please pay attention to the paths.
Try the "find" command alone for starters to get an idea of what it returns and how you need to modify this for your local directory.
In the example, you'll need to:
- replace the echos with more useful commands
- replace "your_local_dir" and "your_download_dir" with the paths where the files are stored; both dirs must have the same structure for the above script to work "as is".
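For instance, to get the "write to a file saying so" behaviour you asked about earlier, the echos could become something like this (the log path is just an example):
Code:
if ! cmp -s "$file" your_local_dir/"$file"; then
    echo "$file differs" >> /tmp/differences.txt
fi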