read, write syncing
Hi all,
I have the following scenario:

1. Script A copies a file from an SSHFS share into a local filesystem directory, local_dir.
2. Process B periodically checks for files in local_dir; for each file it finds, it reads the data, inserts it into the DB, and then deletes the file.

The question is: am I going to face synchronization issues? For example, if the SSH connection fails, does the copy fail as a whole, or will only a portion of the file be copied? Or could the reading process reach the end of the file while the copy is still in progress? Is there a way to do the above safely, without any data loss?
Off the top of my head: you could create a hash of the original file on the share, copy the file to local_temp_dir, check that the hashes match (i.e. that it copied correctly), and then move it into local_dir.
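A minimal sketch of that idea in Python (the directory names are placeholders; it assumes local_tmp_dir and local_dir sit on the same local filesystem, so the final rename is atomic and Process B never sees a half-written file):

```python
import hashlib
import os
import shutil

def sha256sum(path, chunk_size=1 << 20):
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def safe_copy(src, local_tmp_dir, local_dir):
    """Copy src into local_tmp_dir, verify the hash, then move the file
    into local_dir with an atomic rename."""
    expected = sha256sum(src)  # hash the original on the share
    tmp_path = os.path.join(local_tmp_dir, os.path.basename(src))
    shutil.copy2(src, tmp_path)  # may fail partway if the SSH link drops
    if sha256sum(tmp_path) != expected:  # detect a truncated/corrupt copy
        os.remove(tmp_path)
        raise IOError("copy of %s was incomplete, retry" % src)
    # rename(2) on the same filesystem is atomic, so the file appears
    # in local_dir fully written or not at all
    os.rename(tmp_path, os.path.join(local_dir, os.path.basename(src)))
```

Note the hash of the source is itself read over SSHFS, so a dropped connection can still make sha256sum fail; the point is that local_dir only ever contains verified, complete files.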
Building on your suggestion:

1. On the SSH server, create a hash file for each data file, e.g. filex.hash, containing the hash value.
2. On the client, I'll add another directory called hashes_dir. I copy the hash file into hashes_dir, then copy the data file into local_dir; when the copy is done I check the hash, and if it matches I delete the hash file from hashes_dir.
3. Process B checks local_dir, and when it finds a file there it looks for its hash file in hashes_dir; if the hash file is gone, the data file can be read safely.

Euhmmm, now time to implement that. Cheers! :-D