Forums > Linux Forums > Linux - General
Old 08-15-2013, 04:49 PM   #1
LQ Newbie
Registered: Apr 2013
Posts: 7

How do I write a script to back up?

On Ubuntu 13.04.

I have a directory, say "mydata". This is where I keep the tree of original sensitive documents, most of them in PDF format.

I have set up a cloud backup service. All I need to do is copy the document tree to its directory and it will get uploaded. I don't want to use my original directory, as the documents there are not encrypted. Unfortunately, the service does not allow me to set up exclusion patterns; everything in the backup directory gets uploaded.

What I would like to do is copy the tree to a destination folder, encrypt all the files, and remove the unencrypted copies.

In general I am more or less able to make this work with a shell script, but I am not sure it is the most efficient way. I can see the following problems:

1. The full tree gets encrypted every day, even if only one file has changed.
2. I don't know whether GPG produces exactly the same output for the same inputs (file, algorithm, password, etc.). If not, I'll be uploading a lot of data unnecessarily and not making use of the sync function of the service.

I know I can use the find command to locate modified files, and md5sum to figure out whether a file has actually changed. But I am not able to put it together so that the directory tree is maintained and only the modified, encrypted files get uploaded.
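For what it's worth, gpg's symmetric output includes a random session key, so the same file encrypts differently each run; checksumming the plaintext, as described above, is the workable change test. A sketch of that find-plus-md5sum approach, run here against a throwaway demo tree (all paths and the passphrase file are hypothetical):

```shell
#!/bin/sh
# Demo of incremental encrypt-and-mirror: only files whose md5sum changed
# since the last run are re-encrypted into the upload tree.
# Everything lives under a throwaway mktemp dir, not a real setup.
WORK=$(mktemp -d)
SRC="$WORK/mydata"            # original, unencrypted tree
DST="$WORK/backup-encrypted"  # tree the cloud client watches
STATE="$WORK/state"           # one checksum file per source file
PASS="$WORK/passphrase"

mkdir -p "$SRC/docs" "$DST" "$STATE"
printf 'secret pdf bytes\n' > "$SRC/docs/tax.pdf"
printf 'demo-passphrase\n'  > "$PASS"

find "$SRC" -type f | while IFS= read -r f; do
    rel=${f#"$SRC"/}                                   # path relative to tree root
    sum=$(md5sum "$f" | cut -d' ' -f1)
    statefile="$STATE/$(printf '%s' "$rel" | tr '/' '_')"
    old=$(cat "$statefile" 2>/dev/null || true)
    if [ "$sum" != "$old" ]; then                      # new or changed file
        mkdir -p "$DST/$(dirname "$rel")"              # preserve directory tree
        gpg --batch --yes --pinentry-mode loopback \
            --passphrase-file "$PASS" \
            --symmetric -o "$DST/$rel.gpg" "$f"
        printf '%s' "$sum" > "$statefile"
    fi
done
```

Re-running it leaves unchanged files alone, so the cloud client only sees the encrypted files that actually changed.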

Any help is greatly appreciated.
Old 08-16-2013, 02:14 AM   #2
dgejonge
Registered: Aug 2010
Location: Netherlands
Distribution: Kubuntu, Debian, Suse, Slackware
Posts: 317
I am not sure if you used the word sensitive as in data that should not be seen by others. If that is the case, I would think twice about storing it in the cloud, especially with all the news at the moment that most ISPs have no problem handing customer data over to the government if asked.
But if you insist, the simplest way would be to encrypt the files to a temporary directory and then move them to the cloud backup directory.
I am also not sure what you meant by removing the original files. Did you mean the original pdf files or the encrypted ones?

Personally, I would just buy some USB drives and rsync the data to them. If you use two, you can rotate them on a regular basis and store one in a secure place, say a bank's safety deposit box.

Old 08-16-2013, 07:23 PM   #3
Senior Member
Registered: Dec 2005
Location: Florida
Distribution: CentOS/Fedora
Posts: 2,630

I agree 100% with dgejonge about not storing any kind of "sensitive" data in a cloud unless it is YOUR hardware that you can lock down.

I don't know how much data you have to move/store, but you might consider an encrypted tarball that is moved to the cloud; I would also password-protect the tarball, so it is doubly protected. The more layers you put in front of the data, the more it will cost the bad guys (and that includes the NSA) to get at what is inside. The more it costs, the less likely they are to "accidentally" gain access to your data.

A simple bash script can gather the data you wish to tar, verify the tar, encrypt it, then rsync it to your backup directory so that only those files get uploaded to the cloud.

Yes, bash is the easy way to handle this.
Old 08-16-2013, 08:44 PM   #4
Darth Maul
LQ Newbie
Registered: Aug 2013
Distribution: PClinuxOS 2013.7 KDE
Posts: 28

I don't trust cloud storage either, even if they proclaim your data is private and secure; I don't mean to doubt their sincerity.

I personally use an encrypted external hard drive and rsync.

If you want to use an encrypted tarball of your directory, here are some example commands.

NOTE: Before using these commands, I suggest you practice on a test directory containing junk files or files you have copies of. Test both encrypting and decrypting your tar file.
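Commands along those lines, assuming gpg symmetric encryption ("testdir" is a junk demo directory, per the note above):

```shell
#!/bin/sh
# Encrypt a directory as a single tarball, then decrypt and unpack it again.
WORK=$(mktemp -d); cd "$WORK"
mkdir testdir
printf 'junk\n' > testdir/sample.txt
printf 'demo-passphrase\n' > pass

# Encrypt: tar the directory and pipe it straight into gpg
tar -czf - testdir | gpg --batch --yes --pinentry-mode loopback \
    --passphrase-file pass --symmetric -o testdir.tar.gz.gpg

# Decrypt: reverse the pipeline into a restore directory
mkdir restore
gpg --batch --quiet --pinentry-mode loopback --passphrase-file pass \
    -d testdir.tar.gz.gpg | tar -xzf - -C restore
```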

Last edited by Darth Maul; 08-16-2013 at 08:53 PM.
Old 08-17-2013, 01:12 AM   #5
LQ Newbie
Registered: Apr 2013
Posts: 7

Original Poster
Thank you all for the suggestions. I posted another reply/clarification this morning, about 12 hours ago, but it looks like it did not go through.

I don't trust those services either, hence the plan to encrypt the data. I am not too worried about government snooping, as the data is mostly financial records and such, which they are already aware of. My biggest worry is identity thieves and hackers who manage to penetrate cloud security and gain access to the data.

I need to access the data remotely, hence the need for a cloud-based solution. My normal environment is Ubuntu, either at work or on my laptop if I am travelling, so gpg is always available.

Based on the suggestions here, I could run gpg multiple times to encrypt the same file with different passwords. Do you think this would make a difference? I had tried the tar suggestion earlier, but faced two problems. First, I need to figure out which archive has my file; second, because of upload size limitations, a file can potentially be split across multiple archives. I had some elaborate code in Java that did the calculations and created the tar files, and it did work as far as backup was concerned. But remote access was a pain, as it was never easy to find out which archive held a given file.
Old 08-17-2013, 12:45 PM   #6
Senior Member
Registered: Dec 2005
Location: Florida
Distribution: CentOS/Fedora
Posts: 2,630

As you are running Linux at work and on the road (I'm guessing at home too), there is no reason for cloud access. Just ssh into either home or work to get at the data directly. If you need a local copy to work on while on the road, then scp it or rsync it via ssh, and the data will remain safe and secure in transit.
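The transfer commands being suggested look like this; the host name and paths are made-up examples, and the commands are assembled and printed rather than executed here, since they need a reachable ssh server:

```shell
#!/bin/sh
# scp and rsync-over-ssh invocations for pulling data from a home box.
# "user@myhome.ddns.example.net" is a hypothetical DDNS name.
REMOTE="user@myhome.ddns.example.net"

# one-off copy of a single file
scp_cmd="scp $REMOTE:mydata/tax.pdf ."

# mirror the whole tree; -z compresses in transit, ssh encrypts it
rsync_cmd="rsync -avz -e ssh $REMOTE:mydata/ ./mydata/"

printf '%s\n' "$scp_cmd" "$rsync_cmd"
```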

If you are unable to reach your home data over the Internet because of a DHCP IP address that changes, you might consider some type of DDNS (Dynamic Domain Name Service), like noip or freedns, to gain access to your home network. At that point it is just a simple matter of port forwarding on your home router to reach the data.
Old 08-18-2013, 06:43 PM   #7
LQ Newbie
Registered: Apr 2013
Posts: 7

Original Poster
Thank you, I see what you are saying. I'll consider the ssh route. I did not think of it earlier, but I have a DDNS system set up as well.


Tags: backup, find, rsync, scripting
