Old 04-26-2009, 06:45 PM   #1
legacyprog
Member
 
Registered: Feb 2004
Location: New Jersey, USA
Distribution: Ubuntu 14.04
Posts: 55

Rep: Reputation: 15
root cron script using cp commands doesn't copy all files


I suspect I'm missing something obvious, but I'm stumped. The following script, called SystemBackup

#!/bin/bash
# -p preserve mode/ownership/timestamps, -u copy only files newer than the
# destination, -r recurse into directories, -L follow symbolic links
cp -purL /home /archive5/LinuxBackup/Tree
cp -purL /etc /archive5/LinuxBackup/Tree
cp -purL /var /archive5/LinuxBackup/Tree
cp -purL /root /archive5/LinuxBackup/Tree
# drop the copied beagle index cache from the backup tree
rm -d -r /archive5/LinuxBackup/Tree/var/cache/beagle/indexes

when run under Ubuntu 8.04 as

sudo ./SystemBackup

seems to copy all the files in the source directories to my backup area on an external hard drive (directory "Tree" having been emptied beforehand)

but when run as a root crontab entry (again having emptied "Tree" first)

50 16 * * * /home/user/SystemBackup

it copied some of the sub-folders under home, etc, var, and root, but not all of them (maybe only a third). I don't see any consistency in permissions or ownership between what it does and does not copy.

I also modified the cp -purL /var line to cp -vpurL /var and redirected the output to my /home/user folder. It just seems to be copying only some of the files, and I don't see any error messages.

I've googled but similar situations seem to be based on not running as root, but I checked that by substituting /usr/bin/id for ./SystemBackup in the crontab entry and piping the output to my /home/user folder. It showed that I was running as root.
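For reference, the crontab lines I used were roughly these (the log file names are only approximate):

# normal run, with stdout and stderr captured to a file I can inspect
50 16 * * * /home/user/SystemBackup > /home/user/backup.log 2>&1
# the "am I really root?" check, temporarily swapped in for the script
50 16 * * * /usr/bin/id > /home/user/cron-id.log 2>&1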

I know I'm going to kick myself, but I just don't have a handle on this (at best I have intermediate Linux skills; more likely I'm a seasoned newbie). Any ideas? Need more details?
 
Old 04-26-2009, 07:14 PM   #2
aus9
LQ 5k Club
 
Registered: Oct 2003
Location: Western Australia
Distribution: Icewm
Posts: 5,842

Rep: Reputation: Disabled
hi

I am more concerned that the script is called SystemBackup. Have you tested it assuming your system files are lost, or attempted a restore onto a new hard drive? Methinks you do not have enough there to get back to where you want to be.

2) So can you advise what normal backup software you have tried and why it does not meet your requirements? And while I am being nosy.....have you copied stuff to a tape drive, external drive, or DVD-RWs?

3) I am not perfect btw....I use partimage and the image is backed up to DVD-RW. It's not an incremental backup, so you need to take some risks and back up key files separately from the partition image....but I am more confident I can rebuild onto a new hard drive....as I have done in the past.

4) Others like dd if they do not use dedicated backup software.

5) If you like scripts.....instead of software...take a peek at this; it could suit:

http://smxi.org/site/install.htm#rbxi



regards

Last edited by aus9; 04-26-2009 at 07:25 PM. Reason: add rxbi
 
Old 04-26-2009, 07:34 PM   #3
legacyprog
Member
 
Registered: Feb 2004
Location: New Jersey, USA
Distribution: Ubuntu 14.04
Posts: 55

Original Poster
Rep: Reputation: 15
Hi aus9, thanks for the reply.

Well, when I was on SuSE I used its backup tool, which created a tarball. Back in Jan '08, when I switched to Kubuntu, I posted here at LQ asking for "best practice" since it didn't seem to have an equivalent backup. See the thread: http://www.linuxquestions.org/questi...actice-611349/

From advice received in that thread I created the script I posted which *seems* to have worked fine for over a year. Note that my data files are on separate partitions and are handled differently. This "SystemBackup" is only meant to provide a way to get back the state of Linux itself and/or to provide a reference point if I mess up some config file. Granted, I haven't had to recover, nor have I had a fire drill to simulate same.

After posting this question I realized I could have stepped outside the box and switched to rsync; the cp command just looked like it should be fine for my simple needs.

I'm not sure if it was my recent switch to Ubuntu 8.04, or changing the destination to a USB external drive instead of a Samba share, or whether my script has never *really* been working (in the sense that the "-u" switch, plus my not deleting the destination as a matter of course, could mean it stopped working months ago and I didn't realize it until "testing" my new script against a new, empty destination).
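(For my own notes: -u means "copy only when the source is newer than the destination, or the destination file is missing", so a tiny test like this, with made-up paths, shows the effect:)

mkdir -p /tmp/src /tmp/dst
touch -d '2009-01-01' /tmp/src/file
touch -d '2009-04-01' /tmp/dst/file
cp -pu /tmp/src/file /tmp/dst/   # copies nothing, because the destination copy is newer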

Last edited by legacyprog; 04-26-2009 at 07:36 PM. Reason: Clarification
 
Old 04-26-2009, 07:34 PM   #4
jhwilliams
Senior Member
 
Registered: Apr 2007
Location: Portland, OR
Distribution: Debian, Android, LFS
Posts: 1,168

Rep: Reputation: 211
-- woops nothing to see here --

Last edited by jhwilliams; 04-26-2009 at 07:36 PM. Reason: woops
 
Old 04-26-2009, 07:41 PM   #5
GlennsPref
Senior Member
 
Registered: Apr 2004
Location: Brisbane, Australia
Distribution: Devuan
Posts: 3,655
Blog Entries: 33

Rep: Reputation: 283
Hi, I would look into rsync for backups rather than cp.

rsync won't care if the file is protected or not.

http://www.solo-technology.com/blog/...un-with-rsync/

http://samba.anu.edu.au/rsync/

http://www.codejacked.com/an-introdu...-rsync-part-2/

My original reference is gone; here is a plain-text copy. Check the man page for rsync for more...
Code:
ref. http://blog.lxpages.com/ (gone)
Apr17
Fun With Rsync - Part I

April 17, 2008 | 15 Comments

If you’ve used rsync in the past, you know that it makes quite a few things much easier to manage.
For example, you can use rsync to backup files from one server to another, you can use it to
automate patch deployments, you can mirror entire websites and also have it function like a
semi-NFS client/server which can synchronize data between several servers by running it like a
watchdog so that as soon as a file gets updated on primary server, it instantly syncs up all
other servers.

What is Rsync?
rsync is a file transfer program for Unix systems. rsync uses the “rsync algorithm” which provides a
very fast method for bringing remote files into sync. It does this by sending just the differences in
the files across the link, without requiring that both sets of files are present at one of the ends of
the link beforehand. Some features of rsync include

   1. Can update whole directory trees and filesystems
   2. Optionally preserves symbolic links, hard links, file ownership, permissions, devices and times
   3. Requires no special privileges to install
   4. Internal pipelining reduces latency for multiple files
   5. Can use rsh, ssh or direct sockets as the transport
   6. Supports anonymous rsync which is ideal for mirroring

The Basics

If you don’t already have it installed, get it done this way on Debian systems:

    apt-get install rsync

On other Linux distributions you would use yum (Fedora/CentOS) or yast (SuSE) to install rsync.

The base requirement of the rsync system is that you install the rsync program on the source and
destination hosts. The easiest way to transfer files is to use a remote shell account.
To copy a group of files to your home directory on host, you can run this command:

    rsync file1 file2 … host:

Rsync defaults to rsh as the remote shell, and that is not secure. Instead you can have rsync use
ssh by using the --rsh=ssh or -e ssh option:

    rsync -e ssh file1 file2 … host:destination_dir

Copying one folder to another:

    rsync -r /home/lxpages/junk /home/backups/junk/

That will copy the folder in a simple way, without much care for preserving
permissions/owners/etc. To make exact duplicates, we can add the -a switch.

    rsync -av /home/lxpages/junk /home/backups/junk

The above retains ALL ownership/permissions and runs rsync in verbose mode.

    rsync -av --delete /var/www/junk /home/lxpages/junk

The --delete option keeps the destination folder identical to the source by removing any files that
don't match the source dir.

    rsync -av --delete -e ssh lxpages@remotehost.com:/home/lxpages/junk /home/lxpages/junk

Rsyncs everything from the remote server's /home/lxpages/junk folder to the local /home/lxpages/junk
folder.

There are many more examples, which I will include in later parts. For now just remember that the man
page is your friend. Simply type 'man rsync' and read through all the available options you can play around with.

References: http://samba.anu.edu.au/rsync/


rsync -av /media/cdrom/win_c /mnt/win_c
cheers Glenn
 
Old 04-26-2009, 08:59 PM   #6
legacyprog
Member
 
Registered: Feb 2004
Location: New Jersey, USA
Distribution: Ubuntu 14.04
Posts: 55

Original Poster
Rep: Reputation: 15
Thanks Glenn. I was away from home and just saw your advice. I will work on converting my simple script from cp to rsync and then retest. Thanks for the "Fun with rsync" doc, and (as you say) I'll look at the man page as well.

---
*** EDIT ***
I have now changed the original script where it said "cp -purL" to use "rsync -a --delete", and reran both the sudo version from the command line and the same script as a root cron entry, deleting the output folder first so they both started clean. Both give the correct results (i.e., the number of files and total size of the four source folders and their backed-up versions match). Thanks for the advice and for helping me to learn!
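In other words, the script body is now essentially this (same paths and the same beagle cleanup line as before):

#!/bin/bash
# -a preserves permissions/ownership/timestamps; --delete removes files
# from the backup tree that no longer exist in the source
rsync -a --delete /home /archive5/LinuxBackup/Tree
rsync -a --delete /etc /archive5/LinuxBackup/Tree
rsync -a --delete /var /archive5/LinuxBackup/Tree
rsync -a --delete /root /archive5/LinuxBackup/Tree
rm -d -r /archive5/LinuxBackup/Tree/var/cache/beagle/indexes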

Last edited by legacyprog; 04-26-2009 at 09:55 PM. Reason: Confirmed successful test
 
Old 04-27-2009, 02:38 AM   #7
GlennsPref
Senior Member
 
Registered: Apr 2004
Location: Brisbane, Australia
Distribution: Devuan
Posts: 3,655
Blog Entries: 33

Rep: Reputation: 283
No problems,

The next time you run it, it should only update files that have changed, like a backup program would.

The nice part is that it will write to NTFS too, and keeping permissions is handy.
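If you want to preview what a later run would transfer before it touches anything, a dry run along these lines (same paths as your script) just lists the changes:

# -n (--dry-run) shows what would be copied or deleted, without doing it
rsync -avn --delete /home /archive5/LinuxBackup/Tree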

Someone told me about rsync; I'm just passing it on. That's community.

Cheers Glenn
 
  



Tags
bash, cp, cron


