Best way to copy all files from server for backup before decommission
They have an old server that they're getting rid of, but they want to keep a copy of all the files off of it, as a "just in case". Not knowing exactly where everything might be tucked away on there, what's the best way to do that? Make a .tgz of /? Can you even do that from the local box, writing the archive to the same disk, or will tar try to archive its own output and cause chaos?
I know, kind of a silly question and not great that they don't know where anything they might need is (which probably means they don't need anything haha) |
What stops you from tar.gz'ing everything to an external hard drive?
Alternatively you could clone the drive with e.g. Clonezilla. (Never used it myself, but I've heard good things about it.)
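For example, something like this (just a sketch; /mnt/usb stands in for wherever the external drive gets mounted):
Code:
# assumes the external drive is mounted under /mnt (placeholder); exclude that
# mount point so tar doesn't try to archive its own growing output, and skip
# the kernel pseudo-filesystems while we're at it
tar czf /mnt/usb/server-backup.tar.gz --exclude=/mnt --exclude=/proc --exclude=/sys --exclude=/dev /
|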
How much data? Is this a physical or virtual server?
|
I'm not at the same physical location to plug a drive in; right now all I have is a remote connection to the machine. Eventually the data will have to go to an external drive or another machine, but I was wondering if there's anything I can do in the meantime, until I'm at the same location.
It's an old physical server, so like 40GB of data I think. The server has more than double that in free space. |
Well in that case, tar.gz / to a tarball in /tmp and exclude /tmp. Then you can be sure it doesn't get confused.
When you get there you'll only have to pull the tarball. If the machine is at risk of being rebooted in the meanwhile, move the tarball somewhere else after tarring it, to protect it from being deleted.
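Something along these lines should do it (untested sketch; the archive name is just an example):
Code:
# write the archive to /tmp and exclude /tmp itself so tar doesn't chase its own
# output; the kernel pseudo-filesystems get skipped as well
tar czf /tmp/server-backup.tar.gz --exclude=/tmp --exclude=/proc --exclude=/sys --exclude=/dev /
|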
ah, that's a good idea to exclude the dir I'm writing to, never thought of that. That's the kind of solution I was looking for: something I can let do the work while I'm away, then just do one big copy once I get there. Thanks!
|
Usually I do something along the lines of:
Code:
rsync -aAXv --delete --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found,/.gvfs,/backupdir/*} / /backupdir
But then again, I like to have working copies of the backups rather than gigantic tarballs. |
No, tar is not a good option here. You want to have an archive just in case. What happens if you want to see if that old file is there? You have to untar everything before you can look at the file. The best option here is rsync:
Code:
rsync -auv --exclude='/proc' --exclude='/sys' --exclude='/dev' / /path/to/destination/
jlinkels |
* Mount the backup device on, say, /mnt/archive.
* Create a file to hold a list of things you want to NOT archive, say "/tmp/ignore.lis":
Code:
^/mnt/archive
* Then issue:
Code:
find / -depth -print0 | grep -z -v -f /tmp/ignore.lis | cpio --null -pVd /mnt/archive
There's more than one way to skin this cat. Later... |
To the OP: One piece of advice I'd like to add: when you pull the tarball to an external drive, you may want to do an md5 checksum of the source tarball vs. the destination tarball afterwards, to make sure your copy is good. Using rsync for copying the tarball would also be a good idea, as it should have built-in functionality to check the copied files directly (not sure which flag that was).
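Something like this, for example (a sketch; the file names are placeholders):
Code:
# on the server, before pulling the copy
md5sum /tmp/server-backup.tar.gz
# on the destination, after copying -- the two hashes should be identical
md5sum /path/to/copy/server-backup.tar.gz
(Checked: the rsync flag is -c / --checksum, which makes rsync compare files by checksum rather than by size and modification time.) |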
Hi,
Quote:
You have to untar everything before you can look at the file.
No you don't. To list the contents:
Code:
tar tf foo.tar.gz
To extract a single file:
Code:
tar xf foo.tar.gz some/file/bar.baz |
You need the 'z' switch. Without it you'd need to use:
Code:
gzip -dc foo.tar.gz | tar xf - some/file/bar.baz |
Quote:
You need the 'z' switch.
Not for reading. From https://www.gnu.org/software/tar/man...node/gzip.html
Quote:
Reading compressed archive is even simpler: you don't need to specify any additional options as GNU tar recognizes its format automatically.
Evo2. |
Wow, this post has really taken off, with LOTS of good ideas. So many ways to accomplish this; I'm actually not feeling as silly about my OP now. Thanks everybody!
|