Old 09-08-2014, 03:49 PM   #1
rjo98
Senior Member
 
Registered: Jun 2009
Location: US
Distribution: RHEL, CentOS
Posts: 1,668

Rep: Reputation: 46
Best way to copy all files from server for backup before decommission


They have an old server that they're getting rid of, but they want to keep a copy of all the files off of it, just in case. Not knowing exactly where everything might be tucked away on there, what's probably the best way to do that? Make a .tgz of /? Can you even do that from the local box and have it write to the same filesystem, or will that try to tgz itself and cause chaos?

I know, it's kind of a silly question, and it's not great that they don't know where anything they might need is (which probably means they don't need anything, haha).
 
Old 09-08-2014, 03:55 PM   #2
joe_2000
Member
 
Registered: Jul 2012
Location: Aachen, Germany
Distribution: Void, Debian
Posts: 808

Rep: Reputation: 216
What stops you from tar.gz'ing everything to an external hard drive?
Alternatively, you could clone the drive with e.g. Clonezilla. (I've never used it myself, but I've heard good things about it.)
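For the tar.gz route, a rough sketch might look like this (untested; /mnt/usb and the archive name are just placeholder examples, and the command assumes GNU tar):
Code:
# archive the whole system onto the external drive, skipping the drive's own mount point and pseudo-filesystems
tar czf /mnt/usb/server-backup.tar.gz --exclude=/mnt/usb --exclude=/proc --exclude=/sys --exclude=/dev /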
 
1 member found this post helpful.
Old 09-08-2014, 04:01 PM   #3
thesnow
Member
 
Registered: Nov 2010
Location: Minneapolis, MN
Distribution: Ubuntu, Red Hat, Mint
Posts: 170

Rep: Reputation: 56
How much data? Is this a physical or virtual server?
 
1 member found this post helpful.
Old 09-08-2014, 04:01 PM   #4
rjo98
Senior Member
 
Registered: Jun 2009
Location: US
Distribution: RHEL, CentOS
Posts: 1,668

Original Poster
Rep: Reputation: 46
I'm not at the same physical location, so I can't plug a drive in. Right now all I have is a remote connection to the machine. But eventually the data will have to go to an external drive or to another machine; I was just wondering if there was anything I could do in the meantime, until I'm at the same location.

---------- Post added 09-08-14 at 04:02 PM ----------

It's an old physical server, so like 40GB of data I think. The server has more than double that in free space.
 
Old 09-08-2014, 04:07 PM   #5
joe_2000
Member
 
Registered: Jul 2012
Location: Aachen, Germany
Distribution: Void, Debian
Posts: 808

Rep: Reputation: 216
Well, in that case tar.gz / to a tarball in /tmp and exclude /tmp. Then you can be sure it doesn't get confused.
When you get there you'll only have to pull the tarball. If the machine is at risk of being rebooted in the meantime, move the tarball somewhere else after tarring it, since /tmp is often cleared on boot and the tarball could get deleted.
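Something along these lines might do it (untested sketch, again assuming GNU tar; the archive name is only an example, and you may want to exclude /proc, /sys and /dev as well so the pseudo-filesystems don't end up in the archive):
Code:
# write the archive into /tmp while excluding /tmp itself, so tar never tries to archive its own output
tar czf /tmp/server-backup.tar.gz --exclude=/tmp --exclude=/proc --exclude=/sys --exclude=/dev /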
 
1 member found this post helpful.
Old 09-08-2014, 04:11 PM   #6
rjo98
Senior Member
 
Registered: Jun 2009
Location: US
Distribution: RHEL, CentOS
Posts: 1,668

Original Poster
Rep: Reputation: 46
Ah, that's a good idea to exclude the dir I'm writing to; I never thought of that and will have to look up how to do it. That's the kind of solution I was looking for: something I can let do the work while I'm away, and then I can just do a big copy once I get there. Thanks!
 
Old 09-08-2014, 04:35 PM   #7
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,258

Rep: Reputation: 1947
Usually I do something along the lines of:
Code:
rsync -aAXv --delete /* /backupdir --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found,/.gvfs,/backupdir/*}
Where /backupdir is a new directory created for the backup.

But then again I like to have working copies of the backups rather than gigantic tarballs.
 
2 members found this post helpful.
Old 09-08-2014, 04:37 PM   #8
jlinkels
Senior Member
 
Registered: Oct 2003
Location: Bonaire
Distribution: Debian Wheezy/Jessie/Sid, Linux Mint DE
Posts: 4,493

Rep: Reputation: 635
No, tar is not a good option here. You want to have an archive just in case. What happens if you want to see if that old file is there? You have to untar everything before you can look at the file. The best option here is rsync:

Code:
rsync -auv --exclude='/proc' --exclude='/sys' --exclude='/dev' / /path/to/destination/
.. and add anything else you want to exclude with another --exclude option.

jlinkels
 
1 member found this post helpful.
Old 09-08-2014, 11:50 PM   #9
rnturn
Senior Member
 
Registered: Jan 2003
Location: Illinois (Chicago area)
Distribution: Red Hat (8.0, RHEL5,6), CentOS, SuSE (10.x, 11.x, 12.2, 13.2), Solaris (8-10), Tru64, MacOS, Raspian
Posts: 1,106

Rep: Reputation: 64
Quote:
Originally Posted by jlinkels View Post
The best option here is rsync:

Code:
rsync -auv --exclude='/proc' --exclude='/sys' --exclude='/dev' / /path/to/destination/
.. and add anything else you want to exclude with another --exclude option.

jlinkels
Ya know... cpio never gets any of the love...

* Mount the backup device on, say, /mnt/archive.

* Create a file to hold a list of things you want to NOT archive, say "/tmp/ignore.lis":
Code:
^/mnt/archive
^/tmp
^/dev
^/proc
and any other trees you don't want copied.

* Then issue:
Code:
# grep's -z keeps the null-delimited names from -print0 intact
find / -depth -print0 | grep -zv -f /tmp/ignore.lis | cpio --null -pVd /mnt/archive
(Omit the 'V' switch to eliminate the progress 'dots'.)

There's more than one way to skin this cat.

Later...
 
1 member found this post helpful.
Old 09-09-2014, 12:41 AM   #10
joe_2000
Member
 
Registered: Jul 2012
Location: Aachen, Germany
Distribution: Void, Debian
Posts: 808

Rep: Reputation: 216
Quote:
Originally Posted by jlinkels View Post
No, tar is not a good option here. You want to have an archive just in case. What happens if you want to see if that old file is there? You have to untar everything before you can look at the file. The best option here is rsync:
Generally I would agree, but the OP stated that he has no physical access to the machine and wants to run something remotely as preparation. He also stated that this is a "just in case" backup and the customers don't even know what's on that machine. They'll most likely never need it. So tar *is* a good option.

To the OP: one piece of advice I'd like to add, though: when you pull the tarball to an external drive, you may want to compare an md5 checksum of the source tarball against the destination tarball afterwards to make sure your copy is good. Using rsync to copy the tarball would also be a good idea, as it should have built-in functionality to verify the copied files directly (I'm not sure which flag that was).
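A rough sketch of that verification, with placeholder paths (the tarball name and mount point are just examples; rsync's -c/--checksum is presumably the flag meant above):
Code:
# compare the checksums of the source tarball and the copy by hand
md5sum /tmp/server-backup.tar.gz /mnt/external/server-backup.tar.gz
# or copy with rsync and re-run it with -c afterwards; the second -c run
# recompares full checksums and transfers nothing if the copy already matches
rsync -avc /tmp/server-backup.tar.gz /mnt/external/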
 
1 member found this post helpful.
Old 09-09-2014, 12:50 AM   #11
evo2
LQ Guru
 
Registered: Jan 2009
Location: Japan
Distribution: Mostly Debian and Scientific Linux
Posts: 5,753

Rep: Reputation: 1288
Hi,

Quote:
Originally Posted by jlinkels View Post
What happens if you want to see if that old file is there? You have to untar everything before you can look at the file.
Not true. You can list the files it contains with:
Code:
tar tf foo.tar.gz
Then you can extract just the file you want:
Code:
tar xf foo.tar.gz some/file/bar.baz
Evo2.
 
1 member found this post helpful.
Old 09-09-2014, 01:14 AM   #12
rnturn
Senior Member
 
Registered: Jan 2003
Location: Illinois (Chicago area)
Distribution: Red Hat (8.0, RHEL5,6), CentOS, SuSE (10.x, 11.x, 12.2, 13.2), Solaris (8-10), Tru64, MacOS, Raspian
Posts: 1,106

Rep: Reputation: 64
You need the 'z' switch. Without it you'd need to use:
Code:
gzip -dc foo.tar.gz | tar xf - some/file/bar.baz
Edit: Hmm... tar actually doesn't need to have the 'z' switch specified. I doubt that's very portable across Unix-like OSes, though.

Last edited by rnturn; 09-09-2014 at 01:18 AM.
 
1 member found this post helpful.
Old 09-09-2014, 01:31 AM   #13
evo2
LQ Guru
 
Registered: Jan 2009
Location: Japan
Distribution: Mostly Debian and Scientific Linux
Posts: 5,753

Rep: Reputation: 1288
Quote:
Originally Posted by rnturn View Post
You need the 'z' switch.
Only if you are using an ancient version of tar.

From https://www.gnu.org/software/tar/man...node/gzip.html
Quote:
Reading compressed archive is even simpler: you don't need to specify any additional options as GNU tar recognizes its format automatically. Thus, the following commands will list and extract the archive created in previous example:
Code:
# List the compressed archive
$ tar tf archive.tar.gz
# Extract the compressed archive
$ tar xf archive.tar.gz
Cheers,

Evo2.
 
1 member found this post helpful.
Old 09-09-2014, 09:09 AM   #14
rjo98
Senior Member
 
Registered: Jun 2009
Location: US
Distribution: RHEL, CentOS
Posts: 1,668

Original Poster
Rep: Reputation: 46
Wow, this post has really taken off, with LOTS of good ideas. There are so many ways to accomplish this that I'm actually not feeling as silly about my OP now. Thanks, everybody!
 
Old 09-09-2014, 12:51 PM   #15
rnturn
Senior Member
 
Registered: Jan 2003
Location: Illinois (Chicago area)
Distribution: Red Hat (8.0, RHEL5,6), CentOS, SuSE (10.x, 11.x, 12.2, 13.2), Solaris (8-10), Tru64, MacOS, Raspian
Posts: 1,106

Rep: Reputation: 64
Quote:
Originally Posted by evo2 View Post
Only if you are using an ancient version of tar.
Yeah... I noticed that. I'm so used to having to use the 'z' switch while on commercial Unices that I still specify it on Linux.
 
  

