LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   Best way to copy all files from server for backup before decommission (https://www.linuxquestions.org/questions/linux-newbie-8/best-way-to-copy-all-files-from-server-for-backup-before-decommission-4175517896/)

anon091 09-08-2014 02:49 PM

Best way to copy all files from server for backup before decommission
 
They have an old server that they're getting rid of, but they want to keep a copy of all the files off of it, as a "just in case". Not exactly knowing where everything might be tucked away on there, what's probably the best way to do that? Make a .tgz of /? Can you even do that from the local box and have it write the archive to the same disk, or will it try to tgz itself and cause chaos?

I know, it's kind of a silly question, and it's not great that they don't know where anything they might need is (which probably means they don't need anything, haha).

joe_2000 09-08-2014 02:55 PM

What stops you from tar.gz'ing everything to an external hard drive?
Alternatively you could clone the drive with e.g. Clonezilla. (Never used it myself, but I've heard good things about it.)

thesnow 09-08-2014 03:01 PM

How much data? Is this a physical or virtual server?

anon091 09-08-2014 03:01 PM

I'm not at the same physical location to plug a drive in; right now all I have is a remote connection to the machine. Eventually the data will have to go to an external drive or another machine, but I was wondering if there was anything I could do in the meantime until I'm on site.

---------- Post added 09-08-14 at 04:02 PM ----------

It's an old physical server, so around 40GB of data I think. The server has more than double that in free space.
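
(A quick way to double-check those numbers from the remote shell, assuming GNU coreutils; the exclude list is just illustrative:)
Code:

df -h                                                    # used and available space per mounted filesystem
du -sh --exclude=/proc --exclude=/sys --exclude=/dev /   # rough total of the real data (can take a while)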

joe_2000 09-08-2014 03:07 PM

Well, in that case tar.gz / to a tarball in /tmp and exclude /tmp itself. Then you can be sure it doesn't try to archive its own output.
When you get there you'll only have to pull the tarball. If the machine is at risk of being rebooted in the meantime, move the tarball somewhere outside /tmp after tarring it so it doesn't get deleted.
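
A minimal sketch of that command, assuming GNU tar; the archive name and the exact exclude list are illustrative and can be adjusted:
Code:

tar czpf /tmp/server-backup.tar.gz --exclude=/tmp --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run /
The p flag preserves permissions, and excluding the pseudo-filesystems (/proc, /sys, /dev, /run) keeps the archive down to real data.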

anon091 09-08-2014 03:11 PM

Ah, that's a good idea to exclude the dir I'm writing to, never thought of that; I'll have to look up how to do it. That's the kind of solution I was looking for, something I could let do the work while I'm away, then I can just do a big copy once I get there. Thanks!

suicidaleggroll 09-08-2014 03:35 PM

Usually I do something along the lines of:
Code:

rsync -aAXv --delete /* /backupdir --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found,/.gvfs,/backupdir/*}
Where /backupdir is a new directory created for the backup.

But then again I like to have working copies of the backups rather than gigantic tarballs.

jlinkels 09-08-2014 03:37 PM

No, tar is not a good option here. You want to have an archive just in case. What happens if you want to see if that old file is there? You have to untar everything before you can look at the file. The best option here is rsync:

Code:

rsync -auv --exclude='/proc' --exclude='/sys' --exclude='/dev' / /path/to/destination/
.. and add anything else you want to exclude in another --exclude option.

jlinkels

rnturn 09-08-2014 10:50 PM

Quote:

Originally Posted by jlinkels (Post 5234552)
The best option here is rsync:

Code:

rsync -auv --exclude='/proc' --exclude='/sys' --exclude='/dev' / /path/to/destination/
.. and add anything else you want to exclude in another --exclude option.

jlinkels

Ya know... cpio never gets any of the love... :D

* Mount the backup device on, say, /mnt/archive.

* Create a file to hold a list of things you want to NOT archive, say "/tmp/ignore.lis":
Code:

^/mnt/archive
^/tmp
^/dev
^/proc

and any other trees you don't want copied.

* Then issue:
Code:

find / -depth -print0 | grep -zv -f /tmp/ignore.lis | cpio --null -pVd /mnt/archive
(The -z makes grep treat the NUL-delimited names from -print0 as separate records; without it the filter won't work. Omit the 'V' switch to eliminate the progress 'dots'.)

There's more than one way to skin this cat.

Later...

joe_2000 09-08-2014 11:41 PM

Quote:

Originally Posted by jlinkels (Post 5234552)
No, tar is not a good option here. You want to have an archive just in case. What happens if you want to see if that old file is there? You have to untar everything before you can look at the file. The best option here is rsync:

Generally I would agree, but the OP stated that he has no physical access to the machine and wants to run something remotely in preparation. He also stated that this is a "just in case" backup and the customers don't even know what's on that machine; they'll most likely never need it. So tar *is* a good option.

To the OP, one piece of advice I'd like to add: when you pull the tarball to an external drive, you may want to compare an md5 checksum of the source tarball against the destination copy afterwards to make sure your copy is good. Using rsync to copy the tarball would also be a good idea, as it has built-in functionality to check the copied files directly (not sure which flag that was).
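
A minimal sketch of that verification, reusing the hypothetical tarball name from the earlier example and assuming the external drive were mounted at /mnt/external:
Code:

# compare checksums of the source and the copy
md5sum /tmp/server-backup.tar.gz /mnt/external/server-backup.tar.gz

# or re-run rsync with checksum comparison; a dry run that lists nothing to transfer means the copy matches
rsync -avc --dry-run /tmp/server-backup.tar.gz /mnt/external/
The rsync flag being remembered is presumably -c/--checksum, which compares full file checksums instead of just size and timestamp.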

evo2 09-08-2014 11:50 PM

Hi,

Quote:

Originally Posted by jlinkels (Post 5234552)
What happens if you want to see if that old file is there? You have to untar everything before you can look at the file.

Not true. You can list the files it contains with
Code:

tar tf foo.tar.gz
Then you can extract just the file you want
Code:

tar xf foo.tar.gz some/file/bar.baz
Evo2.

rnturn 09-09-2014 12:14 AM

You need the 'z' switch. Without it you'd need to use:
Code:

gzip -dc foo.tar.gz | tar xf - some/file/bar.baz
Edit: Hmm... GNU tar actually doesn't need the 'z' switch; it detects the compression automatically. I doubt that's very portable across Unix-like OSes, though.

evo2 09-09-2014 12:31 AM

Quote:

Originally Posted by rnturn (Post 5234735)
You need the 'z' switch.

Only if you are using an ancient version of tar.

From https://www.gnu.org/software/tar/man...node/gzip.html
Quote:

Reading compressed archive is even simpler: you don't need to specify any additional options as GNU tar recognizes its format automatically. Thus, the following commands will list and extract the archive created in previous example:
Code:

# List the compressed archive
$ tar tf archive.tar.gz
# Extract the compressed archive
$ tar xf archive.tar.gz


Cheers,

Evo2.

anon091 09-09-2014 08:09 AM

Wow, this post has really taken off, with LOTS of good ideas. There are so many ways to accomplish this that I'm not feeling quite as silly about my OP now. Thanks, everybody!

rnturn 09-09-2014 11:51 AM

Quote:

Originally Posted by evo2 (Post 5234746)
Only if you are using an ancient version of tar.

Yeah... I noticed that. I'm so used to having to use the 'z' switch while on commercial Unices that I still specify it on Linux.

