LinuxQuestions.org
Linux - Security This forum is for all security related questions.
Questions, tips, system compromises, firewalls, etc. are all included here.

Old 05-04-2014, 02:00 PM   #1
postcd
Member
 
Registered: Oct 2013
Posts: 527

Rep: Reputation: Disabled
Modifying tar.gz / tar file to prevent extraction?


Hello,

I'm making backups of sensitive data to my VPS, and I can't fully trust the owner of the VPS node. So I had an idea I'd like your comments on.

I can zip the files with a password, but I just discovered that zip may not pack files larger than 2 GB; I got an error like:
"zip warning: name not matched: /backup/incremental/accounts/yzbuutpo/homedir/public_html/wp-content/themes/twentytwelve/bomba1/dovecot"

So my idea is to rename my backups to something uninteresting, without the tar.gz extension. As another level of protection, can I somehow easily edit the archive so that a non-professional doesn't recognize the file type, and also make it corrupted so it can't be extracted? If anyone can point me to a guide, I'm a Linux amateur. Thank you.

PS: I assume I could also use openssl to encrypt the archive, but I think it would eat a lot of resources to encrypt a ~15 GB file.

Last edited by postcd; 05-04-2014 at 02:02 PM.
 
Old 05-04-2014, 02:12 PM   #2
pan64
LQ Addict
 
Registered: Mar 2012
Location: Hungary
Distribution: debian/ubuntu/suse ...
Posts: 21,848

Rep: Reputation: 7309
There is a simple command named file that will identify the file type, so renaming the archive will not protect you at all. Corrupting the archive may work, but you will need to know how to restore it (and that may cause problems). I would rather suggest another approach: protect your data with a password by creating an encrypted archive: http://how-to.linuxcareer.com/using-...ages-and-files
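For illustration (not from the linked guide; the paths, passphrase file, and directory names below are made up), a tar archive can be piped straight into openssl so the unencrypted archive never touches the disk:

```shell
set -e
# Illustrative data and passphrase file; adjust to your setup.
mkdir -p demo/data
echo "secret" > demo/data/file.txt
printf 'my-strong-passphrase' > demo/pass.txt

# Encrypt: tar -> gzip -> AES-256-CBC, all streamed through a pipe.
# (Newer OpenSSL releases also accept -pbkdf2 for stronger key derivation.)
tar -czf - -C demo data | openssl enc -aes-256-cbc -salt \
    -pass file:demo/pass.txt -out demo/backup.enc

# Decrypt and extract to verify the round trip.
mkdir -p demo/restore
openssl enc -d -aes-256-cbc -pass file:demo/pass.txt \
    -in demo/backup.enc | tar -xzf - -C demo/restore

cat demo/restore/data/file.txt
```

The encrypted file carries no tar.gz extension and no recognizable archive header, which also covers the "uninteresting file" idea from the original post.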
 
Old 05-05-2014, 03:35 AM   #3
gengisdave
Member
 
Registered: Dec 2013
Location: Turin, Italy
Distribution: slackware
Posts: 328

Rep: Reputation: 74
You can also use LUKS loopback encryption, but you have to set the disk size up front (so you have a space limit).

http://wiki.centos.org/HowTos/EncryptedFilesystem
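Roughly, the setup looks like this (a sketch along the lines of the wiki article; the container size and names are illustrative, and the cryptsetup/mount steps need root, so they are shown as comments):

```shell
set -e
# Create a fixed-size container file for the encrypted filesystem.
# Its size is the space limit mentioned above (100 MB here).
dd if=/dev/zero of=backup.img bs=1M count=100 status=none

# The remaining steps require root, so they are only sketched:
# cryptsetup luksFormat backup.img          # set a passphrase
# cryptsetup luksOpen backup.img backupvol  # map the container
# mkfs.ext4 /dev/mapper/backupvol           # create a filesystem in it
# mount /dev/mapper/backupvol /mnt/backup   # use it like any disk
# ...write backups into /mnt/backup...
# umount /mnt/backup && cryptsetup luksClose backupvol

ls -l backup.img
```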
 
Old 05-05-2014, 04:20 AM   #4
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1762
Quote:
Originally Posted by postcd
zip may not pack files larger than 2 GB
The ZIP file format handles files larger than 2 GB just fine (and has since 2001), as long as you are using a zip application that supports ZIP64. Anything that handles version 4.5 or higher of the zip specification should be fine. Unfortunately, many applications that claim zip support only implement version 2.0 of the specification.
 
Old 05-05-2014, 04:23 AM   #5
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1762
If you do go with zip, make sure your zip application can do AES encryption (part of the zip file specification since version 5.1). The traditional password-based symmetric zip encryption is flawed and easily broken.

Last edited by ruario; 05-05-2014 at 04:57 AM.
 
Old 05-05-2014, 04:33 AM   #6
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1762
If you do not care about retaining all Linux file metadata (e.g. ownership), you could use 7zip, which can do AES encryption in addition to compression.

Alternatively, you could use 7zip to create a zip archive rather than a 7z archive. Using the zip file format rather than the 7z format means you have more extraction options in the future.

7zip's zip support does not include InfoZIP's latest UID/GID extensions but will store some file metadata (i.e. permissions), whilst taking advantage of ZIP64, LZMA compression, and AES encryption. If all the files are owned by a single user, this might be good enough for you.

Here is an example command line to create such an archive:

Code:
7za a -p -tzip -mm=LZMA -mem=AES256 zip_file_name.zip files

Last edited by ruario; 05-05-2014 at 05:03 AM.
 
Old 05-05-2014, 03:01 PM   #7
postcd
Member
 
Registered: Oct 2013
Posts: 527

Original Poster
Rep: Reputation: Disabled
Question

- The files are not owned by one user. It's a backup of 50+ hosting clients' accounts; each client directory contains files and folders owned like clientname:clientname.

- openssl: good, but if I encrypt a 15 GB file, won't it take a long time? I don't want to gzip half a million files daily and then also encrypt the result with some slow method.

I need very fast creation of one file out of around half a million files, with some kind of protection at the same time, so that no simple thief can extract any data out of it. Please kindly advise a command. Thank you.

Last edited by postcd; 05-05-2014 at 03:02 PM.
 
Old 05-05-2014, 03:26 PM   #8
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1762
Any way you look at it, compressing and encrypting that much data will take a while. You can use a super fast compressor like lzop (or drop compression altogether) to gain some speed.
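A sketch of that kind of pipeline (the directory names and passphrase are illustrative; lzop is assumed to be installed, and the script falls back to gzip at its fastest setting if it is not):

```shell
set -e
mkdir -p fast/data
echo "payload" > fast/data/file.txt
printf 'passphrase' > fast/pass.txt

# Pick a fast compressor: lzop if available, otherwise gzip -1 as a
# slower stand-in with the same filter-style interface.
if command -v lzop >/dev/null 2>&1; then
    COMP="lzop"; DECOMP="lzop -d"
else
    COMP="gzip -1"; DECOMP="gzip -d"
fi

# tar -> fast compression -> AES encryption, all streamed.
tar -cf - -C fast data | $COMP | openssl enc -aes-256-cbc -salt \
    -pass file:fast/pass.txt -out fast/backup.enc

# Round trip to verify.
mkdir -p fast/restore
openssl enc -d -aes-256-cbc -pass file:fast/pass.txt -in fast/backup.enc \
    | $DECOMP | tar -xf - -C fast/restore
cat fast/restore/data/file.txt
```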

I take it you are set on a single archive container? If so, I would go with internal compression and encryption, i.e. on a per-file basis rather than spanning the entire archive. That way you only need to decompress or decrypt the files as you require them. It is also safer this way, as any corruption to the archive is more likely to be recoverable.

I would look at afio or dar. Afio is a tool with 24 years of heritage; dar is a more modern tool. I prefer afio personally, as its syntax is similar to cpio and it allows for arbitrary selection of compressor and encryptor. Dar has fairly complex command line options and is limited to the compression and encryption methods that are built in.

P.S. To expand on the safety aspect: compressing files one by one as they are added to an archive (internal compression), as is the case with zip, 7z, afio, or dar, is safer for critical backups than external compression (wrapping gzip, bzip2, lzop, etc. around an entire tar archive).

Consider that if a gzipped archive has a single bit corrupted near its start, the tar file stored within is effectively lost. This is because common compression algorithms depend on coherency over long sections of a file to achieve their results. If the file cannot be decompressed, none of the archive's contents can be extracted. Indeed, should you ever need to attempt recovery of an important gzipped file, read this to see exactly what is involved.
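This failure mode is easy to demonstrate (the file names are illustrative; the byte offset simply lands inside the compressed data, past gzip's 10-byte header):

```shell
set -e
mkdir -p corr
echo "hello" > corr/file.txt
tar -czf corr/archive.tar.gz -C corr file.txt

# Overwrite 4 bytes just past the gzip header, inside the deflate stream.
printf '\377\377\377\377' | \
    dd of=corr/archive.tar.gz bs=1 seek=12 count=4 conv=notrunc status=none

# Integrity test now fails, taking the whole tar with it.
if gzip -t corr/archive.tar.gz 2>/dev/null; then
    echo "unexpectedly intact"
else
    echo "archive unrecoverable"
fi
```

With per-file (internal) compression, the same corruption would cost at most the one affected member, not the whole archive.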

Last edited by ruario; 05-05-2014 at 11:55 PM. Reason: Added post script; s/command like/command line/
 
Old 05-05-2014, 11:14 PM   #9
replica9000
Senior Member
 
Registered: Jul 2006
Distribution: Debian Unstable
Posts: 1,126
Blog Entries: 2

Rep: Reputation: 260
What about compressing with tar.gz or txz on an ecryptfs directory? You simply mount one folder onto another with ecryptfs, and anything written to that folder will be encrypted.

example:
Code:
mount -t ecryptfs ~/.private ~/user-data
While mounted, anything under user-data will appear normal, while everything under .private will be encrypted. You can then copy the data from .private to your VPS. If you need to decrypt the data, just mount the same (or another) directory with the same passphrase and options that were used when encrypting it.
 
Old 05-05-2014, 11:52 PM   #10
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1762
Sounds like a plan. I still think a compressed tar is risky for important backups (as explained in my previous post). But the OP could forgo compression or use a format that supports internal compression (afio, dar, xar, etc.). Alternatively they could drop the container archive altogether and just recursively copy the files over, compressing them as needed.
 
Old 05-06-2014, 12:11 AM   #11
pan64
LQ Addict
 
Registered: Mar 2012
Location: Hungary
Distribution: debian/ubuntu/suse ...
Posts: 21,848

Rep: Reputation: 7309
You ought to make incremental backups; there is no need to store the same 15 GB again and again.
 
Old 05-06-2014, 04:31 AM   #12
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1762
Quote:
Originally Posted by pan64
You ought to make incremental backups; there is no need to store the same 15 GB again and again.
+1 to that.

Dar has built-in support for differential backups. For afio you would need to construct a find command that only finds newer files, and you would still have the issue of how best to deal with file deletions.
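The find-based approach can be sketched like this (tar stands in as the container here; afio or cpio could consume the same file list, and the stamp file and paths are illustrative):

```shell
set -e
mkdir -p inc/data
echo "old" > inc/data/old.txt

# Record the time of the last (full) backup.
touch inc/stamp
sleep 1
echo "new" > inc/data/new.txt

# Archive only files modified since the stamp; -newer selects files with
# a strictly later modification time.
find inc/data -type f -newer inc/stamp | tar -cf inc/incremental.tar -T -

# The incremental archive contains only the newer file.
tar -tf inc/incremental.tar
```

Note this catches additions and modifications but, as mentioned above, says nothing about deletions.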
 
Old 05-06-2014, 12:54 PM   #13
postcd
Member
 
Registered: Oct 2013
Posts: 527

Original Poster
Rep: Reputation: Disabled
I need a local incremental backup, and also a backup on an external server that cannot be read by an unauthorized person. How can I achieve this in one or two commands?
 
Old 05-07-2014, 12:14 AM   #14
pan64
LQ Addict
 
Registered: Mar 2012
Location: Hungary
Distribution: debian/ubuntu/suse ...
Posts: 21,848

Rep: Reputation: 7309
Just google and you will find a lot of different solutions. Here is a tip using rsync: http://www.maketecheasier.com/make-i...ps-with-rsync/
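The core trick in articles like that is hard-linked snapshots (rsync's --link-dest option). The underlying mechanism can be sketched with coreutils alone: cp -al makes a hard-link copy of the previous snapshot, after which only the changed files are replaced (directory names here are made up):

```shell
set -e
mkdir -p snap/src
echo "v1" > snap/src/a.txt
echo "v1" > snap/src/b.txt

# First full snapshot.
cp -a snap/src snap/backup.0

# Later, one file changes...
echo "v2" > snap/src/a.txt

# New snapshot: hard-link everything from the previous one (cheap, no
# extra disk space), then replace only the files that actually changed.
cp -al snap/backup.0 snap/backup.1
rm snap/backup.1/a.txt               # break the link so backup.0 keeps v1
cp -a snap/src/a.txt snap/backup.1/a.txt

cat snap/backup.0/a.txt   # old version is preserved
cat snap/backup.1/a.txt   # new version
```

rsync -a --link-dest automates exactly this (including the unlink-before-write step), so each daily snapshot only costs the space of the changed files.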
 
Old 05-07-2014, 08:36 AM   #15
postcd
Member
 
Registered: Oct 2013
Posts: 527

Original Poster
Rep: Reputation: Disabled
I don't know the best way to integrate encryption that causes a low load into the command.
 
  





