Linux - Software: This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
I would like to prepare an archive of directories used in common by many users, including root on a server.
The archive should be prepared by crontab and zip, if possible (I used these for the backup of the user's home directories).
My questions are:
- does this archive have to be prepared by su? (I suppose even root could not read all files there)
- if so, how can I prepare the superuser's crontab? (I already know how to prepare the users' crontabs.) How can I ensure that a shell has root rights?
- when su extracts files from such an archive, will the extracted files be owned by su (or root) or by the users who originally owned them? (the latter is a must)
- is it possible to keep the original access rights even when the files are in the archive, i.e.:
- each user should be enabled to extract his own files from the archive, and
- no user should be allowed to extract or read the files of other users for which he did not have at least read access rights originally?
So, is it possible to prepare such an archiving system?
I have been examining this question for two days now, and I begin to suspect that zip will not retain permissions, so I will have to use tar + gzip.
Yet, after reading the tar and gzip manuals and info pages several times, I have not been able to compress even a single file into a tar.gz (except for files I did not intend to include).
Could anyone please tell me which command I can use to pack one directory and its subdirectories into a single, specific tar.gz file?
Moreover: if I succeed, I will possibly have a huge archive containing thousands of files. Is it possible to update a tar.gz file with just the files that have changed, or does the entire archive have to be re-created? This makes a great difference: if the archives of all users have to be re-created just because one or two files have changed, the whole weekend will not be enough for the work and the server will be heavily loaded.
Thank you in advance for your help.
You're right, you need to use tar and gzip. Since the GNU tar shipped with most Linux distros includes gzip compression, we'll just use tar.
You configure root's crontab just the same as you do a user's: just su to root and run crontab -e.
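As a sketch, a weekly entry in root's crontab might look like the line below (the schedule and the /backup path are just examples, adjust them to your setup):

```shell
# Edit root's crontab after becoming root (su -, then crontab -e),
# and add a line like this to archive /home every Sunday at 02:00.
# The destination path /backup/home-weekly.tar.gz is hypothetical.
0 2 * * 0  tar -czpf /backup/home-weekly.tar.gz /home/
```

Since cron runs the job as root, the archive can include every user's files regardless of their permissions.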
The tar command would look something like:
tar -czf /somedir/backup.tar.gz /home/
You need not worry about the portability of your archive. Winzip will extract files and directories from a tar.gz file with no trouble.
In Unix (and, by extension, Linux), there are no files that root cannot read or modify, regardless of ownership or permissions.
The -c option means create a new archive; z specifies gzip compression; f instructs tar to use a file or device (as opposed to a stdin/stdout pipe).
You would extract that archive with command tar -xzf /somedir/backup.tar.gz
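Regarding your ownership question: adding -p (--preserve-permissions) makes tar restore the exact file modes on extraction, and when tar is run as root it also restores the original owners and groups by default, which is what you need for server restores. A quick self-contained check (the /tmp paths are just for the demo):

```shell
# Create a file with a specific mode, archive it, extract it elsewhere,
# and confirm the mode survived the round trip.
mkdir -p /tmp/demo/src
echo "data" > /tmp/demo/src/file.txt
chmod 640 /tmp/demo/src/file.txt

# -p restores exact permissions on extraction (ignoring the umask);
# run as root, tar also restores original owners (--same-owner is the default).
tar -czpf /tmp/demo/backup.tar.gz -C /tmp/demo src
mkdir -p /tmp/demo/out
tar -xzpf /tmp/demo/backup.tar.gz -C /tmp/demo/out

stat -c '%a' /tmp/demo/out/src/file.txt   # prints 640
```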
Note: If the archive will span GB, you may need to split the output.
tar -cz /home | split (split options that I'm too lazy to look up at the moment)...
This would create a series of files split at a certain size. To restore from such a beast:
cat files | tar -xz
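For the record, the split options in question are along these lines; here is a small round-trip demo (the /tmp paths and the 100 KB piece size are arbitrary, you would use something like 1 GB pieces for a real archive):

```shell
# Round-trip test of the split/cat approach on a small tree.
mkdir -p /tmp/splitdemo/data /tmp/splitdemo/restore
head -c 300000 /dev/urandom > /tmp/splitdemo/data/big.bin

# -b 100k cuts the compressed stream into 100 KB pieces
# named piece.aa, piece.ab, piece.ac, ...
tar -cz -C /tmp/splitdemo data | split -b 100k - /tmp/splitdemo/piece.

# The shell glob expands the pieces in lexical order, which matches
# split's naming, so cat reassembles the stream before tar unpacks it.
cat /tmp/splitdemo/piece.* | tar -xz -C /tmp/splitdemo/restore

cmp /tmp/splitdemo/data/big.bin /tmp/splitdemo/restore/data/big.bin && echo OK
```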
You would not be able to use the -u (update) option that way. In fact, tar cannot update a compressed archive at all, so -u only works on a plain, uncompressed .tar, which you can then gzip afterwards.
Also note that -u appends any changed files to the end of the archive but does not remove the old copies. This does not cause problems on restore (tar extracts members in order, so the later copy overwrites the earlier one), but the archive will continually grow in size as each update appends new copies of changed files.
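So one workable scheme, sketched below with hypothetical /tmp paths, is to keep the working archive uncompressed so -u can update it, and compress a copy afterwards (gzip -k, which keeps the original, needs a reasonably recent gzip):

```shell
# Initial archive of the data directory.
mkdir -p /tmp/upd/data
echo "one" > /tmp/upd/data/a.txt
tar -cf /tmp/upd/backup.tar -C /tmp/upd data

sleep 1                                  # ensure a strictly newer mtime
echo "two" > /tmp/upd/data/a.txt         # the file changes later

# -u appends only members newer than the copy already in the archive.
tar -uf /tmp/upd/backup.tar -C /tmp/upd data

# Compressed copy for storage: backup.tar.gz (-k keeps backup.tar for
# the next update round, -f overwrites any previous .gz).
gzip -kf /tmp/upd/backup.tar
```

On extraction, the later copy of data/a.txt overwrites the earlier one, so the restore sees the updated contents.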
And finally, to clarify: tar is used to pack multiple files into a single file; gzip (or bzip2) is used to compress a file. If all you want to do is compress a single file, the command is simply gzip filename.
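For example (the /tmp path is just for the demo; note that plain gzip replaces the original file with the .gz version):

```shell
echo "hello" > /tmp/gzdemo.txt
gzip -f /tmp/gzdemo.txt        # creates /tmp/gzdemo.txt.gz, original is gone
gunzip -f /tmp/gzdemo.txt.gz   # restores /tmp/gzdemo.txt, removes the .gz
```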