[SOLVED] BASH: limit a .tar.gz to a maximum file size
I am working on writing a backup script. This is a cloud environment: I have several web servers that rsync to a directory located in cloud storage. The cloud storage is mounted at /mnt/cloudstorage on all of my cloud servers, and it is a samba share (which I didn't get to choose).
I need to work out a script to back up the directory on cloud storage that all the web servers rsync to. I figure this will be a manual process, since changes to this directory will not be scheduled or automated. There are a few challenges: 1) I can only access the cloud storage from inside the cloud provider's network, and 2) I need to be able to download the backups and store them here locally.
So with that being said, I will run the backup script on one of the web servers and then use a tool like WinSCP to download the files. Since we do not have a box here that we can dedicate to storing these backups, I am going to need to put them on some external media. Since everyone here uses laptops, my first thought was either a USB drive or DVDs, so I want to limit the maximum file size to 4400MB to fit on a single-layer DVD. At home I have rar on all of my machines and I just make tar.rar files for my backups. However, rar is not standard on an installation of CentOS 5.3.
So, using only packages like tar, gzip, etc. that were installed with CentOS, how can I limit the size of the backup files to 4400MB?
I just figured out that using multi-volume won't work unless I am changing the destination media when it prompts. I guess split is the way to go.
Actually, there is:
-L, --tape-length=NUMBER change tape after writing NUMBER x 1024 bytes
When you use this option, you do not have to specify the multi-volume option, because it is implied. I have used it on my hard drive with no issues. However, I should point out that when I used it, the tar file never reached the limit specified by the -L (or --tape-length=NUMBER) option, but it might be worth a try for some users. ;-)
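As a rough sketch (the path /mnt/cloudstorage/sitedata is just a placeholder, not from the original post): GNU tar also accepts more than one -f for a multi-volume archive, which should let it roll over to the next volume file without stopping to prompt for new media.

# -L counts in units of 1024 bytes, so 4400MB = 4400 * 1024 = 4505600.
# With several -f options, tar uses each file in turn as the next volume:
tar -cML 4505600 -f backup.tar.1 -f backup.tar.2 -f backup.tar.3 /mnt/cloudstorage/sitedata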
Also, I am not sure, but I do not think you can use compression with this method, so the 'split' command, as suggested by another user in this thread, is a far-out and groovy thing as well, since you can use compression with that method. Can you dig it?
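Something along these lines should do it (again, the path and file names below are just placeholders): compress the stream on the fly, cut it into DVD-sized pieces, and join the pieces back together with cat when you need to restore.

# Stream the compressed archive straight into split, cutting every 4400MB;
# the trailing dot gives pieces named sitedata.tar.gz.aa, .ab, and so on:
tar czf - /mnt/cloudstorage/sitedata | split -b 4400m - sitedata.tar.gz.

# To restore, glue the pieces back together and extract in one pipeline
# (the shell glob expands in alphabetical order, so the pieces line up):
cat sitedata.tar.gz.* | tar xzf -

Nothing here beyond tar, gzip and coreutils, so it should all be on a stock CentOS 5.3 box.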