how to compress ~130GB folder and erase original files at the same time
Hello, I'm running FC6 on a dual-core 2.8GHz Pentium D machine with 3.5GB RAM. I have about 8GB free on a 250GB hard disk, and one folder is taking up about 130GB (almost entirely text data files). To free up some space, I want to compress this folder and get rid of the original files. But since the archive will likely be larger than the 8GB of free space, I can't simply zip everything and then remove the originals. Is there a command that compresses and removes the original files on the fly? Or do I need to write a shell script that takes one file at a time, adds it to an archive, and deletes the original (I'm guessing this would take forever...)?
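One thing worth knowing: GNU tar can delete each input file as soon as it has been written into the archive, which keeps peak disk usage down because space is reclaimed incrementally. A minimal sketch (the folder name is illustrative; this needs GNU tar, BSD tar lacks the flag):

```shell
# Build a compressed archive and delete each file once it has been
# written into the archive (GNU tar's --remove-files option).
# Caveat: if tar dies partway through (e.g. out of space), files
# already archived are gone from disk and only exist in the
# possibly-incomplete archive.
tar --remove-files -czf bigfolder.tar.gz bigfolder/
```

Because of that caveat, the per-file gzip approach discussed below is the safer route on a nearly full disk.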
Why not just gzip the individual files one at a time:
cd <dir>
for file in *
do
    gzip "$file"
done
This will compress each "file" into a gzipped "file.gz" and then remove the original "file".
You really wouldn't want to delete an original file until its compressed copy is complete. What happens if the compression doesn't work (because you ran out of space, for example)? You'd lose both the original and the in-progress .gz file. The loop above takes care of that: gzip deletes the original only after the .gz file has been written successfully.
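One limitation of the loop above is that `for file in *` only touches the top level of the directory. If the folder has subdirectories, a recursive variant (a sketch, assuming the `find` from GNU findutils) would be:

```shell
# Recursively gzip every regular file under the directory.
# gzip replaces each file with file.gz in place, and deletes the
# original only after the compressed copy is complete, so the
# safety property described above still holds.
find bigfolder -type f -exec gzip {} +
```

The `{} +` form passes many filenames to each gzip invocation instead of forking one process per file, which matters when there are thousands of data files.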
I think it's easier and safer to have lots of excess space. Buy a big external USB drive if you have to (storage is cheap: http://www.newegg.com/Store/SubCateg...al-Hard-Drives ). Create your .tar.gz archive, check it, make a backup, and then delete the original files.
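The "check it" step can be as simple as asking tar to read the whole archive back before anything is deleted. A sketch, assuming the external drive is mounted at /mnt/usb (adjust the paths to your setup):

```shell
# Assumed mount point of the external drive; override as needed.
DEST=${DEST:-/mnt/usb}

# Build the archive on the external drive, then verify it is
# readable end to end; only delete originals if both steps succeed.
tar -czf "$DEST/bigfolder.tar.gz" bigfolder/ \
  && tar -tzf "$DEST/bigfolder.tar.gz" > /dev/null \
  && echo "archive verified; safe to delete originals"
```

`tar -tzf` lists every entry, which forces the whole compressed stream to be read and decompressed, so a truncated or corrupt archive will fail the check before you remove anything.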