Old 04-20-2009, 01:22 PM   #1
LQ Newbie
Registered: Apr 2009
Distribution: CentOS 5.3, Ubuntu 8.04 LTS
Posts: 4

Rep: Reputation: 0
how to compress ~130GB folder and erase original files at the same time

Hello, I'm running FC6 on a dual-core 2.8GHz Pentium D machine with 3.5GB RAM. I have about 8GB free on a 250GB hard disk, with one folder taking up about 130GB (almost entirely text data files). To free up some space, I want to compress this folder and get rid of the original files. But since the zip file will likely be larger than the 8GB of free space, I can't simply zip it and then remove the originals. Is there a command to compress and remove the original files on the fly? Or would I need to write a shell script that takes one file at a time, adds it to an archive, and deletes the original (I'm guessing this would take forever...)?

Thanks in advance.
Old 04-20-2009, 01:58 PM   #2
LQ Guru
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,499
Blog Entries: 15

Rep: Reputation: 1463
Why not just gzip the individual files one at a time:

cd <dir>
for file in *
do gzip "$file"
done

This compresses each "file" into a gzipped "file.gz" and then removes the original "file".

You really wouldn't want to delete the original file until the compressed file is complete. What happens if the compression fails (because you ran out of space, for example)? You'd lose both the original and the in-progress .gz file. The loop above handles this safely, because gzip only deletes the original once the .gz file has been written successfully.
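If the folder also contains subdirectories, a `find`-based variant of the same idea (a sketch, assuming a standard gzip is on the PATH) compresses every file recursively, still one at a time:

```shell
# Compress every regular file under the current directory, one at a time,
# skipping files that are already gzipped. gzip replaces each file with
# file.gz only after the compressed copy is written successfully, so a
# failure (e.g. disk full) leaves that original intact.
find . -type f ! -name '*.gz' -exec gzip -v {} \;
```

Because each file is handled independently, the peak extra disk space needed at any moment is only the size of the single file currently being compressed, not the whole 130GB.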
Old 04-20-2009, 04:57 PM   #3
LQ Newbie
Registered: Apr 2009
Distribution: CentOS 5.3, Ubuntu 8.04 LTS
Posts: 4

Original Poster
Rep: Reputation: 0
This is definitely a most interesting solution that I had not thought of. Thank you very much.
Old 04-20-2009, 06:48 PM   #4
LQ Veteran
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Arch/XFCE
Posts: 17,802

Rep: Reputation: 738
I think it's easier and safer to have lots of excess space----buy a big external USB drive if you have to (storage is cheap). Create your .tar.gz archive, check it, make a backup, and then delete the original stuff.
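One possible sequence along those lines (a sketch only; the paths below are placeholders for your external drive mount point and data folder):

```shell
# Archive the folder onto the external drive, then verify before deleting.
tar czf /mnt/usb/data.tar.gz /path/to/data   # create the compressed archive
gzip -t /mnt/usb/data.tar.gz                 # verify the gzip layer is intact
tar tzf /mnt/usb/data.tar.gz > /dev/null     # confirm tar can list every member
# Only after both checks pass:
rm -rf /path/to/data
```

The two verification steps are cheap compared to losing 130GB of data, and the backup copy means a later disk failure on either drive is still recoverable.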


Tags: compression, folder
