How to tgz all folders individually
I have a /data folder. Below that are 65 folders (that contain subfolders). Is there a slick command I could run to create 65 foldername.tgz files, so I have a tgz backup of each directory and all its contents individually? I'd like to avoid typing 65 individual commands :)
|
Yes.
Oh, you want the command... First set your working directory to the directory that contains the directories you want backed up, then run: Code:
find * -maxdepth 0 -type d -exec tar -czf '{}'.tgz '{}' ';'
That should do it; the result should be 65 .tgz files. If you want a brief log of the activity, you can add "-print" between the "-type d" and the "-exec". There is also a variant that does things in parallel (though it might not be good for all 65 at once...): Code:
find * -maxdepth 0 -type d -print | while read V; do
    tar -czf "$V".tgz "$V" &   # the trailing & backgrounds each tar so they run in parallel
done
It can be faster because the overlapped reads and writes get done in parallel, but it can also overload a single processor. In both cases, if you want the target .tgz files somewhere other than the working directory, include a path in front of the .tgz file name. |
Thanks. I think the parallel one may bring the server to its knees haha; they're all pretty big directories.
|
Possibly. I can see doing two to maybe four at a time without a problem, but coding up the waits is a bit of a pain.
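If GNU xargs is available, its -P option will do that job-limiting for you, with no hand-coded waits. A sketch, assuming GNU find and xargs (the 4 is just an example cap): Code:
find * -maxdepth 0 -type d -print0 | xargs -0 -P 4 -I '{}' tar -czf '{}'.tgz '{}'
The -print0/-0 pair keeps directory names with spaces intact, and xargs starts the next tar as each one finishes, never running more than four at once.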
Using "make" is an alternative, but creating the Makefile might be tricky.... (it has a nice -j option to limit the number of parallel actions). |
Understood. Thanks again for the help. Going to try it out tonight when the load on the server is lighter.
|
Screen is nice, but there are times it doesn't do well... and losing output is one of the things that can happen.
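If losing the output is the worry, one way to hedge is to detach the job with nohup and redirect everything to a log file; a sketch reusing the find command from above (backup.log is just an example name): Code:
nohup sh -c "find * -maxdepth 0 -type d -print -exec tar -czf '{}'.tgz '{}' ';'" > backup.log 2>&1 &
The run then survives a dropped session, and the log ends up in backup.log regardless of what screen does. |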
ha, thanks guys.
|