Advanced file tree archiver needed
I need to archive some big file trees. The catch is that these trees contain many socket files, which tar cannot archive, and they also contain files larger than 4GB (the largest is 115GB), which cpio cannot handle. The archives also need to be streamed over the network while being generated, so zip is not an option. Any ideas before I go off and design a new and better archiving format?
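A quick way to confirm the socket limitation (a sketch, assuming GNU tar and python3 are available; the "socket ignored" warning text is GNU tar's, and other tar implementations may word it differently):

```shell
# Create a directory containing a Unix-domain socket, then try to tar it.
mkdir -p /tmp/sockdemo
rm -f /tmp/sockdemo/demo.sock
python3 -c "import socket; socket.socket(socket.AF_UNIX).bind('/tmp/sockdemo/demo.sock')"

# GNU tar warns that the socket was ignored and omits it from the archive.
tar -cf /tmp/sockdemo.tar -C /tmp sockdemo 2>/tmp/tar-errs.txt || true
cat /tmp/tar-errs.txt

# Listing the archive shows the socket entry is missing.
tar -tf /tmp/sockdemo.tar
```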
|
dar is competent. What's special about streaming over a network?
EDIT: IDK if dar is able to handle the specific requirements you mentioned, but it might be worth a look. Regarding "streaming over a network": could that mean writing to a networked file system? |
Quote:
I was not able to determine whether it supports file sizes beyond 4GB and/or sockets. Its inability to write to stdout is a showstopper, so, like a program that detects an error and exits immediately, I quit looking at it any further.
EDIT: DAR is stated in Wikipedia's Comparison_of_archive_formats to be based on TAR. Thus, it may not support sockets. |
Thanks for further information.
dar can write to stdout. From the dar man page (my bolding): Code:
-c, --create [<path>/]<basename> |
Quote:
Code:
( cd /some/test/data && dar -c - ) | ( cd /where/to/save && dar -x - ) |
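The dar pipeline above uses the same stdout/stdin streaming pattern familiar from tar. For comparison, a minimal tar version of the same pipe (tar shown here only to illustrate the pattern — it still fails the socket requirement):

```shell
mkdir -p /tmp/stream-src /tmp/stream-dst
echo "payload" > /tmp/stream-src/file.txt

# "-f -" writes the archive to stdout on the producer side and reads it from
# stdin on the consumer side; nothing touches disk in between, so the same
# pipe can be pointed over the network (e.g. through ssh).
( cd /tmp/stream-src && tar -cf - . ) | ( cd /tmp/stream-dst && tar -xf - )
cat /tmp/stream-dst/file.txt
```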
Quote:
|
Quote:
The commands "dar -l -" and "dar -x -" do seem to try to work, but they give error messages: Code:
lorentz/root /root 85# ( cd /home/tmp/test-data-to-send && dar -c - ) | ( cd /home/tmp/area-to-receive-data && dar -l - )
I'm already starting the process of designing a new portable and open file collection archive/stream format. |