LinuxQuestions.org (/questions/)
-   Linux - Software (http://www.linuxquestions.org/questions/linux-software-2/)
-   -   Splitting/Merging Large Files (http://www.linuxquestions.org/questions/linux-software-2/splitting-merging-large-files-946913/)

thund3rstruck 05-25-2012 08:14 PM

Splitting/Merging Large Files
 
I was knee deep in creating a Clonezilla image when it occurred to me that Clonezilla is not capable of creating split images (to fit on DVD+R DL discs). So I ended up cloning the 40GB disk image to a USB disk.

I need that 40GB of space back by getting this thing onto DVDs for permanent storage. Someone suggested using split/cat to create 8GB parts of the image and then burning each to DVD, but this seems a little dangerous.

I mean, the copy [file] > [file] command in Windows is ridiculously prone to corruption. On Windows I used to use a program called QuickPar that would split large files and create recovery pars that could be used to recover from data corruption.

Can someone recommend a safe, effective, and highly reliable way on Linux to split this image file into DVD+R DL (8GB) sized chunks, and obviously merge them back when the time comes to restore the system (which I intend to do once a year or so)?
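One sizing caveat on the 8GB target: a DVD+R DL nominally holds 8,547,993,600 bytes, which is slightly *less* than 8 GiB, so a split chunk size of exactly 8G (split's G suffix means GiB) would overshoot the disc by about 40 MB. A quick arithmetic check (pure shell math, no large files needed; the 8100m figure is just one workable choice, not the only one):

```shell
# DVD+R DL capacity in bytes (the nominal "8.5 GB" figure)
dl_bytes=8547993600

# What a chunk size of "8G" actually means to split: 8 GiB
gib8=$((8 * 1024 * 1024 * 1024))

# The 8 GiB chunk would overshoot the disc by roughly 40 MB
overshoot=$((gib8 - dl_bytes))
echo "overshoot: $overshoot bytes"

# A chunk size that fits with a little headroom, e.g. "split -b 8100m"
fit=$((8100 * 1024 * 1024))
[ "$fit" -le "$dl_bytes" ] && echo "8100m fits on a DL disc"
```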

foodown 05-25-2012 08:54 PM

Code:

split -b 8G input_file output_file_prefix
That will reliably divide the file into smaller chunks for you.

Code:

cat output_file_prefixaa > restored_full_file
cat output_file_prefixab >> restored_full_file
cat output_file_prefixac >> restored_full_file

(etc)

That will reliably restore the original file.

Test it out. It works.

It's not the Windows 'copy' command ... It's split and cat.
Nerds have been chunking out big ol' tar files and putting them back together with them for decades; they're good to go.

Note the use of '>>' for everything after the first chunk. That's important, because it specifies that the stream is appended to the end of the restored file. The '>' will just overwrite what's already there, and that's not what you want.
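The split/cat round trip above can be sketched end to end, with checksums added to catch exactly the kind of corruption the OP is worried about. This uses a small throwaway file as a stand-in for the 40GB image; all file names are illustrative:

```shell
# Small stand-in for the 40GB image (real run: start from the image file)
dd if=/dev/urandom of=image.img bs=1M count=4 status=none

# Checksum the original so the restore can be verified later
sha256sum image.img > image.sha256

# Split into 1 MiB chunks (use a DVD-sized -b value for the real image)
split -b 1M image.img image.part.

# Per-part checksums let a single bad disc be identified after burning
sha256sum image.part.* > parts.sha256

# Glob expansion is lexical, so the aa/ab/... suffixes concatenate in order
cat image.part.* > restored.img

# Byte-for-byte comparison confirms a clean round trip
cmp -s image.img restored.img && echo "restore OK"
```

Before restoring from the discs, `sha256sum -c parts.sha256` over the copied parts will point at any chunk that didn't survive the burn.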

gnunandakumar 05-26-2012 08:32 AM

Hoping someone can suggest a GUI application for this purpose.

thund3rstruck 05-26-2012 03:40 PM

Quote:

Originally Posted by foodown (Post 4687835)
Code:

split -b 8G input_file output_file_prefix
That will reliably divide the file into smaller chunks for you. [...]

Thanks! I'll give it a shot then.


All times are GMT -5.