Old 09-05-2005, 06:09 AM   #1
recce101
LQ Newbie
 
Registered: Sep 2005
Distribution: Ubuntu
Posts: 2

Rep: Reputation: 0
Split large ntfsclone images?


I'm attempting to back up a large NTFS partition with ntfsclone, saving the image file to a FAT32 USB external hard drive. The NTFS partition is about 18 GB total with just under 5 GB of data. Saving in the ntfsclone special image format fails because of the FAT32 4 GB file size limit. Using ntfsclone with gzip compression...

cd /mnt/maxtor/backup/home/win2000

ntfsclone -s -o - /dev/hda1 | gzip -c > 050904.img.gz

produces a smaller file, but the compressed data is reported to be invalid when checked by the ntfsclone test option (this has happened repeatedly).

I have successfully backed up and restored NTFS partitions with less than 4 GB of data using the ntfsclone special image format. For partitions with more than 4 GB of data, is it possible to use the special image format and split the output into two or more smaller (less than 4 GB) files? If so, what would be the command syntax for backup and for restore?

Thank you!
 
Old 09-05-2005, 06:31 AM   #2
Electro
LQ Guru
 
Registered: Jan 2002
Posts: 6,042

Rep: Reputation: Disabled
You can combine the split utility with gzip. Another utility you can use is tar, without the need for split. On FAT32 I suggest splitting at every gigabyte.
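A minimal sketch of the gzip-plus-split approach described above, assuming the backup directory from the original post, 1 GB pieces, and an illustrative 050904.img.gz.part_ prefix:

cd /mnt/maxtor/backup/home/win2000

# save the special image, compress it, and cut the stream into 1 GB pieces
ntfsclone -s -o - /dev/hda1 | gzip -c | split -b 1000m - 050904.img.gz.part_

# check the compressed stream by joining the pieces (glob expands in order) and testing with gzip
cat 050904.img.gz.part_* | gzip -t

# restore: join the pieces, decompress, and feed the image back to ntfsclone
# (-r is --restore-image, -O is --overwrite, "-" reads the image from stdin)
cat 050904.img.gz.part_* | gunzip -c | ntfsclone -r -O /dev/hda1 -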
 
Old 09-05-2005, 11:20 AM   #3
recce101
LQ Newbie
 
Registered: Sep 2005
Distribution: Ubuntu
Posts: 2

Original Poster
Rep: Reputation: 0
Quote:
Originally posted by Electro
You can combine the split utility with gzip. Another utility you can use is tar, without the need for split. On FAT32 I suggest splitting at every gigabyte.
My basic question is, can I use split without compression, i.e., split the ntfsclone "special image format" (which from what I read is the preferred format)?
 
Old 09-05-2005, 11:53 AM   #4
Snowbat
Member
 
Registered: Jun 2005
Location: q3dm7
Distribution: Mandriva 2010.0 x86_64
Posts: 338

Rep: Reputation: 31
Of course:
ntfsclone -s -o - /dev/hda1 | split -b 4000m -

You can use cat to join the files later. See http://www.linuxquestions.org/questi...icle&artid=227

Last edited by Snowbat; 09-05-2005 at 11:55 AM.
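For the uncompressed special-image route the original poster asked about, a sketch of both directions, assuming split's default xaa, xab, ... output names and that those pieces are the only files matching x?? in the backup directory:

cd /mnt/maxtor/backup/home/win2000

# back up: write the special image in pieces small enough for FAT32
ntfsclone -s -o - /dev/hda1 | split -b 4000m -

# restore: join the pieces in order and pipe the image back to ntfsclone
# (-r is --restore-image, -O is --overwrite, "-" reads the image from stdin)
cat x?? | ntfsclone -r -O /dev/hda1 -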
 
  

