LinuxQuestions.org > Forums > Linux Forums > Linux - General
Old 10-22-2014, 02:49 PM   #1
Completely Clueless
Member
 
Registered: Mar 2008
Location: Marbella, Spain
Distribution: Many and various...
Posts: 899

Rep: Reputation: 70
Imaging a Linux partition


Hi all,

Has a reliable & stable utility been developed yet that allows one to image only the used portion of a Linux partition?
Just wondering if there's anything out there now that saves one having to waste storage and time on saving an entire partition (or drive) when oftentimes 80% is free space we don't need to waste resources on backing up.
Any suggestions?
Thanks,
cc.
 
Old 10-22-2014, 04:30 PM   #2
littleball
Member
 
Registered: Jan 2011
Distribution: Slackware, Red Hat Enterprise
Posts: 47

Rep: Reputation: 8
I don't know if such a dedicated utility exists, but I do know that what you're asking for is possible using rsync to mirror data locally or remotely. That way you copy just the used portion of the Linux disk, and the rest of the disk (if space is left over) stays free for whatever you want. It's not difficult, if you're interested:

http://www.linuxquestions.org/linux/...etween_servers

Keep in mind that if you want to sync 2 different servers, you will need a dedicated NIC for them with at least 1 Gb/s of bandwidth. If it's local disks, you need to consider how to limit buffer-cache I/O requests so you don't overload your server; some Google searching can help with that.

 
Old 10-22-2014, 05:21 PM   #3
syg00
LQ Veteran
 
Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 21,126

Rep: Reputation: 4120
To do this the imager needs to be filesystem-aware, so why not use a proper filesystem-aware tool yourself?
partimage is a good example, but it doesn't support ext4 or btrfs. No good to me, or likely to a large proportion of current Linux users.
 
Old 10-23-2014, 04:43 PM   #4
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,980

Rep: Reputation: 3625
I like to use gparted too. Click and copy.

One reason to make separate partitions at install time is precisely that it makes quick data backups easier; a dedicated partition for /home is simpler to copy. Whatever tool you pick, the choices are the same: file by file or bit by bit. You can also copy mount points that aren't partitions with a wide range of tools. tar, rsync, cpio, and many others work file by file. Generally one pipes their output through some compression to cut transfer time and final archive size.
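As a concrete sketch of the file-by-file-plus-compression approach just described (the directory here is a scratch stand-in for a real mount point like /home):

```shell
# Archive a directory file-by-file and compress in one pipeline.
SRCDIR=$(mktemp -d)
echo "some config" > "$SRCDIR/settings.conf"

# -C changes into the directory first so the archive holds relative
# paths; piping through gzip compresses redundant data considerably.
tar -C "$SRCDIR" -cf - . | gzip > /tmp/backup.tar.gz

# List the archive contents to verify what was captured:
tar -tzf /tmp/backup.tar.gz
```

The same pipeline works with cpio or with other compressors (xz, bzip2) swapped in for gzip.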
 
Old 10-23-2014, 05:47 PM   #5
rknichols
Senior Member
 
Registered: Aug 2009
Distribution: Rocky Linux
Posts: 4,779

Rep: Reputation: 2212
Clonezilla is often used for that, and it supports a wide variety of filesystems. Clonezilla uses partclone internally, and you can use partclone directly if you don't want or like the Clonezilla wrapper.
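Since partclone operates directly on block devices and needs root, the following is an illustrative sketch only (not run here); /dev/sdXN is a placeholder for your real partition:

```shell
# Illustrative only -- partclone reads the filesystem metadata and
# copies just the used blocks, skipping free space entirely.
#
# Clone the used blocks of an ext4 partition to a compressed image:
#   partclone.ext4 -c -s /dev/sdXN -o - | gzip > sdXN.img.gz
#
# Restore it later onto a partition of at least the same size:
#   gunzip -c sdXN.img.gz | partclone.ext4 -r -s - -o /dev/sdXN
```

There is one partclone front-end per filesystem type (partclone.ext4, partclone.ntfs, partclone.btrfs, and so on); the partition should be unmounted while cloning.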

Last edited by rknichols; 10-23-2014 at 05:50 PM.
 
Old 10-24-2014, 07:50 AM   #6
Completely Clueless
Member
 
Registered: Mar 2008
Location: Marbella, Spain
Distribution: Many and various...
Posts: 899

Original Poster
Rep: Reputation: 70

Quote:
Originally Posted by rknichols View Post
Clonezilla is often used for that, and it supports a wide variety of filesystems. Clonezilla uses partclone internally, and you can use partclone directly if you don't want or like the Clonezilla wrapper.
I got fed up with Clonezilla. It never seemed to improve between versions; really obvious things, like options that go nowhere and trap the user in endless loops, were never tidied up. I went back to dd>tar in the end out of sheer exasperation. It gets the job done (eventually), but you end up with the wasted-space problem. Arghh!!!
 
Old 10-24-2014, 03:08 PM   #7
ron7000
Member
 
Registered: Nov 2007
Location: CT
Posts: 248

Rep: Reputation: 26
You don't save the entire partition; you only save the filesystem contents.

For example:
/dev/sda1 = FAT, 212 MB; this is the boot partition if you are using EFI/ELILO and not GRUB.
/dev/sda2 = ext3, 200 GB; mounted as the root filesystem /.
/dev/sda3 = ext3, 100 GB; just for kicks, a separate partition mounted as /home. It can be whatever.
The list goes on and on.

Take any filesystem, say your /home on sda3: the partition is 100 GB, but you're only using 3 GB of it.
Why save the entire 100 GB partition when only 3% of it holds anything useful? You don't.
You use the 'tar' command to pack the filesystem contents into a single file and save just that, like so:
"cd /home"
"tar -cf /root/myhome.tar ."
(Using "." rather than "*" also picks up dot files.) This creates a file about 3 GB in size called myhome.tar in the /root folder, on a different partition.
The catch, obviously, is that if you want to capture everything on a partition (the data, not every byte including free space), you shouldn't write the archive file to the same partition you're archiving.
And when you want to archive the root filesystem, you can't do it from the running system itself; you need to slave the disk to another running system (or boot from live media) and mount it there.

Once you have your archive file, which holds just the contents of the partition, you can also gzip the tar file to compress it.
When you're ready to restore, all you do is make a new partition somewhere (or reformat one), mount it, and then:
"cd /home"
"tar -xf /root/myhome.tar" {following my previous tar example}

One note: the tar archive itself doesn't care what filesystem it's restored onto. tar works at the file level, so an archive made on ext3 can be extracted onto ext4, XFS, or anything else. What you do need to watch when cloning a root filesystem to a new partition is everything that references the filesystem type or layout: /etc/fstab, the bootloader configuration, and things like extended attributes or SELinux labels (which need extra tar flags to preserve).


and if your file system is XFS, that file system comes with xfsdump and xfsrestore which does the same thing as i described, but i believe allows you to create the .xfsdump file on the same partition you are archiving even if it's the root folder of a running system.
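Putting the create-and-restore steps above together on a throwaway directory (all paths here are invented for the demo; in real use the destination would be a freshly made, mounted filesystem):

```shell
# Create some sample "home" data:
HOME_SRC=$(mktemp -d)
mkdir -p "$HOME_SRC/user"
echo "hello" > "$HOME_SRC/user/notes.txt"

# Archive it with relative paths, as in the cd /home; tar -cf ... example:
tar -C "$HOME_SRC" -cf /tmp/myhome.tar .

# "Restore" into a fresh directory standing in for the new partition:
HOME_DST=$(mktemp -d)
tar -C "$HOME_DST" -xf /tmp/myhome.tar

cat "$HOME_DST/user/notes.txt"
```

To preserve ownership when restoring as root, add `-p` (and `--xattrs --selinux` on systems that use them).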

Last edited by ron7000; 10-24-2014 at 03:13 PM.
 
Old 10-29-2014, 12:34 PM   #8
Completely Clueless
Member
 
Registered: Mar 2008
Location: Marbella, Spain
Distribution: Many and various...
Posts: 899

Original Poster
Rep: Reputation: 70
Thank you, Ron, but I've already hit on the solution of using dd to write a huge file full of zeroes that fills up the unused space on the drive (or partition), then deleting this file. So although I'm still backing up the entire drive (or partition), if I pipe it through gzip (or some other compression utility), all those GBs of contiguous zeroes compress to next to nothing. Using this method, I find a 122 GB partition with 3 GB of system on it comes out at about 2.5 GB of archive. I'm totally happy with that. And it's independent of filesystems!
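The zero-fill trick can be demonstrated on a scratch file (on a real system you would write the zero file onto the mounted partition until the disk fills, delete it, then dd the whole device through gzip):

```shell
# Simulate "free space" with 10 MiB of zeros, as the zero-fill file
# would contain, then show how well they compress.
dd if=/dev/zero of=/tmp/zerofill bs=1M count=10 2>/dev/null

gzip -c /tmp/zerofill > /tmp/zerofill.gz

# Compare sizes: runs of zeros shrink to a tiny fraction of the original.
wc -c < /tmp/zerofill
wc -c < /tmp/zerofill.gz
```

The real-partition sequence is roughly: `dd if=/dev/zero of=/mnt/zerofill bs=1M` until it stops with "no space left", `rm /mnt/zerofill`, unmount, then `dd if=/dev/sdXN | gzip > image.gz` (device name is a placeholder).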
 
1 member found this post helpful.
Old 10-31-2014, 07:19 AM   #9
rtmistler
Moderator
 
Registered: Mar 2011
Location: USA
Distribution: MINT Debian, Angstrom, SUSE, Ubuntu, Debian
Posts: 9,882
Blog Entries: 13

Rep: Reputation: 4930
Quote:
Originally Posted by Completely Clueless View Post
Thank you, Ron, but I've already hit on the solution of using dd to write a huge file full of zeroes that fills up the unused space on the drive (or partition), then deleting this file. So although I'm still backing up the entire drive (or partition), if I pipe it through gzip (or some other compression utility), all those GBs of contiguous zeroes compress to next to nothing. Using this method, I find a 122 GB partition with 3 GB of system on it comes out at about 2.5 GB of archive. I'm totally happy with that. And it's independent of filesystems!
Glad you got there. Reading the thread, I kept thinking: why not just use dd? Only you'd have to be well versed not just in the partitions but also in the blocks assigned to them. Looks like you beat me to the punchline.
 