Linux - Desktop: This forum is for the discussion of all Linux software used in a desktop context.
The hassle is getting a single file, in whatever state it was backed up, out of all those awesome backup solutions.
And they all store plain data. When the hard drive fails, you have a hard drive with your data on it. Anyone can read it.
I bought a used laptop and could read all the data from the previous Windows 10 owner, because he was an Apple user and was too clueless to wipe the data properly: CV, bank details, online accounts, bills, Koran study notes (no comment on that), full Samsung phone backups. Well, someone who is 22 with 40,000 euros of debt to the bank, and a failed business, is an idiot anyway.
I prefer my copy approach with a known state: just plug in my backup drive, redo the bootloader, and use it.
All those backup solutions look really nice from the outside. I used a few in the past: complicated man pages, weird on-disk structure, and such. And when you want to pull out a single file, it is a headache.
Not to forget: they are not full-disk backups, they are just data backups. So with only a single machine, you somehow need to set up the operating system first.
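Roughly, the "plug in the backup drive and redo the bootloader" step from a live CD could look like this; device names and mount points here are made up, and this assumes a BIOS/GRUB setup, so adjust to your own layout:

```shell
# From a live CD, assuming the backup drive shows up as /dev/sdb
# with root on sdb2 and a separate boot partition on sdb1.
mount /dev/sdb2 /mnt
mount /dev/sdb1 /mnt/boot

# bind the pseudo-filesystems so grub tools work inside the chroot
for d in dev proc sys; do mount --bind /$d /mnt/$d; done

chroot /mnt grub-install /dev/sdb
chroot /mnt grub-mkconfig -o /boot/grub/grub.cfg
```

On a UEFI machine the steps differ (ESP mount, efibootmgr), but the chroot idea is the same.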
--
Acronis True Image is the best example of how not to do it.
You need a working system, it only works on certain operating systems, it is not upward compatible, and what gets backed up is a binary blob file.
I went through a few rsync-based tools, and I was not amazed at how easy they were to use. Complicated stuff from programmers for programmers, not for an end user who wants an easy solution.
And all those guides: you must not back up /proc and other folders, this and that.
Maybe the Gentoo geeks were idiots in the past, or I am an idiot, but I could not grasp the core principle of how to use them easily and effectively.
--
I moved my physical extents several times while I was using that box in the past; that is why I started using LVM. LVM2 is very complicated, but I managed it, and it is worth it. An installation without LVM is flawed, in my view.
Then I realized: broken drive. Hmm, the data is still on it? Maybe encryption. That also took me a while, but I had several Gentoo boxes with my custom adapted initramfs.
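Moving extents to a new drive is the classic LVM2 win. A sketch only, with hypothetical names (vg0, /dev/sda1, /dev/sdb1); it needs root and a real volume group:

```shell
pvcreate /dev/sdb1          # prepare the new disk as a physical volume
vgextend vg0 /dev/sdb1      # add it to the existing volume group
pvmove /dev/sda1 /dev/sdb1  # migrate all extents off the old disk, online
vgreduce vg0 /dev/sda1      # drop the old disk from the group
pvremove /dev/sda1          # wipe the LVM label from it
```

The filesystem stays mounted the whole time; pvmove works underneath it.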
Linux is not like Windows, where you plug in a CD and download lots of binary crap from the NSA.
--
Acronis True Image is that kind of approach for newbies. In the past, reinstalling Windows took ~3 hours or more; even Windows 10 on an Asus G75VW (SSD, i7-3610QM, 16 GB DDR3L-1600) took ~1-2 hours until it was finally usable, with so many stupid dialogs.
Then I had to install Acronis True Image from a CD, then plug in the drive, then pull all my data from an external USB 2.0 case.
I figure a whole day was lost on that procedure: Acronis True Image, Windows, external USB 2.0 case with an IDE HDD.
--
Honestly, I need just 10 minutes to change the drive. I do it slowly so I do not cause any defects, loose screws, and such.
20 minutes for the OS when I am very slow; rebooting the live CD takes ages, then some commands and I am done.
A backup takes an hour, including swapping the drives. I usually have 60 GiB of data, which I copy from the SSD to an external SSD over UASP USB 3.0. That includes redoing the filesystems, booting the slow live CD, and such.
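The "redo the filesystems and copy" part, stripped to its bones, is something like the sketch below. Device names are hypothetical, it needs root on a live CD, and the LUKS/LVM layers discussed elsewhere in this thread are omitted for clarity:

```shell
mkfs.ext4 /dev/sdb2            # fresh filesystem on the backup SSD
mkdir -p /mnt/src /mnt/dst
mount /dev/sda2 /mnt/src       # internal drive
mount /dev/sdb2 /mnt/dst       # external drive

cp -a /mnt/src/. /mnt/dst/     # preserve owners, modes, timestamps, links
sync                           # flush before pulling the drive
```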
--
A backup takes time, and a backup needs to be verified immediately. I verify it every time: the notebook drive goes on the shelf, the external drive is put in my notebook and made bootable, and I see instantly whether it works or not.
A backup needs to be labeled accordingly; I do it with a bit of tape and a handwritten note.
A backup needs to be understandable. A backup needs to be instantly usable.
--
Those rsync-based tools are just for guys who have 100 or 1000 boxes running and just need to save some files.
Worthless for the end user.
--
I had to wipe some Acronis backup files. I could not install the software anymore, so the data is gone.
--
My backup just needs any Linux with lvm2 / LUKS / amd64 / bash; e.g. the SystemRescue live CD is just one choice.
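Bringing such a backup up from any live CD that ships cryptsetup and lvm2 is a short sequence. Device and volume names below are hypothetical:

```shell
# Unlock the encrypted partition; creates /dev/mapper/backup.
cryptsetup luksOpen /dev/sdb2 backup

# Find and activate any volume groups living inside it.
vgscan
vgchange -ay

# Mount the logical volume (name depends on your vg/lv layout).
mount /dev/mapper/vg0-root /mnt
```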
--
My backup is a bit complicated because I care about data security; without lvm2 and LUKS it would be much simpler and faster.
Cons: 3x 60 euros for 120 or 128 GB SATA SSDs. One is in use, two are on the shelf.
Of course I also have a very old 1 TB 2.5" SATA drive in use, but that is just for junk like Gentoo distfiles and some other things I do not care about.
---
Maybe Git could be used for the whole drive, but I never really bothered reading the docs.
Git is quite widely used, so no wonder if Git could speed things up.
If I understood it correctly, you have some snapshots with timestamps and just sync to the next one.
I will agree that with all the others I have seen or used, except perhaps Clonezilla, all you have are files (not necessarily data). But then that is all I want: if I am restoring a backup, I may very well NOT want the partition table, etc., but only the various files and programs.

As for your other points: if you are concerned about encryption, there is always VeraCrypt, which can encrypt files, partitions, or entire disks. If you are really paranoid, as perhaps we should be, VeraCrypt can also provide 'deniability' (i.e. you can plausibly deny there is any data on the disk).

As to the difficulty of restoring or getting a single file: many tools I have seen are guilty of that, but many others, especially those like rsnapshot that are based on rsync, are not. With rsnapshot (and I assume other rsync-based tools) it is just a matter of copying a file or files from one directory to another, and I think we can all do that. As to swapping backup drives or disks or whatever, with a little creativity there should be little problem.

Now, setup and configuration can be a little daunting at first, depending on what you are trying to accomplish, but I have seen none that were impossible to figure out. After all, many programs are difficult to configure the first time through; after you have used them for a bit, most of them become fairly simple.

And your criticism of rsync is, I believe, misplaced. However, my opinion does not matter as far as your work is concerned. If it doesn't work for you, that's fine, but that does not mean it is true for everyone.
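The rsnapshot point is easy to demonstrate: snapshots are plain directories that share unchanged files via hardlinks, so restoring one file is an ordinary cp. A runnable sketch in temp paths, simulating by hand the rotation rsnapshot does internally:

```shell
# A fake "live" tree and two snapshots of it.
base=$(mktemp -d)
mkdir -p "$base/live"
echo "v1" > "$base/live/notes.txt"

cp -a  "$base/live"    "$base/daily.1"   # older snapshot: a real copy
cp -al "$base/daily.1" "$base/daily.0"   # newer snapshot: hardlinks, costs ~no space

echo "v2" > "$base/live/notes.txt"       # the live copy moves on...

# ...but getting yesterday's version back is just cp, no special tool.
cp "$base/daily.1/notes.txt" "$base/restored.txt"
cat "$base/restored.txt"   # v1
```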
I just started using Borgbackup which is an Attic fork and it’s working great so far. Just my two cents. I typically used rsync before.
Another borg user here.
Started using it after I messed up royally about half a year ago; it has been working ever since.
I have even retrieved a few files since then.
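For anyone curious, the basic borg workflow is only a few commands. Repository path and archive name below are examples, not anything from my setup:

```shell
# One-time: create an encrypted repository on the backup drive.
borg init --encryption=repokey /mnt/backup/repo

# Each run: create a named, deduplicated archive.
borg create --stats /mnt/backup/repo::home-2017-12-13 ~/Documents

# See what is in the repo, and pull a single file back out.
borg list /mnt/backup/repo
borg extract /mnt/backup/repo::home-2017-12-13 home/user/Documents/notes.txt
```

borg extract restores into the current directory, so cd somewhere safe first.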
Thank you for your replies. I prefer to make backup images because usually there is some kind of compression. I have a 2 TB drive on which I save backup images from different machines (home laptop, home desktop, business laptop...). I will check Clonezilla first.
Thanks again.
I have used Acronis in the past for Windows image backups, so I know what you mean. Clonezilla gives the same capabilities and options as far as I can tell. The user interface is not as slick as Acronis, but it gets the job done. Only once (so far) did I have a need to restore from a clonezilla image, and it worked as expected.
Indeed, I'm not familiar with it, but I'll do some research. Like I said, the option to copy the whole disk is very impractical for me, as I use one external drive to store backup images of several personal and business computers.
With rsync, it's easy to backup several computers to one external (or remote) drive. You do not need to dedicate one drive per computer. You backup each computer to its own directory on the backup drive.
The choice of rsync vs. clonezilla, in my view, depends on what level you need to backup and restore. If you need to backup and restore whole drives or partitions, that's what clonezilla does. If you need to backup and restore files and directories, anywhere from a single file to the contents of a directory, even to the entire content of a filesystem, that's what rsync does.
One crucial difference: rsync doesn't backup or restore the boot sector or partition structure of a drive, or the format of a partition. It doesn't create or format filesystems. It just copies the contents from one directory (which can be a whole filesystem) to another directory or filesystem that is already set up.
I use both: clonezilla for drives, so I can recover from hard drive failure or serious malware, and rsync for drive contents, so I can recover just a file or directory into an otherwise healthy computer.
Last edited by Beryllos; 12-12-2017 at 10:59 PM.
Reason: added the last paragraph
A simple backup solution is G4U. It is basically dd with compression. You can always use that and pick the compression program and level you want, if you insist on a bit-by-bit copy. Generally a file-by-file copy is faster and can be compressed too, but you usually have to copy the loader manually if you really need it. GParted can easily copy partitions.
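The dd-with-compression idea is a two-command pipeline. Demonstrated on a scratch file standing in for a disk; on real hardware the input would be /dev/sdX and you would boot a live system first:

```shell
disk=$(mktemp); image=$(mktemp); restored=$(mktemp)
dd if=/dev/urandom of="$disk" bs=1M count=4 2>/dev/null   # fake 4 MiB "disk"

# Backup: raw read, compressed on the fly (-1 = fast; trade for size as you like).
dd if="$disk" bs=1M 2>/dev/null | gzip -1 > "$image"

# Restore: decompress and write back out through dd.
gzip -dc "$image" | dd of="$restored" bs=1M 2>/dev/null

cmp -s "$disk" "$restored" && echo "images identical"
```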
Just to start out, I am sticking with the backup program that is standard in Ubuntu 16.04, Deja Dup. I have a 2 TB external hard drive (rated for USB 3.0, clunking along on a 2.0 port connection, seems to work OK). The user interface (settings, backup, and so forth) is very easy to use. Once it starts up, you do not notice it taking much processing power. I do get reports from it saying this file or that file could not be backed up, make sure you can open it, which is rather nice: a secondary audit of the file system, as it were. Deja Dup likes to make "duplicity" copies of my user files. It does not like to deal with OS files. As a former Windows user, I found this a bit odd.
But, If I read the posts here correctly, Linux users should (1) backup all data files and (2) rather than doing the windows thing with system files (eg windows' "restore point" blah blah blah) just make sure you have a bootable usb handy, do your updates and upgrades every week, and if you crash, do a clean install, (3) even if you have not crashed, doing a clean install of the newest version of Linux about once a year is prudent practice.
ad 1) Anyone should back up the files they need, regardless of what system they use.
ad 2) Sounds a bit confusing. The choice of updates is the user's choice. My relatives' Linux Mint 17 was not updated and works; I just nuked it for Linux Mint 18.3 a few days ago.
With UEFI and its Microsoft bugs (it was designed by Microsoft), there is a high chance that UEFI forgets the bootloader mappings when the "HDD" is removed and the machine rebooted once, among other issues.
It is always a good idea to have recovery media at hand, like the SystemRescue CD, and to know how to use it!
An update every week may be too infrequent and leaves your box insecure against certain issues. Some issues should be fixed faster than others; in my view, kernel and browser issues should be fixed immediately. (E.g. the recently discovered Intel CPU bug, which will soon cause a new Windows fix and a new kernel release; it is not yet revealed which Intel CPUs are affected.)
ad 3) Reinstalling is newbie binary-distro practice. Spyware 95, aka Windows 95 and upwards, also recommended a reinstall every few months for certain reasons: fragmentation, system lint files, lint files left by software.
My Gentoo installation is very old and was transplanted several times from hardware to hardware.
Because of hard drive costs, I of course delete /var/log/messages and /var/log/emerge.log, so the real age can no longer be determined.
Just yesterday I moved my whole Gentoo installation to a new SSD.
I fixed major screwups several times in the past. A decent distro can be fixed, and it is still worth fixing because you do not need to reconfigure. (I also know the other side, with Ubuntu-based and Arch Linux distros where the package manager was in such a state that reinstalling another distro made sense after a few days of trying to fix it and reading some maybe-helpful hints on the net.)
I saw it recently with FreeBSD, which refused to even boot properly, and with the test Slackware installation.
Most annoying things to reconfigure on a reinstall or a new distro:
*) Network + wireless (wireless is a pain) (network tools have changed from ifconfig to the stupid Red Hat ip tool)
*) GPU driver (that was always a hassle and will always be a hassle, with Nvidia, Intel, and AMD GPUs alike)
*) keyboard layout
*) bootloader, partition layout, encryption
*) setup the users with correct user ids
*) install all needed software. A newbie most likely has to think about what he used: hmm, what was that card game called? Forgot to write it down because I thought I would find it again. (I have around 1000-1200 packages in Gentoo; in theory I could grab the world file and reinstall from that by hand.)
E.g. Linux Mint 18.3 with the Vista-style start menu (do not ask me which newbie desktop environment it is): the package manager was different again, and the search feature is kinda annoying when you just want simple card games for a relative! Again, changes to the user interface regarding how to set up wireless, keymaps, how software is organized. The package manager is a mess for someone who just upgrades; it is a game of "where could it be".
*) setup a new backup plan
*) setup a new kernel
*) configure your home directory and every application in use
*) re-enter all passwords
*) setup ntp
*) I am fairly certain I missed some things, too. I do not use a firewall, which I should, given the nasty NTP packets coming in these days.
A reinstall is a pain and is much more time-consuming than just fixing.
Now I may grasp the Slackware philosophy of putting so many packages under a single letter: you get a lot of unwanted stuff, but you maybe save time searching for it. Also, putting everything KDE-based on the box for sure reduces the dependency hell of something being missing, which I get on Gentoo with its bare-minimum install philosophy. The Gentoo KDE maintainers also force you most of the time to install whole sets, which is basically nearly the full desktop.
I prefer a distro where I can remove everything and have the bare minimum, so my backups are small and less software brings fewer security issues, but sometimes also missing dependencies that are not noted in the installer ebuild.
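The world-file point above can be sketched in two lines. The world file path is Portage's standard location; the backup destination is made up:

```shell
# Gentoo records every explicitly installed package in a plain text file;
# back it up with the rest of your data.
cp /var/lib/portage/world /mnt/backup/world-backup.txt

# On a fresh install, feed it back to the package manager:
# emerge --ask $(cat /mnt/backup/world-backup.txt)
```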
--
The only thing I reinstall is Wine, and I do rm -r ~/.wine beforehand. Windows is a mess, even in Wine.
The official RH documentation covers tools like tar, cpio, dump, and AMANDA.
Don't forget that almost any command can be piped to another: tar combined with a compression type that matches your data and computing power will produce a compressed file.
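For instance, tar piped into a compressor of your choice (gzip is cheap on CPU; xz -T0 squeezes harder using all cores). Paths here are demo temp files:

```shell
src=$(mktemp -d); out=$(mktemp); dst=$(mktemp -d)
echo "some data" > "$src/file.txt"

# Backup: tar writes to stdout (-f -), gzip compresses the stream.
tar -C "$src" -cf - . | gzip > "$out"

# Restore: decompress and unpack into another directory.
gzip -dc "$out" | tar -C "$dst" -xf -

cat "$dst/file.txt"
```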
Pretty sure Acronis can easily support most Linux filesystems and can back up Linux.