Does Linux ever need "defragging"?
When my employer-provided Windows pc slows down, tech support "defrags" it. Does Linux ever need "defragging"? If so, how do I do it?
|
This question has been asked several times on this forum. But in short: no, it does not. The disk-allocation algorithms used by Linux filesystems are much more efficient than those of NTFS or FAT. However, Linux will still do some disk checking with utilities such as fsck (the file system checker). If you do a Google search for "why doesn't Linux fragment", you will come across several sites explaining the gory details.
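As a hedged illustration (the filenames here are examples only; on a real system fsck is run against an unmounted partition such as /dev/sdb1), you can try fsck safely against an ext4 image held in an ordinary file instead of a real disk:

```shell
# Build a small ext4 filesystem inside a regular file (no root needed),
# then check it read-only. -F forces mkfs to accept a non-block device.
dd if=/dev/zero of=/tmp/fs.img bs=1M count=16 status=none
mkfs.ext4 -q -F /tmp/fs.img
fsck.ext4 -f -n /tmp/fs.img   # -f: force a full check, -n: report only, change nothing
```

With -n, fsck only reports; it exits 0 when the filesystem is clean.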
|
No. Linux keeps fragmentation to a minimum, so you don't need to worry about it.
|
This question comes up literally all the time, and the discussion sometimes gets so heated that it's barely distinguishable from an argument. :) There is no defrag program on your Linux distro, nor is there an anti-virus (although there are root kit checkers). Why not? It's not because nobody thought to write one. It's because, in general, there's no need. Note the "in general" phrase, though. When your disk gets very full, it may tend toward fragmentation. The obvious solution, given the cost of disks today, is to just get a bigger disk.
|
Quote:
- never gets fragmented
- always room to make tarballs
- etc. |
Linux really can reach a state where a filesystem needs defragging. Almost no one ever does it. Everyone has been conditioned to believe it can't happen, as opposed to it being unlikely to happen.
There are tools and techniques to correct it. |
Quote:
Everyone? *cough* That's how one starts "heated debates", eh? A little exaggeration here, and off it goes :} Cheers, Tink |
I find your post rather offensive and insulting to me, Tinkster.
The OP asked, "Does Linux ever need 'defragging'?" Does it ever? YES, it does. A good administrator would be aware of the conditions that can produce a fragmented drive and of how to repair it. To say a flat NO would be completely incorrect. Anyone might notice the limiting wording I used to prevent overreaction. What's more, the other posts included one of the solutions for preventing it; there are others. No well-trained Linux user would consider the Linux OS to be free from the issues that plague other OSes. In fact, if you want secure, consider OpenBSD; the sad part there is that you can't add many tools, as they tend to open holes. Any computer connected to the internet may be subject to attack and cannot be considered safe from bad people. At last year's hackfest, all three major OSes were broken into within minutes. |
Obligatory terminology nitpicking:
I don't think anyone's ever needed to defrag an OS, ever. A filesystem on the other hand... |
Fragmentation is caused by the sizes of the files being added and removed relative to the amount of free space. It can also happen when some files are locked for various reasons. As a drive fills with many small files, which are then deleted and replaced by larger files, fragmentation can get out of hand. What counts as "out of hand" is subjective.
There is a defrag project on SourceForge; a simple tar to some other place and back, or almost any copy-and-copy-back, also works. It is also true that the OS termed Linux has nothing to do with how files are stored on a medium. Even a tape, while not having a format, is a medium. It used to be that every day you would tar the drive off to a tape and restore it. That way you made the files contiguous every day, made a backup, and tested it. |
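A minimal sketch of that tar round trip, using throwaway /tmp paths rather than a real tape or second drive:

```shell
# Archive a tree, delete the (possibly fragmented) originals, and restore.
# The restore writes every file out afresh, so the filesystem can allocate
# its blocks contiguously. Paths here are disposable examples.
mkdir -p /tmp/demo && echo "some data" > /tmp/demo/file.txt
tar -cf /tmp/demo.tar -C /tmp demo   # 1. archive the tree
rm -rf /tmp/demo                     # 2. remove the originals
tar -xf /tmp/demo.tar -C /tmp        # 3. restore: files are written afresh
cat /tmp/demo/file.txt               # prints "some data"
```

On a real system you would point the archive at another disk or tape and verify it before deleting anything. |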
For Linux (well, ext3 anyway), it's rare to reach that point.
1. The FS algorithms are very good.
2. The FS reserves 5% of each partition's blocks (for root) by default, which helps avoid/minimize fragmentation.
Any system can get fragmented if run continuously at or close to 100% of available space. Disks are very cheap these days (at least PC-level ones). The easiest way to defrag is a logical copy (not a physical one) to another disk, i.e. cp, not dd; then delete the originals and copy back from the backup. Of course, if the result is still near 100% used, it's only a matter of time before you hit the same issue, unless the files are read-only. |
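A minimal sketch of that cp-based logical copy, with /tmp paths standing in for a second disk (all names here are illustrative):

```shell
# cp rewrites file contents, letting the filesystem pick fresh (ideally
# contiguous) blocks; dd would clone the old block layout, fragmentation
# included. That is why the copy must be logical, not physical.
mkdir -p /tmp/data /tmp/spare_disk
echo "payload" > /tmp/data/big.file
cp -a /tmp/data /tmp/spare_disk/        # copy to the "spare disk"
rm -rf /tmp/data                        # free the fragmented blocks
cp -a /tmp/spare_disk/data /tmp/        # copy back: freshly allocated
diff -r /tmp/spare_disk/data /tmp/data  # no differences expected
```

Keep the copy on the spare disk until you have verified the round trip. |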
Quote:
Nonetheless, I have to point out, once again, that there is no defrag program in your distro's repository. That's a pretty strong statement about their utility. http://sourceforge.net/projects/defragfs/ |
OK, so I think we can agree that while it is rare, it could happen. I'd go so far as to say an average user will NEVER see it on ext3. A server farm, though, could run into issues if running at max. How many times does that happen?
|
Also, do not forget that Windows' own NTFS filesystem is likewise designed to run for years with no maintenance at all. The FAT-based filesystems are more prone to fragmentation, but they were (literally) originally designed for floppy disks. (I know: I subscribed to BYTE Magazine when the first discussions of it, and of the nascent system dubbed "MS-DOS," appeared. :hattip:)
Since disk drives are "friggin' huge" now, fragmentation is simply not much of a concern anymore. Performance will always be "less than optimal," but the performance curve tends to be self-correcting; it doesn't dig a crater for itself. |