Linux - Newbie
This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
This question has been asked several times on this forum. But in short, no, it does not. The disk allocation algorithms employed by Linux filesystems are a lot more efficient than those of NTFS or FAT. However, Linux will still do some disk checking with utilities such as fsck (the file system checker). If you do a Google search for "why doesn't linux fragment" you will come across several sites explaining the gory details of this.
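Since fsck came up: on ext filesystems, e2fsck's summary line is the usual quick way to see how fragmented things are, because it reports the percentage of non-contiguous files. As a hedged illustration (assuming e2fsprogs is installed), you can see that output harmlessly by running it against a scratch image file instead of a real disk:

```shell
#!/bin/sh
# Make a throwaway ext2 image and fsck it; no root needed for a plain file.
# The summary line shows the fragmentation figure, e.g.
# "11/1024 files (0.0% non-contiguous), ..."
set -e

IMG=$(mktemp)                          # scratch file standing in for a disk
dd if=/dev/zero of="$IMG" bs=1M count=4 2>/dev/null
mke2fs -q -F "$IMG"                    # -F: it's a regular file, not a device
e2fsck -fn "$IMG"                      # -f force a check, -n read-only
rm -f "$IMG"
```

Run the same `e2fsck -fn` (unmounted!) against a real partition and that percentage tells you whether fragmentation is even worth worrying about.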
This question comes up literally all the time, and the discussion sometimes gets so heated that it's barely distinguishable from an argument. There is no defrag program on your Linux distro, nor is there an anti-virus (although there are rootkit checkers). Why not? It's not because nobody thought to write one. It's because, in general, there's no need. Note the "in general" phrase, though. When your disk gets very full, it may tend toward fragmentation. The obvious solution, given the cost of disks today, is to just get a bigger disk.
The obvious solution, given the cost of disks today, is to just get a bigger disk.
Amen---have 1 (or 2) hard drives large enough so they will never get over 1/2 full. The filesystem will take care of the rest.
---never gets fragmented
---always room to make tarballs
---etc
It really could produce a state where it would need to be defragged. Almost no one ever does it. Everyone is conditioned to believe that it can't happen, as opposed to understanding that it is merely unlikely to happen.
It really could produce a state where it would need to be defragged. Almost no one ever does it. Everyone is conditioned to believe that it can't happen, as opposed to understanding that it is merely unlikely to happen.
There are tools and techniques to correct it.
*cough*
Everyone?
*cough*
That's how one starts "heated debates", eh? A little exaggeration here, and off it goes :}
I find your post rather offensive and insulting to me, Tinkster.
The OP asked "Does Linux ever need "defragging"? "
Does it ever? YES it does.
A good administrator would be aware of the conditions that could produce a fragmented drive and would know how to repair it. To say NO would be completely incorrect.
Anyone might notice the limiting comments I used to prevent overreaction.
To make it worse, the other posts included one of the solutions for preventing it. There are others.
No well-trained Linux user would ever consider the Linux OS to be free from the issues that plague other OSes. In fact, if you want security, consider OpenBSD. The sad part of that is you can't add many tools, as they tend to open holes.
Consider that any computer connected to the internet may be subject to attacks and cannot be considered safe from bad people.
Last year's hackfest broke into all three major OSes in minutes.
It really could produce a state where it would need to be defragged. Almost no one ever does it. Everyone is conditioned to believe that it can't happen, as opposed to understanding that it is merely unlikely to happen.
Could you describe the state or states which cause a Linux filesystem to need defragging?
Situations are caused by the sizes of the files added and removed in relation to the amount of free space. It can also happen when some files are locked for various reasons. It can happen that as a drive gets full of many small files, then those are deleted and larger files are added, fragmentation gets out of hand. What counts as "out of hand" is subjective.
There's the defrag tool on SourceForge; otherwise a simple tar to some place and back, or almost any copy-and-back, will fix it.
It is also true that the OS termed Linux has nothing to do with how files are stored on a medium. Even a tape, while not having a filesystem format, is a medium.
It used to be that every day you would tar the drive off to a tape and restore it. That way you made the files contiguous every day, made a backup, and tested that backup, all at once.
For Linux (well, ext3 anyway), it's rare to reach that point.
1. the FS algorithms are very good
2. the FS reserves 5% of blocks (for root, by default) on ext partitions, which helps avoid/minimize fragmentation.
Any system can get fragmented if continuously run at or close to 100% of available space.
Disks are very cheap these days (at least PC level ones).
The easiest way to defrag is a logical copy (not a physical one) to another disk, e.g. cp, not dd; then delete the originals and copy back from the backup.
Of course, if the result is still near 100% used, it's only a matter of time before you hit the same issue, unless the files are read-only.
There's the defrag tool on SourceForge; otherwise a simple tar to some place and back, or almost any copy-and-back, will fix it.
Is this the one you're thinking about? The last update was 4/11/2007, which should imply something about it. If it's the one I'm thinking about, it's not so much a defrag program as a simple user program that copies files around from here to there in the hope that the result can be called a defragmentation. I believe there's another one floating around out there that's only useful for ext2 filesystems, or ext3 if journaling is turned off.
Nonetheless, I have to point out, once again, that there is no defrag program in your distro's repository. That's a pretty strong statement about their utility.
OK, so I think we can agree that while it is rare, it could happen. I'd go so far as to say an average user will NEVER see it on ext3. A server farm, though, could run into issues if running at max. How many times does that happen?
Also, do not forget that Windows' own NTFS disk format is also designed to run for years with no maintenance at all. The FAT-based file systems are more prone to it, but they were (literally) originally designed for floppy disks. (I know: I subscribed to BYTE Magazine at the time the first discussion of it, and of the nascent system dubbed "MS-DOS," came out.)
Since disk drives are "friggin' huge" now, fragmentation is simply not much of a concern anymore. Performance will always be "less than optimal," but the performance curve tends to be self-correcting; it doesn't dig a crater for itself.