Old 12-15-2009, 09:08 AM   #1
CRE
LQ Newbie
 
Registered: Oct 2007
Posts: 3

Rep: Reputation: 0
Does Linux ever need "defragging"?


When my employer-provided Windows PC slows down, tech support "defrags" it. Does Linux ever need "defragging"? If so, how do I do it?
 
Old 12-15-2009, 09:22 AM   #2
jstephens84
Senior Member
 
Registered: Sep 2004
Location: Nashville
Distribution: Manjaro, RHEL, CentOS
Posts: 2,098

Rep: Reputation: 102
This question has been asked several times on this forum, but in short, no, it does not. The disk-allocation algorithms employed by Linux filesystems are a lot more efficient than NTFS or FAT. However, Linux will still do some disk checking with utilities such as fsck (the file system checker). If you do a Google search for "why doesn't Linux fragment", you will come across several sites explaining the gory details.
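For example, a read-only e2fsck run will report an ext2/ext3 filesystem's fragmentation as a "non-contiguous" percentage. A minimal sketch, assuming the filesystem lives on the hypothetical device /dev/sda1 and is not mounted at the time:

Code:
# Force a check (-f) and answer "no" to every prompt (-n) so nothing is changed.
sudo e2fsck -fn /dev/sda1
# The summary line ends with something like
#   /dev/sda1: 52310/1310720 files (1.8% non-contiguous), ...
# where the percentage is the fragmentation figure.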
 
Old 12-15-2009, 09:25 AM   #3
cpplinux
Member
 
Registered: Dec 2009
Posts: 37

Rep: Reputation: 17
No. Linux filesystems keep fragmentation to a minimum, so you don't need to worry about it.
 
Old 12-15-2009, 01:29 PM   #4
Quakeboy02
Senior Member
 
Registered: Nov 2006
Distribution: Debian Linux 11 (Bullseye)
Posts: 3,407

Rep: Reputation: 141
This question comes up literally all the time, and the discussion sometimes gets so heated that it's barely distinguishable from an argument. There is no defrag program in your Linux distro, nor is there an anti-virus (although there are rootkit checkers). Why not? It's not because nobody thought to write one. It's because, in general, there's no need. Note the "in general" phrase, though. When your disk gets very full, it may tend toward fragmentation. The obvious solution, given the cost of disks today, is to just get a bigger disk.
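If you want to check whether that is actually happening, two stock tools give a quick picture. A minimal sketch, with /var/log/syslog standing in as an example of a large, frequently appended file:

Code:
# How full is each filesystem? Fragmentation only becomes likely as free space runs out.
df -h
# How many extents does a given file occupy? A single extent means it is not fragmented.
sudo filefrag -v /var/log/syslog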
 
Old 12-15-2009, 01:51 PM   #5
pixellany
LQ Veteran
 
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Mint
Posts: 17,809

Rep: Reputation: 743
Quote:
Originally Posted by Quakeboy02
The obvious solution, given the cost of disks today, is to just get a bigger disk.
Amen --- have one (or two) hard drives large enough that they will never get over half full. The filesystem will take care of the rest:
--- it never gets fragmented
--- there's always room to make tarballs
--- etc.
 
Old 12-15-2009, 04:26 PM   #6
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,982

Rep: Reputation: 3626
Linux really can get into a state where it needs to be defragged. Almost no one ever does it, though. Everyone is conditioned to believe it can't happen, as opposed to it being merely unlikely to happen.

There are tools and techniques to correct it.
 
Old 12-15-2009, 04:43 PM   #7
Tinkster
Moderator
 
Registered: Apr 2002
Location: earth
Distribution: slackware by choice, others too :} ... android.
Posts: 23,067
Blog Entries: 11

Rep: Reputation: 928
Quote:
Originally Posted by jefro
Linux really can get into a state where it needs to be defragged. Almost no one ever does it, though. Everyone is conditioned to believe it can't happen, as opposed to it being merely unlikely to happen.

There are tools and techniques to correct it.
*cough*

Everyone?

*cough*

That's how one starts "heated debates", eh? A little exaggeration here, and off it goes :}

Cheers,
Tink
 
Old 12-15-2009, 05:42 PM   #8
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,982

Rep: Reputation: 3626
I find your post rather offensive and insulting, Tinkster.

The OP asked, "Does Linux ever need 'defragging'?"

Does it ever? YES, it does.

A good administrator would be aware of the conditions that can produce a fragmented drive and of how to repair it. To say NO would be completely incorrect.

Anyone might notice the limiting wording I used to prevent over-reaction.

To make it worse, the other posts already included one of the ways to prevent it. There are others.

No well-trained Linux user would ever consider the Linux OS to be free from the issues that plague other OSes. In fact, if you want security, consider OpenBSD. The sad part of that is that you can't add many tools, as they tend to open holes.

Consider that any computer connected to the internet may be subject to attack and cannot be considered safe from bad people.

Last year's hackfest broke into all three major OSes in minutes.

Last edited by jefro; 12-15-2009 at 05:49 PM.
 
Old 12-15-2009, 05:45 PM   #9
Quakeboy02
Senior Member
 
Registered: Nov 2006
Distribution: Debian Linux 11 (Bullseye)
Posts: 3,407

Rep: Reputation: 141
Quote:
Originally Posted by jefro
Linux really can get into a state where it needs to be defragged. Almost no one ever does it, though. Everyone is conditioned to believe it can't happen, as opposed to it being merely unlikely to happen.
Could you describe the state or states which cause a Linux filesystem to need defragging?

Quote:
There are tools and techniques to correct it.
Could you list them?
 
Old 12-15-2009, 05:45 PM   #10
Bratmon
Member
 
Registered: Jul 2009
Location: 75.126.162.205:80
Distribution: Arch / Mint 17
Posts: 297
Blog Entries: 3

Rep: Reputation: 50
Obligatory terminology nitpicking:

I don't think anyone's ever needed to defrag an OS, ever.

A filesystem on the other hand...
 
Old 12-15-2009, 05:55 PM   #11
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,982

Rep: Reputation: 3626
These situations are caused by the size of the files being added and removed relative to the amount of free space. It can also happen with some files being locked for various reasons. It can happen that a drive fills up with many small files, then they are deleted and larger files are added, and fragmentation gets out of hand. What counts as "out of hand" is subjective.

As for tools: the SourceForge defrag project, a simple tar off to some other place and back, or almost any copy-off-and-back approach.

It is also true that the OS termed Linux has nothing to do with how files are stored on a medium. Even a tape, while not having a filesystem format, is a medium.

It used to be that every day you would tar the drive off to a tape and restore it. That way you laid the files out contiguously every day, made a backup, and tested it.
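As a concrete illustration of the tar-off-and-back technique, here is a minimal sketch; /srv/data and the spare location /mnt/spare are hypothetical paths, and the files must not be in use while you do this:

Code:
# Archive the directory to another disk (or tape), preserving ownership and permissions.
tar -C /srv -cpf /mnt/spare/data.tar data
# Make sure the archive is readable before touching the originals.
tar -tf /mnt/spare/data.tar > /dev/null
# Delete the originals and restore; the restored files are written out fresh,
# so they land largely contiguously. The tarball doubles as a backup - keep it
# until you have verified the restore.
rm -rf /srv/data
tar -C /srv -xpf /mnt/spare/data.tar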

Last edited by jefro; 12-15-2009 at 06:08 PM.
 
Old 12-15-2009, 05:58 PM   #12
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,359

Rep: Reputation: 2751
For Linux (well, ext3 anyway), it's rare to reach that point.

1. The FS allocation algorithms are very good.
2. The FS reserves 5% of every ext2/ext3 partition by default (usable only by root), which helps avoid/minimize fragmentation.

Any system can get fragmented if continuously run at or close to 100% of available space.
Disks are very cheap these days (at least PC-level ones).
The easiest way to defrag is a logical copy (not a physical one) to another disk, i.e. cp rather than dd; then delete the originals and copy back from the backup.
Of course, if the result is still near 100% used, it's only a matter of time before you hit the same issue, unless the files are read-only.
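The reserved-space figure can be read with tune2fs, and the "logical copy" is just an ordinary file-level copy. A minimal sketch, with /dev/sda1, /srv/data, and /mnt/otherdisk as hypothetical names:

Code:
# Show the reserved block count (5% of the filesystem by default, usable only by root).
sudo tune2fs -l /dev/sda1 | grep -i 'reserved block count'
# A logical, file-level copy (cp, not dd) to another disk keeps the files but not the on-disk layout.
sudo cp -a /srv/data /mnt/otherdisk/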
 
Old 12-15-2009, 06:07 PM   #13
Quakeboy02
Senior Member
 
Registered: Nov 2006
Distribution: Debian Linux 11 (Bullseye)
Posts: 3,407

Rep: Reputation: 141
Quote:
Originally Posted by jefro
sourceforge defrag, simple tar to some place and back or almost any copy and back.
Is this the one you're thinking about? The last update was 4/11/2007, which should imply something about it. If it's the one I'm thinking about, it's not so much a defrag program as a simple user program that copies files around from here to there in the hope that the result can be called a defragmentation. I believe there's another one floating around out there that's only useful for ext2 filesystems, or ext3 if journaling is turned off.

Nonetheless, I have to point out, once again, that there is no defrag program in your distro's repository. That's a pretty strong statement about their utility.

http://sourceforge.net/projects/defragfs/
 
Old 12-15-2009, 06:11 PM   #14
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,982

Rep: Reputation: 3626
OK, so I think we can agree that while it is rare, it could happen. I'd go so far as to say an average user will NEVER see it on ext3. A server farm, though, could run into the issue if running at max capacity. How often does that happen?

Last edited by jefro; 12-15-2009 at 08:42 PM.
 
Old 12-15-2009, 08:05 PM   #15
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,659
Blog Entries: 4

Rep: Reputation: 3941
Also, do not forget that Windows' own NTFS disk format is designed to run for years with no maintenance at all. The FAT-based filesystems are more prone to fragmentation, but they were (literally) originally designed for floppy disks. (I know: I subscribed to BYTE Magazine at the time the first discussion of them, and of the nascent system dubbed "MS-DOS," came out.)

Since disk drives are "friggin' huge" now, fragmentation is simply not much of a concern anymore. Performance will always be "less than optimal," but the performance curve tends to be self-correcting; it doesn't dig a crater for itself.
 
  

