LinuxQuestions.org
Old 11-17-2005, 08:49 AM   #1
RedShirt
Senior Member
 
Registered: Oct 2005
Location: Denver
Distribution: Sabayon 3.5Loop2
Posts: 1,150

Rep: Reputation: 45
Defragging XFS


Having come from the world of Windows, and being mildly OCD about data organization (you should see how neat all the folders and data storage are on my system...), I learned to defragment every week or two on Windows. This was because even minor fragmentation of 5% across the drive could hamper read/write performance by as much as 10%. When doing large jobs, like video editing or copying multiple gigs of files, that 10% added up fast. When the file system was in worse shape, it could as much as double load times and boot times on Windows.

That said, I have done my fair share of research about fragmentation and defragmenting on Linux. Many label it "fragmentation repellent" or "immune," which hardly appears to be the case. Linux appears to handle data much better, but far from perfectly. I also see many people claiming it is totally unnecessary to ever defragment any journaled filesystem under Linux. I know it is far better at handling data storage than Windows, which is absolutely terrible, but nothing is perfect, especially given the size of today's files and how often they are accessed and moved. My drive is already a bit fragmented despite my having done very little with media yet, and I have seen many people post fsck reports on their /dev/hd* partitions showing non-contiguous figures of 20, 50, up to 99.3%, so I can't believe Linux is even close to perfect. In terms of actual slowdown, I can't say I have tested it much yet, but I will not believe massive fragmentation doesn't slow down the system; that is just common sense (and OS design). There is no way an OS can read hundreds of fragments of data as quickly as a single contiguous file.
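(Aside, for anyone wanting a real number: on XFS the fragmentation figure comes from running `xfs_db -r -c frag` as root against your partition; the device name there would be your own. The sketch below just reproduces the arithmetic behind the factor it reports, using made-up extent counts.)

```shell
# Made-up numbers in the shape xfs_db -r -c frag reports:
# "actual 130904, ideal 129979" extents.
actual=130904
ideal=129979

# The fragmentation factor is the fraction of extents beyond the
# ideal (fully contiguous) count:
awk -v a="$actual" -v i="$ideal" \
    'BEGIN { printf "fragmentation factor %.2f%%\n", (a - i) * 100 / a }'
# -> fragmentation factor 0.71%
```

Note the factor measures extents, not "percent of disk scrambled," so a big number doesn't automatically mean a big slowdown.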

I see many tools, including the xfs_fsr command shipped with xfsdump, which can defragment an XFS partition. I have also seen a few other tools, including utilities to defragment multiple partitions. And if there were no demand for defragging, the tools wouldn't exist.

So I have 3 questions:
1) How does Linux handle file writing in a way that is supposedly so flawless?
2) Why the animosity against defragging in the Linux community?
3) Why wouldn't I want to defrag my drives occasionally to keep file fragmentation in check?

Edit: a 4th, actually: which tool is best to use?
 
Old 11-18-2005, 02:04 AM   #2
bigrigdriver
LQ Addict
 
Registered: Jul 2002
Location: East Centra Illinois, USA
Distribution: Debian stable
Posts: 5,908

Rep: Reputation: 356
From my readings on this subject, it seems that the way files are written to disk is the essential difference between windows and Linux.

Windows writes files into non-contiguous locations on disk. Wherever it can find a space large enough to hold a piece of a file, that's where the piece will be written. Ergo, fragmentation is built in.

Linux, in as much as is possible, writes files into contiguous locations. Ergo, no (or at least, less) fragmentation.

If Norton had a defrag utility which could work on both windows and Linux, on inspection of a windows partition, you'd see bits and pieces scattered about the disk. Inspection of a Linux partition would show entire files written to contiguous locations, with blank spaces between the files.

The next time a write-to-disk occurs, Linux will look for a contiguous space large enough to hold the file. So small files will be written to small, contiguous locations, and large files will be written to large, contiguous locations.
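One way to see this on a live system is to count the extents of a freshly written file. A minimal sketch using filefrag (from e2fsprogs; it needs a filesystem that supports the FIEMAP ioctl, and on XFS specifically, xfs_bmap gives similar information):

```shell
# Write an 8 MB file in one go and ask the kernel how many extents
# it landed in. On a lightly used filesystem, a single contiguous
# extent ("1 extent found") is the common result.
f=$(mktemp)
dd if=/dev/zero of="$f" bs=1M count=8 2>/dev/null
sync
if command -v filefrag >/dev/null 2>&1; then
    filefrag "$f"
else
    echo "filefrag not available on this system"
fi
rm -f "$f"
```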

I don't know anything about XFS. But if it's running on a Linux box, defragging should not be any more necessary than with any other filesystem running in Linux. After all, it's using the same kernel.

Of course, people being the critters they are will want a defrag utility even when it's unnecessary (especially if they are migrating from the Windows world and don't know any better). And commercial vendors will develop and sell such utilities.
 
Old 12-31-2005, 12:41 PM   #3
RedShirt
Senior Member
 
Registered: Oct 2005
Location: Denver
Distribution: Sabayon 3.5Loop2
Posts: 1,150

Original Poster
Rep: Reputation: 45
I have done many tests with this now, over the last month-plus. I figured it was worth figuring out for myself, since no one else could tell me whether it was worth it or not; I got a lot of answers like the one above, that you never need to defrag in Linux because it defeats the purpose of the OS or some such malarkey. In my tests I am assuming a personal computer, because this test really is for me. For servers this test wouldn't mean much, because the files and applications in use would be massively different from mine.

Steps:
Install the xfsdump toolset; xfsprogs does not include the right tools, though I have it installed as well, and both are available from the install discs, ftp.suse.com, and other places. For those of you on SuSE like me:

first you need dmapi (the only prereq that wasn't installed for me):
http://rpm.pbone.net/index.php3/stat....i586.rpm.html
then you need xfsdump:
http://rpm.pbone.net/index.php3/stat....i586.rpm.html

Once installed, you can access the tools you need, like xfs_fsr (the actual defragging tool) and some diagnostics, including blocksize tests, fragmentation reports, etc.
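As a sketch of how the pieces fit together (the device path is a placeholder for your own XFS partition; both commands need root, and xfs_fsr works on the mounted filesystem):

```shell
#!/bin/sh
# Placeholder device -- point this at your own XFS partition.
DEV=/dev/sda3

if command -v xfs_fsr >/dev/null 2>&1; then
    # Read-only fragmentation report:
    xfs_db -r -c frag "$DEV"
    # Reorganize files in place; -v = verbose, -t 7200 = stop after
    # two hours. xfs_fsr can also be pointed at individual files.
    xfs_fsr -v -t 7200 "$DEV"
else
    echo "xfs_fsr not found; install the xfsdump package first"
fi
```

Since xfs_fsr shuffles files into free space, it can't do much on a nearly full partition; there is nowhere to move anything.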


Test setup:
I used the box I have, to get a more realistic feel for everyday use.
P4a 1.5GHz
ECS Elitegroup P4ITA with Intel 850 chipset
768MB RDRAM (2x256 PC600 and 2x128 PC800, all running at PC400, the board's max)
Brand new (okay, 2 or 3 months old now, but very newish) 80GB WD HD
OpenSUSE 10.0 with various kernels from 2.6.14 to 2.6.15rc6
XFS file system
Radeon 9600XT (not that video matters)

During my tests, I tested the hard drive at various capacities with various file sizes: mostly mp3s (small, about 4MB-8MB each), medium videos (around 300MB), and one large file, a 4GB Windows installer for a large MMORPG (duplicated many times when needed).

I figured it would be good enough to start where I was, at 13% capacity. This is real use: a few videos, some mp3s, and a bunch of applications plus various documents and websites I am working on. Fragmentation levels were basically nonexistent, which is good, as that is how XFS and the Linux kernel are supposed to behave. And even after a defrag (which changed nothing), there was no performance gain (not shocking).

Going whole hog: 99.7% capacity, my 13% plus as many instances of the 4GB file as I could fit. Then I did many deletes, installs, uninstalls, and reinstalls of software, trying to tax it out, plus recopies of the deleted files. This achieved what I call dramatic results: a 30% fragmentation rating. So I tried defragging, and not surprisingly it barely did anything, because most files couldn't be moved given such a massive overfill of the drive. I got it down to 28%, with no noticeable performance gain.

Whole hog was obviously unrealistic: too many massive files, and filling the drive ALL the way up, which is really bad practice; anyone actually doing that has caused their own problems. So I deleted all the big useless files, got back to my original, usable 13%, and defragged, just to make the test fair. Then I went to a more realistic 66% fill with various videos and mp3s and a bit of installing/uninstalling, as a normal user might. This came as a bit of a shock: barely any fragmentation at all, and no gain from defragging.

Conclusions:
So given realistic usage of your computer and a practical amount of free space on the drive, you really are not badly off, from what I can tell, in Linux with XFS. That said, people who are not... well, let's just say people who aren't smart, and max out their drive's storage (which, again, is really bad practice), can forcibly fragment the drive. But if you use the computer normally, at least over the minimal span of a month or two, it isn't that detrimental not to defrag. What happens over a year, two, or more has yet to be seen, but from my results, unless you use huge files to purposely junk up the drive, no real problems should arise.
So basically, assuming normal but sensible usage of your computer, it should be fine with or without defragging. The option is there, and in certain circumstances, like the one I contrived, or, according to some, P2P and other specific workloads, you will need it. But for me, it is comforting knowing I can defrag should the need arise, and I am thrilled knowing the need probably won't.
 
  

