LinuxQuestions.org
Old 03-10-2013, 01:37 AM   #1
jcCampbell
LQ Newbie
 
Registered: Mar 2013
Posts: 1

Rep: Reputation: Disabled
scan disc/defrag


How do I defragment my Linux Ubuntu 10.10 system?
 
Old 03-10-2013, 04:05 AM   #2
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
You don't need to.

Native Linux filesystems either don't require defragging (fragmentation is typically under 5%) or defragment in the background (btrfs/ext4).
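If you want to see the actual figure, e2fsprogs can report it. A minimal sketch using a scratch file-backed image, so no real partition or root access is needed (on a real system you would point fsck.ext4 -fn at an unmounted partition instead):

```shell
# Create a small throwaway ext4 image (a plain file, not a real disk).
truncate -s 32M /tmp/frag-demo.img
mkfs.ext4 -F -q /tmp/frag-demo.img

# fsck's read-only check (-f = force check, -n = make no changes)
# prints the fragmentation figure as "X.Y% non-contiguous" in its
# summary line.
fsck.ext4 -fn /tmp/frag-demo.img
rm /tmp/frag-demo.img
```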
 
Old 03-10-2013, 04:15 AM   #3
tommcd
Senior Member
 
Registered: Jun 2006
Location: Philadelphia PA USA
Distribution: Lubuntu, Slackware
Posts: 2,230

Rep: Reputation: 293
Linux file systems should not need to be defragmented in most cases. The only exception seems to be if the disk partition is almost full.
Some references:
http://askubuntu.com/questions/1090/...on-unnecessary
http://www.howtogeek.com/115229/htg-...defragmenting/

There are in fact tools that you can use to defrag a hard disk on Linux if you have a burning desire to use them:
http://www.hecticgeek.com/2012/10/de...defrag-ubuntu/
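For ext4 specifically, e2fsprogs also ships an online defragmenter, e4defrag. A minimal sketch, assuming /home is an ext4 mount (the whole-filesystem forms generally need root):

```shell
# Report fragmentation without changing anything: extents per file
# plus an overall score (-c = check only).
sudo e4defrag -c /home

# Actually relocate fragmented files (ext4 only; safe while mounted).
sudo e4defrag /home
```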
I have been using Linux for 7 years. I have 2 hard drives in my desktop computer. The first is for my Linux operating systems. The second is for my data.
The second hard drive is formatted in ext3 and has been in use for several years.
Although the second hard drive is well past its prime, I have never noticed any decrease in performance in all the time that I have been using it. It has never been defragmented.

Write back if you need more help.
And welcome to the LQ forums!

EDIT: After posting I noticed that jpollard posted his answer as I was composing my own answer. Anyway, this only serves to confirm what I have posted here.

Last edited by tommcd; 03-10-2013 at 04:23 AM.
 
Old 03-10-2013, 12:04 PM   #4
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,978

Rep: Reputation: 3624
To defrag, the old way was to use a command to copy all the data to a tape drive. The next morning, you'd put it back, and that process put all the files back in order. There are still a few commands that can be used to defrag. 90% of people say you don't need to, and that may be true. Some server admins may have to do it once in a while on heavy-use systems; their systems need to be fast and have optimum performance.
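That dump-and-restore cycle can be sketched with tar; the device and paths here are hypothetical, and the point is simply that rewriting every file lets the allocator lay the data out contiguously again:

```shell
# 1. "Tape" step: archive the tree, preserving permissions (-p).
tar -cpf /backup/home.tar -C / home

# 2. Recreate the filesystem (this destroys everything on it!):
#      mkfs.ext4 /dev/sdXN && mount /dev/sdXN /home

# 3. Restore: files are written back one after another, so they
#    come back defragmented.
tar -xpf /backup/home.tar -C /
```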


Scandisk's counterpart here is fsck (or the equivalent checker for each type of filesystem in use). It is very important to know it and learn how to use it.
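A minimal fsck workflow, assuming /dev/sdb1 is a hypothetical ext4 partition that is NOT currently mounted:

```shell
# Read-only check: -n answers "no" to every repair question, so this
# is safe to run just to see whether the filesystem is clean.
sudo fsck -n /dev/sdb1

# Repair pass: -p automatically fixes the safe problems. Never run a
# repairing fsck on a mounted filesystem.
sudo fsck -p /dev/sdb1
```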
 
Old 03-10-2013, 02:09 PM   #5
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
Off-topic, but anyway: Ubuntu 10.10 has not been supported since April 2012. You should upgrade to a supported version so that you get security updates and bugfixes.
 
Old 03-11-2013, 02:18 AM   #6
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
Quote:
Originally Posted by jefro View Post
To defrag, the old way was to use a command to copy all the data to a tape drive. The next morning, you'd put it back, and that process put all the files back in order. There are still a few commands that can be used to defrag. 90% of people say you don't need to, and that may be true. Some server admins may have to do it once in a while on heavy-use systems; their systems need to be fast and have optimum performance.
In almost 20 years of UNIX/Linux administration, I haven't seen ANY administrator need to defrag a disk (not since SunOS 3.2), as long as their disks were properly sized. If you need to defrag (even on heavily used systems) you have failed, because the downtime/lost time due to the defrag only makes the system more heavily loaded.

Quote:
Scandisk's counterpart here is fsck (or the equivalent checker for each type of filesystem in use). It is very important to know it and learn how to use it.
Last time I checked (my wife has to use it), it could take over 10 hours to defrag an NTFS disk, and during that time the system was effectively unusable.
 
Old 03-11-2013, 03:30 PM   #7
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,978

Rep: Reputation: 3624
I guess we all ought to welcome jcCampbell to LQ. I also agree that 10.10 may be a poor choice to be using.


You are more than welcome to do as you wish. Like I have said, many an admin has used tar to a tape every night, and many an admin has looked at their system in depth for issues. I have used tools to reduce fragmentation on heavy-use systems. I have used these methods for decades on mainframes and BSD/Linux. The defrag issue was all solved by daily tape backups on big and small systems. It was at one time the only proper MS solution for its server products.

So it takes 10 hours, so you don't do it? I'd never wait that long either; I'd reload the OS or get more storage. If one did wait long enough to run scandisk, and it did render the system unusable, they'd have much bigger issues. You ought to run scandisk more often, on a schedule. Could it be that your disk is more than 70% full, or that the system is old or full of spyware or such?

As a common user, one doesn't usually need to contend with any defrag on newer ext4 filesystems. Ext2 may not be so forgiving.
 
Old 03-11-2013, 05:50 PM   #8
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
Quote:
Originally Posted by jefro View Post
...
You are more than welcome to do as you wish. Like I have said, many an admin has used tar to a tape every night, and many an admin has looked at their system in depth for issues. I have used tools to reduce fragmentation on heavy-use systems. I have used these methods for decades on mainframes and BSD/Linux. The defrag issue was all solved by daily tape backups on big and small systems. It was at one time the only proper MS solution for its server products.
You left out the "restore" step in defragging...

Defragging fails when you have a filesystem with 300+ TB of disk. So does a backup/restore, as it takes a day (or longer) to do just a 16TB filesystem (a small filesystem as far as UNIX servers go).

One AIX server I worked with had 10TB just for /tmp, plus three 30TB filesystems for other data (the OS was separate, and I don't remember how big that was, but it wasn't very large - 10/15 GB for each node or thereabouts). Backups of system files, yes... but only because the system configuration was on a configuration server, and system files could be reinstalled from an install server faster (about 30 minutes, and one install server could handle 24 nodes simultaneously).

Updates to the system were relatively easy (at least for me - I didn't have to do them; the IBM CE did). The updates were applied to a console system (a special node), which then pushed them to a collection of install servers, and those servers would update their list of nodes. It could take about 8 hours for a full install - as I recall only 12 nodes were designated as install servers, and they had to wait for the filesystem servers to be updated before the rest of the nodes received their updates. With a total of 310 nodes (10 were file servers) there was a delay while groups of 24 nodes were updated at a time. And the filesystems could take a while to be checked (jfs was pretty good at it).

But defrag? Never. Just add enough disks (or delete enough files to get 20-30% free space) and any fragmentation would take care of itself.

Quote:
So it takes 10 hours, so you don't do it? I'd never wait that long either; I'd reload the OS or get more storage. If one did wait long enough to run scandisk, and it did render the system unusable, they'd have much bigger issues. You ought to run scandisk more often, on a schedule. Could it be that your disk is more than 70% full, or that the system is old or full of spyware or such?
No - doesn't work. She does a defrag roughly once a month but leaves it running overnight, and a scandisk when things get really slow. As I recall from the last run, the disk is only about 60% used normally.

Quote:
As a common user, one doesn't usually need to contend with any defrag on newer ext4 filesystems. Ext2 may not be so forgiving.
Ext2 was just as forgiving, though the fragmentation level could be between 7% and 10% with 80% used. I have never needed to defrag since ext2. Performance was excellent, and the observed fragmentation was recovered just by deleting a few files. When the free space was reclaimed, the tail ends of files would also be repacked, so partial block allocations would get reclaimed by packing the tail ends of files together. Files could get a bit fragmented, but new files would not be a problem.
 
Old 03-11-2013, 08:48 PM   #9
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,978

Rep: Reputation: 3624
There are plenty of examples of defrag on IBM AIX.
 
Old 03-12-2013, 06:57 AM   #10
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
Quote:
Originally Posted by jefro View Post
There are plenty of examples of defrag on IBM AIX.
Including a number that indicate it won't work with certain usages of filesystems.

Most appear to be applicable only to relatively small, static filesystems, to optimize rotational delay - which doesn't work well with logical volumes. In fact, defragmentation could destroy the performance of a logical volume by overloading a single part; fragmenting a file among the underlying volumes would then improve performance.
 
Old 03-12-2013, 08:53 AM   #11
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
Quote:
Originally Posted by jpollard View Post
Last time I checked (my wife has to use it), it could take over 10 hours to defrag an NTFS disk, and during that time the system was effectively unusable.
Use a different defrag program; for example, O&O Defrag can defragment in the background without noticeable impact on performance.
 
Old 03-12-2013, 09:47 AM   #12
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
O&O just adds more expense to already overpriced software... and may not be compatible with MS updates.

But that is just us. It may be worth it to someone else.
 
Old 03-12-2013, 12:52 PM   #13
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
O&O Defrag has a free version for private use, so no expense is added. I have never had problems with updates with that software, but of course it is up to you whether you want to use it.
 
  


