Old 05-16-2011, 05:50 AM   #16
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Original Poster
Rep: Reputation: 1269

So here's one way to compile a list of files on your computer along with the number of extents they take up:

PHP Code:
#!/bin/sh
# finds fragmented files

# output file; this MUST be an absolute path or the file will be created
# in every directory the loop visits
output=~/.defrag

# make sure we are root
if test ~ != /root
then
    echo 'Error: you must be root in order to run this script !'
    # fail
    exit 1
fi

# find files and record how many extents each one occupies
cd /
for i in bin boot etc home lib lib64 opt root sbin usr var
do
    cd "$i"
    find . -type f -print0 | xargs -0 filefrag >> "$output"
    cd ..
done

echo "Output written to $output"

exit
However, I can't figure out an efficient way to sort the output. Here's example output:

Code:
./spool/mail/demonslayer: 0 extents found
./spool/mail/root: 2 extents found
./spool/slrnpull/slrnpull.conf: 1 extent found
./state/dhcp/dhclient.leases: 0 extents found
./state/dhcp/dhclient6.leases: 0 extents found
./state/dhcp/dhcpd.leases: 0 extents found
./state/dhcp/dhcpd6.leases: 0 extents found
./tmp/alsaconf.cards: 1 extent found
./www/cgi-bin/htsearch: 3 extents found
./www/cgi-bin/qtest: 2 extents found
./www/htdocs/htdig/bad_words: 1 extent found
./www/htdocs/htdig/button1.gif: 1 extent found
./www/htdocs/htdig/button1.png: 1 extent found
./www/htdocs/htdig/button10.gif: 1 extent found
./www/htdocs/htdig/button10.png: 1 extent found
I've tried sort, but it doesn't always work right, because some file names may contain spaces.
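
One workaround, just a sketch and not the thread's eventual solution: use the extent count itself as a sort key, since it is always the third field from the end of a filefrag line even when the path contains spaces, then strip the key back off.

Code:
# sketch: prepend the extent count, sort numerically, remove the prepended key
awk '{print $(NF - 2), $0}' ~/.defrag | sort -n | cut -d' ' -f2-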
 
Old 05-16-2011, 08:26 AM   #17
sundialsvcs
Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 5,256

Rep: Reputation: 1076
I cordially suggest that you are trying to solve a "problem" that really doesn't need solving.

Fragmentation was a genuine problem when disk drives were much smaller and much slower than they are today. Operating systems (e.g. MS-DOS), having been originally designed for floppy-disk drives and computers with no RAM to spare, were not terribly sophisticated about file placement and disk I/O, and, quite frankly, could not then afford to be.

These days, yes, you might discover that "OMG, this drive is really fragmented!" But ... is that making a pragmatic, measurable, demonstrable impact on actual production performance? If it isn't (and it probably isn't...), then, "Houston, we do not have a problem."

Last edited by sundialsvcs; 05-16-2011 at 08:28 AM.
 
Old 05-16-2011, 09:13 AM   #18
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Original Poster
Rep: Reputation: 1269
Yes, so far, it doesn't seem to be a problem. However, it would be cool to make a defrag script to find heavily fragmented files and copy them to fix it. I'm almost there, I just need to sort the output efficiently. I'll get to it later because I'm busy now, but if someone can help it would be great.
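
The core idea, sketched here only as an illustration of the approach (not the finished script): rewriting a fragmented file lets the filesystem allocate fresh, hopefully contiguous, extents for the copy, and renaming the copy back replaces the original.

Code:
# minimal copy-to-defragment sketch; "$file" is a placeholder path
cp "$file" "$file.defrag" && mv "$file.defrag" "$file"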
 
Old 05-16-2011, 10:43 AM   #19
catkin
LQ 5k Club
 
Registered: Dec 2008
Location: Tamil Nadu, India
Distribution: Servers: Debian Squeeze and Wheezy. Desktop: Slackware64 14.0. Netbook: Slackware 13.37
Posts: 8,528
Blog Entries: 27

Rep: Reputation: 1176
Quote:
Originally Posted by H_TeXMeX_H View Post
Yes, so far, it doesn't seem to be a problem. However, it would be cool to make a defrag script to find heavily fragmented files and copy them to fix it. I'm almost there, I just need to sort the output efficiently. I'll get to it later because I'm busy now, but if someone can help it would be great.
Maybe something like:
Code:
cat $output | awk '{printf("%s %s\n",$(NF - 2),substr($0,1,match($0,/: [0-9]* extents? found/)-2))}' | sort > $output.sorted
 
1 members found this post helpful.
Old 05-16-2011, 10:54 AM   #20
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Original Poster
Rep: Reputation: 1269
It almost works, but there were a few bugs, so here's what I will use:

Code:
cat defrag | awk '{printf("%s %s\n",$(NF - 2),substr($0,1,match($0,/: [0-9]* extents? found/)-1))}' | sort -n
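
Since the extent count now leads each line and the list is sorted numerically, the most fragmented files end up at the bottom; appending a tail (an assumed usage, not from the post) shows only the worst offenders:

Code:
# assumed usage: list only the 20 most fragmented files
cat defrag | awk '{printf("%s %s\n",$(NF - 2),substr($0,1,match($0,/: [0-9]* extents? found/)-1))}' | sort -n | tail -n 20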
 
Old 05-16-2011, 02:38 PM   #21
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Original Poster
Rep: Reputation: 1269
Here's the root script I wrote; it seems to work OK:

DISCLAIMER: It is dangerous to run scripts as root, especially ones that make changes. This script is released as is, and there is no warranty whatsoever.

Also, do NOT run this script when the disk is almost full. Not only will it not work, but strange things may happen as well.
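
One possible guard, purely a sketch and not part of the posted script: inside the defrag loop of the script below, check that the filesystem holding the file has at least as much free space as the temporary copy needs before touching it.

Code:
# sketch of a free-space guard for the copy step (would sit inside the while loop)
size=$(stat -c %s "$line")
avail=$(df -P -B1 "$line" | awk 'NR == 2 { print $4 }')
if test "$avail" -le "$size"
then
    echo "skipping $line: not enough free space for a temporary copy"
    continue
fi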

PHP Code:
#!/bin/sh
# finds and defragments fragmented files

# make sure we are root
if test ~ != /root
then
    echo 'Error: you must be root in order to run this script !'
    # fail
    exit 1
fi

# make sure we have 1 argument
if test $# != 1
then
    echo "Usage: $(basename "$0") option"
    echo 'option can be:'
    echo 'find'
    echo 'defrag'
    # fail
    exit 1
fi

# places
dir=/root/.defrag
output="$dir"/output
sorted="$dir"/sorted
defrag="$dir"/defrag

if test "$1" == "find"
then
    # regenerate the work directory cleanly
    if test -d "$dir"
    then
        rm -r "$dir"
    fi
    mkdir "$dir"
    # start with an empty output file
    > "$output"

    # find files in / and record their extent counts
    for i in bin boot etc lib lib64 opt root sbin usr var
    do
        find "/$i" -type f -print0 | xargs -0 filefrag >> "$output"
    done

    # parse: put the extent count (third field from the end) first,
    # separated by '|', then sort numerically
    awk '{printf("%s|%s\n",$(NF - 2),substr($0,1,match($0,/: [0-9]* extents? found/)-1))}' "$output" | sort -n > "$sorted"

    echo "Output written to $output"
    echo "Sorted output written to $sorted"
    echo
    echo "Most fragmented files"
    tail "$sorted"

elif test "$1" == "defrag"
then
    # select files with more than 9 extents (change this value to taste)
    awk -F'|' '{ if ($1 > 9) print $2 }' "$sorted" > "$defrag"
    cat "$defrag"
    echo
    echo "Do you want to defrag these files ? [y/n]"
    read answer
    case "$answer" in
        y|Y)
            while read line
            do
                # copy then rename back; the fresh copy gets newly allocated
                # (hopefully contiguous) extents
                cp "$line" "$line.defrag"
                mv "$line.defrag" "$line"
            done < "$defrag"
        ;;
        n|N)
            echo 'exiting'
            exit 0
        ;;
        *)
            echo 'ERROR: bad input'
            exit 1
        ;;
    esac
else
    echo 'ERROR: input is not sane'
    exit 1
fi

exit
NOTE: The defrag part of this script gives variable results. If a file is large, rewriting it will not necessarily reduce its number of extents. For now you have to change the minimum extent count (the '9' in the awk condition) manually. I wrote this in about half an hour, so don't expect it to be free of bugs.

There are also two more scripts for doing this as a regular user without messing up permissions; I'll release those if I can't find a better way to do it.
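
One way the hard-coded minimum could be made a parameter (a sketch of the idea, not the version published later): accept the threshold as a second argument and pass it to awk.

Code:
# sketch: optional threshold argument, defaulting to 10 extents
threshold="${2:-10}"
awk -F'|' -v min="$threshold" '$1 > min { print $2 }' "$sorted" > "$defrag"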

Last edited by H_TeXMeX_H; 05-17-2011 at 03:26 AM.
 
Old 05-25-2011, 07:36 AM   #22
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Original Poster
Rep: Reputation: 1269
It seems to run OK. Now, does anyone have an idea how to make the 'cp' and 'mv' steps safer against the script being stopped at the wrong time? Should I use nohup, or something else?
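
One option, sketched under the assumption that the worry is a signal landing between the cp and the mv (nothing in the thread settles this): have the shell ignore INT and TERM around the critical pair, so the children inherit that disposition, and remove any stale temporary copy if the script dies for another reason.

Code:
# sketch: protect the copy/rename pair and clean up a stale temporary copy
trap 'rm -f "$line.defrag"' EXIT
while read line
do
    trap '' INT TERM            # children started now inherit "ignore"
    cp "$line" "$line.defrag"
    mv "$line.defrag" "$line"   # rename is atomic on the same filesystem
    trap - INT TERM             # restore default handling between files
done < "$defrag"
trap - EXIT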
 
Old 06-12-2011, 07:50 AM   #23
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Original Poster
Rep: Reputation: 1269
OK, I added support for a threshold: you can now specify the number of extents above which a file will be defragmented. I've also made local user scripts so that they don't mess up permissions. Make sure to adapt the scripts as needed for your system (especially your home directory). I uploaded them to my site:
http://htexmexh.my3gb.com/linux/scripts.html
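
For illustration only, a hypothetical invocation; the script name and argument order are assumptions, so check the published scripts for the real interface.

Code:
# hypothetical usage
./defrag.sh find        # build and sort the extent list
./defrag.sh defrag 15   # rewrite files that span more than 15 extents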

I guess I'll mark this solved now.
 
Old 11-03-2011, 08:33 AM   #24
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Original Poster
Rep: Reputation: 1269
There is a problem with the script that I did not take into account. If you use lilo, do NOT include the boot directory in the for loop over directories, so change:

Code:
    for i in bin boot etc lib lib64 opt root sbin usr var
to
Code:
    for i in bin etc lib lib64 opt root sbin usr var
This is because lilo only records the on-disk offset of the kernel image: if you move the kernel image (vmlinuz) and don't run 'lilo' to update the offset, you won't be able to boot into Linux. This is what just happened to me. The easy fix, of course, is to boot your install disk, mount and chroot into the partition, run 'lilo', then restart. Still, it is a bug; I will update my site later today. You can keep the directory in the list if you remember to run lilo whenever the kernel image is moved. GRUB should NOT have this problem.
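
If you do want to keep boot in the list, a guard along these lines could refresh the boot map automatically (a sketch, assuming the defrag list holds absolute paths as in the posted script):

Code:
# sketch: after defragmenting, re-run lilo if anything under /boot was rewritten
if grep -q '^/boot/' "$defrag" && test -x /sbin/lilo
then
    /sbin/lilo
fi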
 
  

