Something's filling up my disk, but I don't know what it is.
My work computer has been running for nearly 2 years now. It's running Ubuntu 8.04 right now. For me it's a good lesson in what happens to installs over the long term.
I just started getting "disk full" errors, and I don't know what could be causing it. First I ran df: Code:
bob@0-13-72-d0-a-5:~$ df -lh
Then I listed the root directory: Code:
bob@0-13-72-d0-a-5:/$ ls -l
How can I figure out what is using many gigabytes of space on my system? I don't even see where it's coming from right now, so that I can delete it. |
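For reference, a quick sketch of the df flags used in the question, plus an inode check (the df -i line is an addition, not from the original post): a full inode table produces the same "disk full" errors even when df -h still reports free space.

```shell
# -l: local filesystems only; -h: human-readable sizes (G, M, K)
df -lh

# -i: report inode usage instead of block usage; "disk full" can also
# mean the filesystem has run out of inodes rather than bytes
df -i
```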
My first guess is that you have an auto backup running. They are very good at filling up the nooks and crannies.
To find out which files are using the space, use du. My favourite way ahead is:
1. su to become root. BE VERY CAREFUL FROM THIS POINT ON
2. cd / to go to the root of the file system
3. du | sort -rn > /tmp/greedy.txt to create a text file for browsing with less or a text editor, OR du | sort -rn | less, which lets you scroll through the result immediately.
Why sort -rn? The n makes the sort numerical, not alpha-numerical, so that 4 is treated as a smaller number than 10. The r reverses the sort order, putting the biggest first. So you will be able to track down the really greedy directories.
Hope that helps. And DON'T FORGET TO EXIT FROM THE root LOGIN |
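The steps above can be sketched as a single command, using sudo instead of a root shell so there is no root login to forget to exit (the /tmp/greedy.txt path is just an example):

```shell
# Cumulative size (in KB) of every directory under /, biggest first.
# stderr is discarded because /proc and unreadable dirs produce noise.
sudo du / 2>/dev/null | sort -rn > /tmp/greedy.txt

# The first lines are the largest directory trees:
head -n 20 /tmp/greedy.txt
```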
Hi,
Better to use the command du to show disk usage. Examples: Code:
du -sh /* Code:
du -sh /* > sizes-summary
The 4096 shown with ls is the size necessary to store the meta-data about the files contained within the directory, including their names. Since everything is mounted on /dev/sda5, it could be system files or user files under /home that are consuming the space. Regards Ian |
Your hard disk may be filled up by log files. Have you checked them?
A beautiful command to sort space usage in decrease order:
su -
cd /
du -s * | sort -rn | cut -f2 | xargs -d '\n' du -sh
You can also install a monitoring tool to watch your hard disk and warn you when it is filling up. |
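A commented version of that pipeline, run against a single directory so it does not need root (/var is just an example target):

```shell
# du -s *   -> size in KB and name for each entry
# sort -rn  -> numeric sort, biggest first
# cut -f2   -> keep only the names, in that order
# du -sh    -> re-run du to print a human-readable size per name
cd /var && du -s * 2>/dev/null | sort -rn | cut -f2 | xargs -d '\n' du -sh 2>/dev/null
```

On newer coreutils (7.5 and later), du -sh * | sort -rh gets the same effect in one pass, but sort -h does not exist on a stock Ubuntu 8.04 install.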
In general, I find my Ubuntu systems fill up with trash more than anything else. But this might just be me never quite adjusting to the fact that deleting something doesn't really delete it... Look for a hidden directory named .Trash-1000 in the root of any volume.
After that, my first ports of call would be /var/log and /home . |
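A quick way to check those locations (the .Trash name pattern and the ~/.local/share/Trash path are assumptions based on common GNOME defaults):

```shell
# Hidden trash directories at the root of mounted volumes,
# e.g. /.Trash-1000 (the number is your numeric user id):
sudo find / -maxdepth 2 -type d -name '.Trash*' 2>/dev/null

# Size of the usual suspects named in the post above:
du -sh /var/log ~/.local/share/Trash 2>/dev/null
```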
OT, but as per your signature, the following would be better phrased "to sort space usage in descending order", but more ideally would be "to sort files in descending order of disk space used" or "to sort files according to disk space used, in descending order".
|
There is also a graphical program that can perform similar functions. If you're using GNOME, I think it's called Disk Usage Analyzer or something similar.
|
Linux treats everything - including directories - as files. The 4096 you are seeing is the default directory size on disk. If the table of contents for the directory fits into 4096 bytes (most do), then 4096 is what you will see. As the directory accumulates more and more files, more space is allocated for the TOC, and the size you see for the directory grows.
From the command line, execute this command: sudo find / -type d -size +100000c
This will search your hard drive for any directories whose directory file is over 100000 bytes (this would be a BIG directory with lots of files). Typically, /usr, /bin, and /lib will be that big. Also possibly /usr/lib or /usr/local/lib. You may find something in /var/cache that is that big; if so, take a look at it; nothing there should have that many files in it, and if it does, you have found your problem.
Basically, look at any directory that is that big. Someplace, you just might find a directory that has a million files in it because a misconfiguration in, for instance, your mail daemon has caused an email to be generated every 60 seconds for the last two years, but - because that email couldn't be sent due to the misconfiguration - those emails have accumulated for that entire time. This, BTW, is the exact situation I have encountered in the past... with over a million files in the directory.
Then, run this command: sudo find / -type f -size +5000000c
This will identify all files on your drive that are over 5 million bytes in size. You are looking for log files, error files, core dump files, or pretty much anything else. Check out all candidates and respond appropriately. Often - usually - deleting is the thing to do.
Pay particular attention to the .xsession-errors file that will exist in home directories; if you remain logged in for long periods, that file can get HUGE if some program you run likes to write to the log.
Play with the numbers and file sizes you use for your thresholds. You'll find the problem; it is there someplace. Just yesterday, I did this on my system drive and freed 1.7 GB on my system partition when I discovered some log files that had been growing forever without my knowledge. |
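Those two find commands again, with two small additions that are mine, not from the post above: -xdev keeps find on the one filesystem that is full, and ls -lh on the second command prints each oversized file with its size so the worst offenders stand out.

```shell
# Directories whose own directory file is over 100000 bytes,
# i.e. directories holding a very large number of entries:
sudo find / -xdev -type d -size +100000c 2>/dev/null

# Regular files over 5 million bytes, listed with their sizes:
sudo find / -xdev -type f -size +5000000c -exec ls -lh {} \; 2>/dev/null
```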
I'd go with jiml8's answer if I were you. Sounds pretty good.
|
Also check whether du and df give very different results. If so, there may be open files held by processes (such as logging scripts) that never exit properly.
For example, I used to run a cron job with the swatch command (a Perl script that monitors log files). However, the old jobs wouldn't properly close, so if I did pgrep swatch (to see how many instances were running) there would be only one, but if I did pgrep perl there would be 10 instances (over a ten-day period). The command Code:
ps -ef | grep <PID_number> will show which process is responsible. It's a relatively uncommon situation, but a useful thing to check if du suggests lots of free space while df shows almost none. |
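A sketch of that check. The lsof line is my addition, assuming lsof is installed: files deleted while a process still holds them open keep their disk blocks until the process exits, which is exactly the case where df and du disagree.

```shell
# Filesystem view: includes blocks held by deleted-but-open files
df -h /

# File-tree view: only counts files still linked into a directory
sudo du -sh / 2>/dev/null

# Open files whose on-disk link count is zero, i.e. deleted but
# still held open by some process (requires lsof):
sudo lsof +L1 2>/dev/null
```

Restarting the offending process (or sending it SIGHUP, if it is a daemon that reopens its logs) releases the space.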
I love it when a plan comes together!
-Hannibal Appreciate the info. |
Quote:
Otherwise the file will be recreated. |
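One way around that (my suggestion, not from the quoted post): truncate the file in place instead of deleting it, so the program writing to it keeps its open handle while the old contents are freed.

```shell
# Empty the file without removing it; the writing process keeps its
# file handle, so nothing needs to be restarted:
: > ~/.xsession-errors

# Same thing with coreutils:
# truncate -s 0 ~/.xsession-errors
```

One caveat: if the writer opened the file without append mode, it keeps writing at its old offset and the file becomes sparse, so ls can still report a large size while du shows the real (small) block usage.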
Ahh, but I really needed my computer so as soon as I could do work again I forgot about further de-cruftifying. I have at least two years before I have to worry about it. I'll probably upgrade before then.
|
.xsession-errors
I had the same problem, and it was driving me mad. Every time I'd clear more space on my drive, I would come back the next day to see it full again.
Turns out it was my .xsession-errors file, which had grown to 142G; much of it was Firefox warnings... Now I need to go back and find out what is wrong with Firefox, but at least my disk is free. |