LinuxQuestions.org (/questions/)
-   Linux - General (https://www.linuxquestions.org/questions/linux-general-1/)
-   -   Automatic deletion of old files when disk is full (https://www.linuxquestions.org/questions/linux-general-1/automatic-deletion-of-old-files-when-disk-is-full-4175467246/)

rdibley 06-24-2013 10:18 PM

Automatic deletion of old files when disk is full
 
I am using the "motion" software to capture still images to a volume of limited size. The images are all roughly the same file size, and are in one directory. Since they will be triggered by movement in the camera view, the creation of the files will be random and sporadic.

I would like to set it up so that, when the volume is full, the oldest files will be deleted to make room for the new files. I was thinking of writing a script that would monitor the directory on a regular basis. When the amount of free space drops below 100MB, it would delete enough files to clear up 200MB of free space. (or something along those lines)

I figure that somebody has had a need to do something like this before, but I haven’t been able to find anything as of yet. Any suggestions?

Thanks.

vishesh 06-25-2013 12:35 AM

You can write a script for this.

Thanks

chrism01 06-25-2013 01:25 AM

You could use the logrotate tool, which can manage files generally, or set up a cron job that uses the find command after checking the available disk space with e.g. df.
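
If it helps, here is a rough sketch of that cron-driven df + find approach. The /motion path, the 100 MB threshold and the 7-day age are only placeholders, so adjust them to suit:

Code:

#!/bin/bash
# Sketch: prune old camera images when the disk gets low on space.
# The directory, threshold and age below are example values, not recommendations.

DIR="/motion"      # directory holding the captured images
MIN_FREE_MB=100    # start deleting when free space drops below this (MB)

# df -m reports sizes in MB; the 4th column of the second line is "Available"
free=$(df -m "$DIR" | awk 'NR==2 {print $4}')

if [ "$free" -lt "$MIN_FREE_MB" ]; then
  # Remove images last modified more than 7 days ago
  find "$DIR" -maxdepth 1 -name '*.jpg' -mtime +7 -delete
fi
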
These will be useful
http://rute.2038bug.com/index.html.gz
http://tldp.org/LDP/Bash-Beginners-G...tml/index.html
http://www.tldp.org/LDP/abs/html/

nsingh63 06-25-2013 10:02 AM

It would be better to use a script that deletes the old files with the find command (using the -mtime or -atime option) and schedule it with a cron job at a frequency that matches how often the files are created.
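
For example, a crontab entry along these lines would prune images older than a day once an hour (the /motion path and the one-day age are only examples):

Code:

# m h dom mon dow  command
0 * * * * /usr/bin/find /motion -maxdepth 1 -name '*.jpg' -mtime +1 -delete
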

rdibley 06-27-2013 12:15 AM

I figured somebody would have done this before, but I guess not. So, I wrote a script to do the trick:

Code:


#!/bin/bash

# Immediately exit upon shell error
set -e

# Set lock file details
scriptname=$(basename "$0")
lock="/var/run/${scriptname}"

# Lock the process, or exit if already running
exec 200>"$lock"
flock -n 200 || exit 1

# Write the PID to the lock file
pid=$$
echo $pid 1>&200

# Define the location of the applications
DF="/bin/df"
AWK="/usr/bin/awk"
SED="/bin/sed"
FIND="/usr/bin/find"

# Define the directory to check
DIR="/motion"

# Define constants
Minimum=100  # Minimum space below which files will be deleted (MB)
DeleteTo=200  # Delete files until this value is reached (MB)
NumtoDel=100  # Number of files to delete between each free disk space check

# Set shell such that empty file listing will return null
shopt -s nullglob

# Check if free space is less than the minimum specified
FreeSpace=$($DF -m "$DIR" | $AWK '{print $4}' | $SED "1d")
if [ $FreeSpace -lt $Minimum ]; then
  # Free space has dropped below minimum value, delete files until DeleteTo space is free
  while [ $FreeSpace -lt $DeleteTo ]; do
    # Check if any pictures remain in the directory, otherwise break out of the delete loop
    if test -z "$($FIND "$DIR" -maxdepth 1 -name '*.jpg' -print -quit)"; then
      break
    fi
    # Delete files until NumtoDel has been reached, or no more files exist
    FileCount=0
    # Glob expansion sorts the names, so files whose names sort chronologically are removed oldest first
    for FileName in "$DIR"/*.jpg; do
      rm "$FileName"
      FileCount=$((FileCount+1))
      # Check if number of files to delete has been reached
      if [ $FileCount -ge $NumtoDel ]; then
        break
      fi
    done
    # Check if free space is less than the minimum specified
    FreeSpace=$($DF -m "$DIR" | $AWK '{print $4}' | $SED "1d")
  done
fi

Basically, you set the minimum free space (in MB) in the Minimum constant in the script. When less than that amount is free, it deletes files in groups of NumtoDel until DeleteTo MB are free.

I set it to run every 5 minutes. Seems to work fine so far.
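
If anyone wants to do the same, a crontab entry along these lines will run it every 5 minutes (the script path here is just an example of where it might live):

Code:

# Check the image directory every 5 minutes
*/5 * * * * /usr/local/bin/motion_cleanup.sh
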

Thanks for the help.

