LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   -   Anacron email - Segmentation Fault - Ubuntu (https://www.linuxquestions.org/questions/linux-software-2/anacron-email-segmentation-fault-ubuntu-848941/)

devnull10 12-07-2010 06:30 PM

Anacron email - Segmentation Fault - Ubuntu
 
My sister's machine is running Ubuntu, and having just logged in as root I noticed I had some emails from anacron.
These all read:

Code:

Return-Path: <root@xxx>
X-Original-To: root
Delivered-To: root@xxx
Received: by xxx (Postfix, from userid 0)
        id 5B4961FF18; Wed,  1 Dec 2010 07:45:38 +0000 (GMT)
From: Anacron <root@xxx>
To: root@xxx
Subject: Anacron job 'cron.daily' on xxx
Message-Id: <20101201074538.5B4961FF18@xxx>
Date: Wed,  1 Dec 2010 07:45:38 +0000 (GMT)

/etc/cron.daily/apt:
Segmentation fault


Now, I use cron on Slax with no problems at all, but I have never set up any cron jobs on the Ubuntu machine, so I can only assume it's one of the jobs scheduled by the default install?
Having had a little dig around, here's what I have so far:

/etc/anacrontab
Code:

# /etc/anacrontab: configuration file for anacron

# See anacron(8) and anacrontab(5) for details.

SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

# These replace cron's entries
1      5      cron.daily      nice run-parts --report /etc/cron.daily
7      10      cron.weekly      nice run-parts --report /etc/cron.weekly
@monthly        15      cron.monthly nice run-parts --report /etc/cron.monthly

/etc/cron.daily
Code:

# ls -la
total 72
drwxr-xr-x  2 root root  4096 2010-10-07 17:15 .
drwxr-xr-x 148 root root 12288 2010-12-08 00:06 ..
-rwxr-xr-x  1 root root  311 2010-06-20 09:11 0anacron
-rwxr-xr-x  1 root root  189 2010-10-01 13:17 apport
-rwxr-xr-x  1 root root 15655 2010-10-05 15:09 apt
-rwxr-xr-x  1 root root  502 2010-05-26 21:02 bsdmainutils
-rwxr-xr-x  1 root root  256 2010-09-08 10:52 dpkg
-rwxr-xr-x  1 root root    89 2010-08-05 22:04 logrotate
-rwxr-xr-x  1 root root  1335 2010-08-17 16:10 man-db
-rwxr-xr-x  1 root root  606 2010-03-24 10:16 mlocate
-rw-r--r--  1 root root  102 2010-08-25 00:01 .placeholder
-rwxr-xr-x  1 root root  2149 2009-06-16 14:12 popularity-contest
-rwxr-xr-x  1 root root  3594 2010-08-25 00:01 standard

/etc/cron.weekly
Code:

# ls -l
total 12
-rwxr-xr-x 1 root root 312 2010-06-20 09:11 0anacron
-rwxr-xr-x 1 root root 748 2010-09-23 18:05 apt-xapian-index
-rwxr-xr-x 1 root root 895 2010-08-17 16:10 man-db

I've tried running all the jobs individually, but none of them produces a segmentation fault.
The email arrives daily, so it looks like it's coming from cron.daily.
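
For what it's worth, here is roughly how I've been invoking them; run-parts is what anacron calls according to the anacrontab above, so this should be close to what cron.daily actually does (just a sketch, paths are the defaults):

Code:

# run the whole directory the same way anacron does
nice run-parts --report /etc/cron.daily

# or trace a single script on its own
sh -x /etc/cron.daily/apt; echo "exit status: $?"
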


Any suggestions on what might be causing this?

ozanbaba 12-08-2010 12:14 PM

What are the contents of /etc/cron.daily/apt?


and do you use prelink?
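
If you're not sure, something like this should tell you (assuming the package is simply called prelink on Ubuntu):

Code:

# is the prelink package installed at all?
dpkg -l prelink 2>/dev/null | grep '^ii'

# or just look for the binary
which prelink
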

devnull10 12-11-2010 11:10 AM

/etc/cron.daily/apt

Code:

#!/bin/sh
#set -e
#
# This file understands the following apt configuration variables:
# Values here are the default.
# Create /etc/apt/apt.conf.d/02periodic file to set your preference.
#
#  Dir "/";
#  - RootDir for all configuration files
#
#  Dir::Cache "var/apt/cache/";
#  - Set apt package cache directory
#
#  Dir::Cache::Archive "archives/";
#  - Set package archive directory
#
#  APT::Periodic::Enable "1";
#  - Enable the update/upgrade script (0=disable)
#
#  APT::Periodic::BackupArchiveInterval "0";
#  - Backup after n-days if archive contents changed.(0=disable)
#
#  APT::Periodic::BackupLevel "3";
#  - Backup level.(0=disable), 1 is invalid.
#
#  Dir::Cache::Backup "backup/";
#  - Set periodic package backup directory
#
#  APT::Archives::MaxAge "0"; (old, deprecated)
#  APT::Periodic::MaxAge "0"; (new)
#  - Set maximum allowed age of a cache package file. If a cache
#    package file is older it is deleted (0=disable)
#
#  APT::Archives::MinAge "2"; (old, deprecated)
#  APT::Periodic::MinAge "2"; (new)
#  - Set minimum age of a package file. If a file is younger it
#    will not be deleted (0=disable). Usefull to prevent races
#    and to keep backups of the packages for emergency.
#
#  APT::Archives::MaxSize "0"; (old, deprecated)
#  APT::Periodic::MaxSize "0"; (new)
#  - Set maximum size of the cache in MB (0=disable). If the cache
#    is bigger, cached package files are deleted until the size
#    requirement is met (the biggest packages will be deleted
#    first).
#
#  APT::Periodic::Update-Package-Lists "0";
#  - Do "apt-get update" automatically every n-days (0=disable)
#   
#  APT::Periodic::Download-Upgradeable-Packages "0";
#  - Do "apt-get upgrade --download-only" every n-days (0=disable)
#
#  APT::Periodic::Download-Upgradeable-Packages-Debdelta "1";
#  - Use debdelta-upgrade to download updates if available (0=disable)
#
#  APT::Periodic::Unattended-Upgrade "0";
#  - Run the "unattended-upgrade" security upgrade script
#    every n-days (0=disabled)
#    Requires the package "unattended-upgrades" and will write
#    a log in /var/log/unattended-upgrades
#
#  APT::Periodic::AutocleanInterval "0";
#  - Do "apt-get autoclean" every n-days (0=disable)
#
#  APT::Periodic::Verbose "0";
#  - Send report mail to root
#      0:  no report            (or null string)
#      1:  progress report      (actually any string)
#      2:  + command outputs    (remove -qq, remove 2>/dev/null, add -d)
#      3:  + trace on                                                                             

check_stamp()
{
    stamp="$1"
    interval="$2"

    if [ $interval -eq 0 ]; then
        debug_echo "check_stamp: interval=0"
        # treat as no time has passed
        return 1
    fi

    if [ ! -f $stamp ]; then
        debug_echo "check_stamp: missing time stamp file: $stamp."
        # treat as enough time has passed
        return 0
    fi

    # compare midnight today to midnight the day the stamp was updated
    stamp_file="$stamp"
    stamp=$(date --date=$(date -r $stamp_file --iso-8601) +%s 2>/dev/null)
    if [ "$?" != "0" ]; then
        # Due to some timezones returning 'invalid date' for midnight on
        # certain dates (eg America/Sao_Paulo), if date returns with error
        # remove the stamp file and return 0. See coreutils bug:
        # http://lists.gnu.org/archive/html/bug-coreutils/2007-09/msg00176.html
        rm -f "$stamp_file"
        return 0
    fi

    now=$(date --date=$(date --iso-8601) +%s 2>/dev/null)
    if [ "$?" != "0" ]; then
        # As above, due to some timezones returning 'invalid date' for midnight
        # on certain dates (eg America/Sao_Paulo), if date returns with error
        # return 0.
        return 0
    fi

    delta=$(($now-$stamp))

    # intervall is in days, convert to sec.
    interval=$(($interval*60*60*24))
    debug_echo "check_stamp: interval=$interval, now=$now, stamp=$stamp, delta=$delta (sec)"

    # remove timestamps a day (or more) in the future and force re-check
    if [ $stamp -gt $(($now+86400)) ]; then
        echo "WARNING: file $stamp_file has a timestamp in the future: $stamp"
        rm -f "$stamp_file"
        return 0
    fi

    if [ $delta -ge $interval ]; then
        return 0
    fi

    return 1
}

update_stamp()
{
    stamp="$1"
    touch $stamp
}

# we check here if autoclean was enough sizewise
check_size_constraints()
{
    MaxAge=0                                                                                     
    eval $(apt-config shell MaxAge APT::Archives::MaxAge)
    eval $(apt-config shell MaxAge APT::Periodic::MaxAge)

    MinAge=2
    eval $(apt-config shell MinAge APT::Archives::MinAge)
    eval $(apt-config shell MinAge APT::Periodic::MinAge)

    MaxSize=0
    eval $(apt-config shell MaxSize APT::Archives::MaxSize)
    eval $(apt-config shell MaxSize APT::Periodic::MaxSize)

    CacheDir="var/cache/apt/"
    eval $(apt-config shell CacheDir Dir::Cache)
    CacheDir=${CacheDir%/}

    CacheArchive="archives/"
    eval $(apt-config shell CacheArchive Dir::Cache::archives)
    CacheArchive=${CacheArchive%/}

    # sanity check
    if [ -z "$CacheDir" -o -z "$CacheArchive" ]; then
        echo "empty Dir::Cache or Dir::Cache::archives, exiting"
        exit
    fi
   
    Cache="${Dir%/}/${CacheDir%/}/${CacheArchive%/}/"

    # check age
    if [ ! $MaxAge -eq 0 ] && [ ! $MinAge -eq 0 ]; then
        debug_echo "aged: ctime <$MaxAge and mtime <$MaxAge and ctime>$MinAge and mtime>$MinAge"
        find $Cache -name "*.deb"  \( -mtime +$MaxAge -and -ctime +$MaxAge \) -and -not \( -mtime -$MinAge -or -ctime -$MinAge \) -print0 | xargs -r -0 rm -f
    elif [ ! $MaxAge -eq 0 ]; then
        debug_echo "aged: ctime <$MaxAge and mtime <$MaxAge only"
        find $Cache -name "*.deb"  -ctime +$MaxAge -and -mtime +$MaxAge -print0 | xargs -r -0 rm -f
    else
        debug_echo "skip aging since MaxAge is 0"
    fi
   
    # check size
    if [ ! $MaxSize -eq 0 ]; then
        # maxSize is in MB
        MaxSize=$(($MaxSize*1024))

        #get current time
        now=$(date --date=$(date --iso-8601) +%s)
        MinAge=$(($MinAge*24*60*60))

        # reverse-sort by mtime
        for file in $(ls -rt $Cache/*.deb 2>/dev/null); do
            du=$(du -s $Cache)
            size=${du%%/*}
            # check if the cache is small enough
            if [ $size -lt $MaxSize ]; then
                debug_echo "end remove by archive size:  size=$size < $MaxSize"
                break
            fi

            # check for MinAge of the file
            if [ $MinAge -ne 0 ]; then
                # check both ctime and mtime
                mtime=$(stat -c %Y $file)
                ctime=$(stat -c %Z $file)
                if [ $mtime -gt $ctime ]; then
                    delta=$(($now-$mtime))
                else
                    delta=$(($now-$ctime))
                fi
                if [ $delta -le $MinAge ]; then
                    debug_echo "skip remove by archive size:  $file, delta=$delta < $MinAgeSec"
                    break
                else
                    # delete oldest file
                    debug_echo "remove by archive size: $file, delta=$delta >= $MinAgeSec (sec), size=$size >= $MaxSize"
                    rm -f $file
                fi
            fi
        done
    fi
}

# deal with the Apt::Periodic::BackupArchiveInterval
do_cache_backup()
{
    BackupArchiveInterval="$1"
    if [ $BackupArchiveInterval -eq 0 ]; then
        return
    fi

    # Set default values and normalize
    Dir="/"
    eval $(apt-config shell Dir Dir)
    Dir=${Dir%/}

    CacheDir="var/cache/apt/"
    eval $(apt-config shell CacheDir Dir::Cache)
    CacheDir=${CacheDir%/}
    if [ -z "$CacheDir" ]; then
        debug_echo "practically empty Dir::Cache, exiting"
        return 0
    fi

    CacheArchive="archives/"
    eval $(apt-config shell CacheArchive Dir::Cache::Archives)
    CacheArchive=${CacheArchive%/}
    if [ -z "$CacheArchive" ]; then
        debug_echo "practically empty Dir::Cache::archives, exiting"
        return 0
    fi

    BackupLevel=3
    eval $(apt-config shell BackupLevel APT::Periodic::BackupLevel)
    if [ $BackupLevel -le 1 ]; then
        BackupLevel=2 ;
    fi
   
    CacheBackup="backup/"
    eval $(apt-config shell CacheBackup Dir::Cache::Backup)
    CacheBackup=${CacheBackup%/}
    if [ -z "$CacheBackup" ]; then
        echo "practically empty Dir::Cache::Backup, exiting" 1>&2
        return
    fi

    Cache="${Dir}/${CacheDir}/${CacheArchive}/"
    Back="${Dir}/${CacheDir}/${CacheBackup}/"
    BackX="${Back}${CacheArchive}/"
    for x in $(seq 0 1 $((${BackupLevel}-1))); do
        eval "Back${x}=${Back}${x}/"
    done
   
    # backup after n-days if archive contents changed.
    # (This uses hardlink to save disk space)
    BACKUP_ARCHIVE_STAMP=/var/lib/apt/periodic/backup-archive-stamp
    if check_stamp $BACKUP_ARCHIVE_STAMP $BackupArchiveInterval; then
        if [ $({(cd $Cache 2>/dev/null; find . -name "*.deb"); (cd $Back0 2>/dev/null;find . -name "*.deb") ;}| sort|uniq -u|wc -l) -ne 0 ]; then
            mkdir -p $Back
            rm -rf $Back$((${BackupLevel}-1))
            for y in $(seq $((${BackupLevel}-1)) -1 1); do
                eval BackY=${Back}$y
                eval BackZ=${Back}$(($y-1))
                if [ -e $BackZ ]; then
                    mv -f $BackZ $BackY ;
                fi
            done
            cp -la $Cache $Back ; mv -f $BackX $Back0
            update_stamp $BACKUP_ARCHIVE_STAMP
            debug_echo "backup with hardlinks. (success)"
        else
            debug_echo "skip backup since same content."
        fi
    else
        debug_echo "skip backup since too new."
    fi
}

# sleep for a random interval of time (default 30min)
# (some code taken from cron-apt, thanks)
random_sleep()
{
    RandomSleep=1800
    eval $(apt-config shell RandomSleep APT::Periodic::RandomSleep)
    if [ $RandomSleep -eq 0 ]; then
        return
    fi
    if [ -z "$RANDOM" ] ; then
        # A fix for shells that do not have this bash feature.
        RANDOM=$(dd if=/dev/urandom count=1 2> /dev/null | cksum | cut -c"1-5")
    fi
    TIME=$(($RANDOM % $RandomSleep))
    debug_echo "sleeping for $TIME seconds"
    sleep $TIME
}


debug_echo()
{
    # Display message if $VERBOSE >= 1
    if [ "$VERBOSE" -ge 1 ]; then
        echo $1 1>&2
    fi
}

# ------------------------ main ----------------------------

# Backup the 7 last versions of APT's extended_states file
# shameless copy from dpkg cron
if cd /var/backups ; then
    if ! cmp -s apt.extended_states.0 /var/lib/apt/extended_states; then
        cp -p /var/lib/apt/extended_states apt.extended_states
        savelog -c 7 apt.extended_states >/dev/null
    fi
fi

# check apt-config exstance
if ! which apt-config >/dev/null ; then
        exit 0
fi

# check if the user really wants to do something
AutoAptEnable=1  # default is yes
eval $(apt-config shell AutoAptEnable APT::Periodic::Enable)

if [ $AutoAptEnable -eq 0 ]; then
    exit 0                                                                                                                                                   
fi

# Set VERBOSE mode from  apt-config (or inherit from environment)
VERBOSE=0
eval $(apt-config shell VERBOSE APT::Periodic::Verbose)
debug_echo "verbose level $VERBOSE"
if [ "$VERBOSE" -le 2 ]; then
    # quiet for 0,1,2
    XSTDOUT=">/dev/null"
    XSTDERR="2>/dev/null"
    XAPTOPT="-qq"
    XUUPOPT=""
else
    XSTDOUT=""
    XSTDERR=""
    XAPTOPT=""
    XUUPOPT="-d"
fi
if [ "$VERBOSE" -ge 3 ]; then
    # trace output
    set -x
fi

# laptop check, on_ac_power returns:
#      0 (true)    System is on main power
#      1 (false)  System is not on main power
#      255 (false) Power status could not be determined
# Desktop systems always return 255 it seems
if which on_ac_power >/dev/null; then
    on_ac_power
    POWER=$?
    if [ $POWER -eq 1 ]; then
        debug_echo "exit: system NOT on main power"
        exit 0
    elif [ $POWER -ne 0 ]; then
        debug_echo "power status ($POWER) undetermined, continuing"
    fi
    debug_echo "system is on main power."
fi

# check if we can lock the cache and if the cache is clean
if which apt-get >/dev/null && ! eval apt-get check -f $XAPTOPT $XSTDERR ; then
    debug_echo "error encountered in cron job with \"apt-get check\"."
    exit 0
fi

# Global current time in seconds since 1970-01-01 00:00:00 UTC
now=$(date +%s)

# Support old Archive for compatibility.
# Document only Periodic for all controling parameters of this script.

UpdateInterval=0
eval $(apt-config shell UpdateInterval APT::Periodic::Update-Package-Lists)

DownloadUpgradeableInterval=0
eval $(apt-config shell DownloadUpgradeableInterval APT::Periodic::Download-Upgradeable-Packages)

UnattendedUpgradeInterval=0
eval $(apt-config shell UnattendedUpgradeInterval APT::Periodic::Unattended-Upgrade)

AutocleanInterval=0
eval $(apt-config shell AutocleanInterval APT::Periodic::AutocleanInterval)

BackupArchiveInterval=0
eval $(apt-config shell BackupArchiveInterval APT::Periodic::BackupArchiveInterval)

Debdelta=1                                                                                                                                                   
eval $(apt-config shell Debdelta APT::Periodic::Download-Upgradeable-Packages-Debdelta)

# check if we actually have to do anything that requires locking the cache
if [ $UpdateInterval -eq 0 ] &&
  [ $DownloadUpgradeableInterval -eq 0 ] &&
  [ $UnattendedUpgradeInterval -eq 0 ] &&
  [ $BackupArchiveInterval -eq 0 ] &&
  [ $AutocleanInterval -eq 0 ]; then

    # check cache size
    check_size_constraints

    exit 0
fi

# deal with BackupArchiveInterval
do_cache_backup $BackupArchiveInterval

# sleep random amount of time to avoid hitting the
# mirrors at the same time
random_sleep

# include default system language so that "apt-get update" will
# fetch the right translated package descriptions
if [ -r /etc/default/locale ]; then
    . /etc/default/locale
    export LANG LANGUAGE LC_MESSAGES LC_ALL
fi

# update package lists
UPDATED=0
UPDATE_STAMP=/var/lib/apt/periodic/update-stamp
if check_stamp $UPDATE_STAMP $UpdateInterval; then
    # check for a new archive signing key (against the master keyring)
    if eval apt-key net-update $XSTDERR; then
      debug_echo "apt-key net-update (success)"
    else
      debug_echo "apt-key net-update (failure)"
    fi
    # run apt-get update
    if eval apt-get $XAPTOPT -y update $XSTDERR; then
        debug_echo "download updated metadata (success)."
        if which dbus-send >/dev/null && pidof dbus-daemon >/dev/null; then
            if dbus-send --system / app.apt.dbus.updated boolean:true ; then
                debug_echo "send dbus signal (success)"
            else
                debug_echo "send dbus signal (error)"
            fi
        else
            debug_echo "dbus signal not send (command not available)"
        fi
        update_stamp $UPDATE_STAMP
        UPDATED=1
        # now run apt-xapian-index if it is installed to ensure the index
        # is up-to-date
        if [ -x /usr/sbin/update-apt-xapian-index ]; then
            nice ionice -c3 update-apt-xapian-index -q
        fi
    else
        debug_echo "download updated metadata (error)"
    fi
else
    debug_echo "download updated metadata (not run)."
fi

# download all upgradeable packages (if it is requested)
DOWNLOAD_UPGRADEABLE_STAMP=/var/lib/apt/periodic/download-upgradeable-stamp
if [ $UPDATED -eq 1 ] && check_stamp $DOWNLOAD_UPGRADEABLE_STAMP $DownloadUpgradeableInterval; then                                                         
    if [ $Debdelta -eq 1 ]; then
        debdelta-upgrade >/dev/null 2>&1 || true
    fi
    if  eval apt-get $XAPTOPT -y -d dist-upgrade $XSTDERR; then
        update_stamp $DOWNLOAD_UPGRADEABLE_STAMP
        debug_echo "download upgradable (success)"
    else
        debug_echo "download upgradable (error)"
    fi
else
    debug_echo "download upgradable (not run)"
fi

# auto upgrade all upgradeable packages
UPGRADE_STAMP=/var/lib/apt/periodic/upgrade-stamp
if [ $UPDATED -eq 1 ] && which unattended-upgrade >/dev/null && check_stamp $UPGRADE_STAMP $UnattendedUpgradeInterval; then
    if unattended-upgrade $XUUPOPT; then
        update_stamp $UPGRADE_STAMP
        debug_echo "unattended-upgrade (success)"
    else
        debug_echo "unattended-upgrade (error)"
    fi
else
    debug_echo "unattended-upgrade (not run)"
fi

# autoclean package archive
AUTOCLEAN_STAMP=/var/lib/apt/periodic/autoclean-stamp
if check_stamp $AUTOCLEAN_STAMP $AutocleanInterval; then
    if  eval apt-get $XAPTOPT -y autoclean $XSTDERR; then
        debug_echo "autoclean (success)."
        update_stamp $AUTOCLEAN_STAMP
    else
        debug_echo "autoclean (error)"
    fi
else
    debug_echo "autoclean (not run)"
fi

# check cache size
check_size_constraints

#
#    vim: set sts=4 ai :
#

I don't knowingly use prelink.

ozanbaba 12-13-2010 03:51 AM

That's just weird now. I don't see anything there that would cause a segfault. I don't think Ubuntu uses prelink[1] by default. My wild guess is that it's something to do with network communication. Try a manual update.

[1] Non-prelinked binaries can segfault when used with prelinked libraries. I know a prelinked OpenSSL library gave me segfaults while coding.
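
If a manual update works, you could also try running the cron script itself by hand to see exactly where it dies, e.g. under shell trace, or with the script's own verbose mode turned up (the 02periodic file and the Verbose "3" = trace setting are taken from the comments at the top of the script you posted):

Code:

# run the cron job by hand with shell tracing (as root)
sh -x /etc/cron.daily/apt

# or enable the script's own trace mode via apt config
echo 'APT::Periodic::Verbose "3";' >> /etc/apt/apt.conf.d/02periodic
/etc/cron.daily/apt
# (remember to remove that line again afterwards)
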

devnull10 12-13-2010 04:51 PM

running
Code:

apt-get update; apt-get upgrade
works fine, which I believe is the Ubuntu equivalent of running "-current"?

Should I just remove it from cron and then update manually? It's not a big deal really; there just doesn't seem much point in running it if it's going to segfault! It's strange that it works fine when run normally, though (i.e. running the apt script from the shell prompt). I have removed the sleep from the file (so I don't have to wait up to half an hour, lol).
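
(For the record, it looks like the delay can also be switched off without editing the script, since random_sleep() reads APT::Periodic::RandomSleep from apt's config; untested, just going by the script above:)

Code:

# skip the up-to-30-minute random delay without touching the script
echo 'APT::Periodic::RandomSleep "0";' >> /etc/apt/apt.conf.d/02periodic
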

devnull10 12-13-2010 04:54 PM

Seems I might have answered this myself (well, found someone else who has answered it! :) )

http://www.tummy.com/journals/entrie...0101203_060502

I've upgraded the python package and will now see what happens on the next cron run...
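
If it is the same problem, it should be reproducible by running the indexer the way the apt script does; going by the script above, that's the command it runs after a successful "apt-get update", and I believe it's a python script:

Code:

# the command the apt cron script runs after a successful update
nice ionice -c3 update-apt-xapian-index -q
echo "exit status: $?"   # 139 would mean the process was killed by SIGSEGV
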

