Old 10-23-2012, 12:33 PM   #1
lce411
Member
 
Registered: Jul 2012
Posts: 50

Rep: Reputation: Disabled
Same script working in one environment, but not another


We have two environments, Test and Pre-Production, and they are supposed to be identical. We have scripts that do certain tasks, such as checking for yum updates, resetting passwords, and deleting accounts. Recently the scripts have stopped working in the Test environment. They run fine in Pre-Production, but now I get asked for a password each time a server is accessed and it still doesn't perform the desired task.

I realize this may be a vague problem description, but what could cause the same script to behave differently in two (supposedly) similar environments?
 
Old 10-23-2012, 01:15 PM   #2
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
Quote:
Originally Posted by lce411 View Post
...now I get asked for a password each time a server is accessed and it still doesn't perform the desired task.
...
Are you running this over ssh (with key(s))?

If the script is suitable for posting here, can you?

Thank you,
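
If keys are in play, a quick sanity check before digging into the script is to test one host directly. A minimal sketch, assuming a placeholder hostname ("server01") and the standard OpenSSH tools:
Code:
#!/bin/bash
# Quick check of key-based (passwordless) ssh access to a single server.
# "server01" is a placeholder hostname; substitute one of your own.
HOST=server01

# Generate a key pair only if one doesn't already exist.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Push the public key to the server (prompts for the password once).
ssh-copy-id "$HOST"

# BatchMode=yes makes ssh fail instead of falling back to a password prompt,
# so this shows immediately whether key authentication is actually working.
ssh -o BatchMode=yes "$HOST" true && echo "key auth OK" || echo "key auth NOT working"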
 
Old 10-23-2012, 01:51 PM   #3
lce411
Member
 
Registered: Jul 2012
Posts: 50

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by Habitual View Post
Are you running this over ssh (with key(s))?

If the script is suitable for posting here, can you?

Thank you,
I had to reset my password within the environment, so after that I recreated my DSA keys and pushed them to all the servers. That solved the problem of being prompted for a password. But when I run the script in question, it still doesn't do what it should. We have cron jobs on each server that check for updates every Monday, and this script is supposed to copy all of those results back and consolidate them into a single file for importing into Excel. It's not doing that. Here is the script:
Code:
#!/bin/bash
#
# Script to collect the monthly "yum update" outputs on all servers listed on the command line.
# The script scp's the output of "yum update" from $file_dir (/var/tmp) on each of the servers
# (created using the "yum-check-updates.sh" script run on each machine every Monday via cron),
# then consolidates the list into one, tab-delimited file to be imported into MS Excel.
#
# The script expects a list of servers on the command line to collect the yum check files from.
#
# Example:  "get-yum-update-list.sh <list of servers>"
#

# Check for at least 1 argument
if [ $# -lt 1 ]; then
  echo "Usage: $0 <list of servers>"
  echo "Exiting ..."
  echo
  exit 1
fi

# If some servers are not accessible from this server, then set this variable to the
# hostname of the server that has access. For example, to the Dev Enclave servers (that ustc-mgmt
# doesn't have access to).  It's assumed that this server has a copy of this script and a
# "linux.txt" file containing the list of remote servers to collect the data from in your
# home directory.
# Do not list the servers on the command line that this server does not have access to, but
# make sure the file "/home/dsmith/linux.txt" on REMOTE_HOST has the list of servers in it.
REMOTE_HOST=na

# Directory to find and save files
file_dir=/var/tmp

# Copy the files to this server and get the files from the servers accessible from ustc-sftp
# Set the "copy" variable to "1" if you don't want to scp the yum files from the other servers
copy=0
if [ $copy = 0 ]; then
for host in "$@"
do
  # If REMOTE_HOST is set, then get yum info for the remotely accessible servers
  if [ "x$host" == "x$REMOTE_HOST" ]
  then
    # Single quotes so `cat linux.txt` runs on REMOTE_HOST (where linux.txt lives), not locally
    ssh $REMOTE_HOST '/usr/local/sbin/get-yum-update-list.sh `cat linux.txt`' 2>/dev/null
  fi

  # Get the yum info from all the servers
  scp $host:$file_dir/*.yum-check.* $file_dir 2>/dev/null
  chmod g+rw $file_dir/*.yum-check.* 2>/dev/null
  chgrp wheel $file_dir/*.yum-check.* 2>/dev/null
done
fi

# Get the latest Monday's date from the output file of this server
DATE=`ls -1tr $file_dir/*.yum-check* | tail -1 | cut -d. -f3`

# Now consolidate all the yum info into one, space-delimited file

# Get a list of unique packages (including their architecture and version)
declare -a data
cd $file_dir

# The first element in the "data" array is the header line of the output file, which
# includes all the hostnames from the <hostname>.yum-check filenames.
hosts=`ls -1tr *.yum-check.* | awk -F. '{print $1}' | tr '\n' ' ' | sed 's/ *$//'`
data[0]="Package Task Arch Version Size Priority Notes $hosts"

# Get all unique packages to be updated on any of the boxes and add them to the "data" array
#cat $file_dir/*.yum-check.${DATE} | sed '=' | sed 'N;s/\n/ /g' | tr -s ' ' | sort -u -t" " -k2,5 | sort -n | awk '{print $1,$2,$3","$4","$5,$6}' > $file_dir/all-yum-updates.${DATE}
cat $file_dir/*.yum-check.${DATE} | cut -d" " -f-4 | sort -u | sed "=" | sed 'N;s/\n/ /g' > $file_dir/all-yum-updates.${DATE}
chmod g+rw $file_dir/all-yum-updates.${DATE}
chgrp wheel $file_dir/all-yum-updates.${DATE}

#cat $file_dir/*.yum-check.${DATE} | sed '=' | sed 'N;s/\n/ /g' | tr -s ' ' | sort -u -t" " -k2,5 | sort -n | while read index task pkg_name arch vers hostname
while read index task pkg_name arch vers
do
  # Add the package info to the beginning of the "data" element
  data[$index]="$pkg_name $task $arch $vers tbd tbd tbd"
  # Check for this package in each *yum-check* file and mark the array with an "x" for each server
  for yum_file in `ls -1tr *.yum-check.*`
  do
    if grep -q "${task} $pkg_name $arch $vers" "$yum_file"; then
      data[$index]="${data[$index]} y"
    else
      data[$index]="${data[$index]} n"
    fi
  done
done < $file_dir/all-yum-updates.${DATE}

# Add all of the updates (in the "data" array) into a file
i=0
while [ $i -lt ${#data[*]} ]
do
  echo ${data[$i]} >> $file_dir/Linux_USTC_Updates_required_`date +%b%Y`.txt
  i=$(( $i + 1 ))
done

exit
 
Old 10-23-2012, 08:09 PM   #4
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,359

Rep: Reputation: 2751
Well, if the script hasn't changed, then it's an environment problem.
Exactly what is not happening that should?
Does this script produce any logs of its activity?
Have you tried running it manually, ideally one section at a time to check the cmds?
Try
Code:
#!/bin/bash
set -xv
and run manually.
If it's OK, it means the cron environment is set wrong.
Generally it's a good idea to give the absolute path to all commands/files used in a cron job, or to set the appropriate environment variables (e.g. $PATH) at the top of the script.

You can collect the output from the 'set -xv' even under cron with:
Code:
m h dom mon dow /path/to/script >/var/log/mylog.log 2>&1
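To make that concrete (the schedule and paths below are placeholders, not taken from this thread), the top of the script might set its own environment:
Code:
#!/bin/bash
set -xv
# cron starts with a minimal environment, so set PATH explicitly
# instead of relying on whatever an interactive shell would provide
export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
and the crontab entry, with the fields spelled out, would be something like:
Code:
# minute hour day-of-month month day-of-week   command
30 6 * * 1 /path/to/script >/var/log/mylog.log 2>&1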
 
Old 10-24-2012, 10:41 AM   #5
lce411
Member
 
Registered: Jul 2012
Posts: 50

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by chrism01 View Post
Well, if the script hasn't changed, then it's an environment problem.
Exactly what is not happening that should?
Does this script produce any logs of its activity?
Have you tried running it manually, ideally one section at a time to check the cmds?
Try
Code:
#!/bin/bash
set -xv
and run manually.
If it's OK, it means the cron environment is set wrong.
Generally it's a good idea to give the absolute path to all commands/files used in a cron job, or to set the appropriate environment variables (e.g. $PATH) at the top of the script.

You can collect the output from the 'set -xv' even under cron with:
Code:
m h dom mon dow /path/to/script >/var/log/mylog.log 2>&1
The cron jobs are working. It's the script that we run manually that isn't working. The cron jobs create a text file with a list of applicable updates for that particular RHEL box. The manually-run script is used to pull all those files to a management server and parse them into a single file for import into Excel. Something has happened and this script is no longer pulling the files from all the servers. I don't see anything in the logs, and no output is printed to the screen. It appears to run, but when I check the directory where the files should be, the most recent files are not there.

Update: I ran the script in question in verbose mode and it ran all the way through without any errors, but it still did not copy any files over.
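
One thing worth noting from the script itself: the ssh/scp/chmod lines all end in 2>/dev/null, so any "No such file or directory" or authentication error from scp is silently discarded, and set -xv only shows the commands, not the stderr they produce. A minimal sketch of testing the copy step against a single server by hand, with errors visible (the hostname is a placeholder):
Code:
#!/bin/bash
# Test the collection step against one server without suppressing errors.
# "server01" is a placeholder; file_dir matches the /var/tmp used in the script.
host=server01
file_dir=/var/tmp

# Does the remote side actually have this week's yum-check files?
ssh "$host" "ls -l $file_dir/*.yum-check.*"

# Try the copy with stderr visible (no 2>/dev/null) and report the exit status.
scp "$host:$file_dir/*.yum-check.*" "$file_dir"
echo "scp exit status: $?"
If the scp fails here, at least the reason will be visible instead of being thrown away.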

Last edited by lce411; 10-24-2012 at 02:16 PM.
 
  

