Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
I have a scenario that for the moment I cannot resolve.
I wrote a script to make a dump/export of an Oracle database, and then put an entry in crontab so it is executed daily, for example.
The script is like below:
Code:
cat /home/oracle/scripts/db_backup.sh
#!/bin/ksh
#Backup export database script
#Created 05-04-2011
# * * *
DATE=`date +%d%m%Y-%H%M%S`
ARCHIVE_DIR=/home/oracle/arch
SCRIPTS_DIR=/home/oracle/scripts
USER=oracle
PASS=XXXXXXXX
(
echo "Starting database dump ..."
date
cd $ARCHIVE_DIR
exp $USER/$PASS FILE=filename_$DATE.dmp log=logfile_$DATE.log
echo "End of database dump ..."
date
) | tee $ARCHIVE_DIR/exp_logfile-$DATE.log 2>&1
Both the crontab entry and the script are executed as the oracle user. When I execute the script directly from the shell it runs correctly and the dump is also generated OK.
But from the cron job I get only the log part ("Starting/Stopping database dump ...") and not the export dump file and dump log file: the "exp ..." part.
With some help on the forums, the problem turned out to be related to the environment variables: running set
in the two contexts, shell and cron, gave different environment variables.
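A quick way to confirm such a mismatch is to dump both environments and diff them. This is a hypothetical one-off check, not part of the backup script; the /tmp file names are made up for illustration.

```shell
# From an interactive shell, dump the environment sorted for easy diffing
env | sort > /tmp/shell_env.txt

# From a temporary crontab entry (add it, wait a minute, then remove it):
#   * * * * * env | sort > /tmp/cron_env.txt

# Compare once both files exist; ORACLE_HOME, ORACLE_SID, LD_LIBRARY_PATH
# and PATH are the usual suspects when Oracle tools fail under cron
if [ -f /tmp/cron_env.txt ]; then
    diff /tmp/shell_env.txt /tmp/cron_env.txt
fi
```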
So I added at the beginning of the script this part:
. /home/oracle/.profile
Also running the cron job as root with this entry it works: 00 11 * * * su - oracle -c "/home/oracle/scripts/db_backup.sh 2>&1"
Everything is OK: the dump export and the dump log are created fine, but strangely the script stops/exits after the exp command and I don't get the last echo and date part of the script.
It is a minor problem, but I would just like to understand why it stops there.
Yes catkin, that means I'm getting these parts executed correctly:
Code:
echo "Starting database dump ..."
date
exp... dmp file... dmp_log_file...
and that's it; I'm not getting this part:
Code:
echo "End of database dump ..."
date
in the exp_logfile-$DATE.log file.
Thanks
And you do get the "End of database dump ..." when you run the script from the command line?
What is the exp command?
Regarding ) | tee $ARCHIVE_DIR/exp_logfile-$DATE.log 2>&1: the 2>&1 is directing the stderr of tee to stdout, which goes to cron. Presumably you want any stderr from the sub-shell in the log, in which case you want ) 2>&1 | tee $ARCHIVE_DIR/exp_logfile-$DATE.log. Bash processes the | before the 2>&1, so it has already directed the sub-shell's stdout along the pipe to tee by the time it processes 2>&1, and thus sends stderr to the same place; it's counter-intuitive that you can't simply read the code left-to-right.
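The difference between the two placements can be demonstrated with a small sketch (the log paths under /tmp are arbitrary):

```shell
# "out" goes to the subshell's stdout, "err" to its stderr.

# 2>&1 AFTER tee: it redirects tee's own stderr, not the subshell's,
# so "err" bypasses the pipe and never reaches the log file
( echo out; echo err >&2 ) | tee /tmp/after.log 2>&1

# 2>&1 BEFORE the pipe: the subshell's stderr joins its stdout,
# so both lines travel through the pipe to tee and land in the log
( echo out; echo err >&2 ) 2>&1 | tee /tmp/before.log
```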
Quote:
Originally Posted by catkin
and you do get the "End of database dump ..." when you run the script from the command line?
Yes, I get it when I run it from the command line.
Quote:
Originally Posted by catkin
What is the exp command?
It is the Oracle database export command, similar to mysqldump in MySQL.
Quote:
Originally Posted by catkin
Regarding ) | tee $ARCHIVE_DIR/exp_logfile-$DATE.log 2>&1: the 2>&1 is directing the stderr of tee to stdout, which goes to cron. Presumably you want any stderr from the sub-shell in the log, in which case you want ) 2>&1 | tee $ARCHIVE_DIR/exp_logfile-$DATE.log. Bash processes the | before the 2>&1, so it has already directed the sub-shell's stdout along the pipe to tee by the time it processes 2>&1, and thus sends stderr to the same place; it's counter-intuitive that you can't simply read the code left-to-right.
I tried several combinations with 2>&1: removing it from the script, putting it before the tee command, etc., but I'm not getting the expected results.
When I put 2>&1 before tee, I also get part of the exp command's log (which should go to its own log file), and when I remove 2>&1 from the script I get the same result as when it is after the tee command.
Maybe this is related to Oracle's exp command, but then why is it executed correctly from the command line? I've exported the env variables in the cron job just like the oracle user's.
The info, especially that all works as expected when the cron job is run by root as su - oracle -c "/home/oracle/scripts/db_backup.sh 2>&1", suggests that sourcing /home/oracle/.profile goes some way toward simulating what happens when oracle logs on, but not all the way.
What to do?
If your ksh is ksh88 then you should be able to fully simulate a login by sourcing /etc/profile and then $HOME/.profile (normally /home/oracle/.profile) according to my understanding of the ksh88 man page.
If your ksh is ksh93 then fully simulating a login is the same as above followed by sourcing any file named in the $ENV variable (from the ksh93 man page).
Alternatively, if you have bash available, you could change the first line of the script to #!/bin/bash --login. The --login tells bash to simulate a login, which includes sourcing /etc/profile, ~/.profile (provided neither ~/.bash_profile nor ~/.bash_login exists) and any file named in $ENV (see the Invocation section of the bash man page). If these shell startup files contain ksh-specific commands, not compatible with bash, then you may not achieve the necessary shell setup.
In all cases the technique will fail if the shell startup files require user input but that should not be so or su - oracle -c "/home/oracle/scripts/db_backup.sh 2>&1" would not work.
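A sketch of what the top of db_backup.sh might look like under this approach. This is untested and written in POSIX sh syntax (which ksh also accepts), using the standard startup file names from the man pages; adjust for the files actually present on the system.

```shell
#!/bin/sh
# Simulate a full login before anything else runs:
# source the system profile, then the user's, as login would
if [ -f /etc/profile ]; then
    . /etc/profile
fi
if [ -f "$HOME/.profile" ]; then
    . "$HOME/.profile"
fi
# ksh93 (and bash --login) additionally source the file named in $ENV
if [ -n "$ENV" ] && [ -f "$ENV" ]; then
    . "$ENV"
fi

# ... rest of the backup script unchanged ...
echo "environment initialised"
```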
Just to explain some details that maybe I hadn't made clear above.
When I said that
Quote:
when the cron job is run by root as su - oracle -c "/home/oracle/scripts/db_backup.sh 2>&1" it works
I meant only that the exp command part of the script, which wasn't working before, now works; BUT the result of the whole script was still not as expected: I wasn't (and am not) getting the last part of the script in my log file (echo "End of db ...").
To show some more info and just to understand the reasons of this I'm showing below what I did following your suggestions:
Code:
$ sh --version
GNU bash, version 3.2.25(1)-release (i686-redhat-linux-gnu)
Copyright (C) 2005 Free Software Foundation, Inc.
Code:
$ ls -lrt /bin/sh
lrwxrwxrwx 1 root root 4 Mar 18 10:08 /bin/sh -> bash
$ ls -lrt /bin/bash
-rwxr-xr-x 1 root root 729292 Oct 21 2008 /bin/bash
$ ls -lrt /bin/ksh
lrwxrwxrwx 1 root root 21 Mar 18 10:10 /bin/ksh -> /etc/alternatives/ksh
$ ls -l /etc/alternatives/ksh
lrwxrwxrwx 1 root root 10 Mar 18 10:10 /etc/alternatives/ksh -> /bin/pdksh
I tested all the possible combinations of changing the first line of the script as you suggested, with bash --login and also ksh, but no success from cron.
Running these versions of the script from the shell works as expected, with all the output.
Maybe I should see more on the Oracle side , regarding exp command and its parameters...
Very puzzling! The evidence suggests that running exp changes the stdout of the sub-shell created by the ( ... ) in the script. If exp is an executable, that is impossible; it would run in a new process, which could not affect its parent process's environment. Perhaps exp is an alias ... what is the output of type exp?
A workaround, pending finding out what is actually happening, might be to force the echo and date output into the log file with something like (untested):
Code:
#!/bin/ksh
#Backup export database script
#Created 05-04-2011
# * * *
DATE=`date +%d%m%Y-%H%M%S`
ARCHIVE_DIR=/home/oracle/arch
SCRIPTS_DIR=/home/oracle/scripts
USER=oracle
PASS=XXXXXXXX
LOGFILE=$ARCHIVE_DIR/exp_logfile-$DATE.log
(
echo "Starting database dump ..."
date
cd $ARCHIVE_DIR
exp $USER/$PASS FILE=filename_$DATE.dmp log=logfile_$DATE.log
echo "End of database dump ..." >> $LOGFILE
date >> $LOGFILE
) 2>&1 | tee $LOGFILE
Hi catkin,
Thank you for following me on this
So regarding your questions and suggestion please find below:
When I type only the exp command I get:
Code:
Export: Release 10.2.0.5.0 - Production on Wed Apr 20 14:34:04 2011
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Username:
It prompts for a username.
And when the full command is entered at the command line:
Code:
#exp $USER/$PASS FILE=filename_$DATE.dmp log=logfile_$DATE.log
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export done in WE8ISO8859P1 character set and AL16UTF16 NCHAR character set
server uses AL32UTF8 character set (possible charset conversion)
About to export specified users ...
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user USER
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user USER
...............
................
. . exporting table T_BATCH_ERROR_LOG 0 rows exported
................
. . exporting table T_USER_USR 130 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table T_USER_PROFILE_USR 130 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. exporting synonyms
. exporting views
. exporting stored procedures
. exporting operators
. exporting job queues
. exporting refresh groups and children
. exporting dimensions
. exporting statistics
Export terminated successfully with warnings.
After modifying the script with your suggestion, just to bypass this behaviour, I get this output in the $LOGFILE:
Code:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export done in WE8ISO8859P1 character set and AL16UTF16 NCHAR character set
server uses AL32UTF8 character set (possible charset conversion)
About to export specified users ...
..............
............
............
. .
.............
exporting table T_APP_DATE 1 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table T_BATCH_ERROR_LOGEnd of database dump ...
Wed Apr 20 14:38:04 CEST 2011
This was appended to logfile_$DATE.log (but right after the "T_BATCH_ERROR_LOG" part, not at the end of the log).
Maybe this is the reason why I didn't get the correct output in the $LOGFILE.
For the moment I made these modifications to the script:
Are $LOGFILE ($ARCHIVE_DIR/exp_logfile-$DATE.log) and exp's log (logfile_$DATE.log) intended to be the same file? If so, they are not guaranteed to be: if the clock has gone over a second between setting $LOGFILE and bash expanding the exp command, they will have different names. It was for this reason that I set a single variable for use both in the echos and on the exp command.
From the output you posted it looks as if exp starts something that continues to write to its log file after it returns. In this case a solution might be for the script to sleep for a second (or 5, to be on the safe side) after running exp and before echoing the final messages to the log file.
Quote:
Originally Posted by catkin
Are $LOGFILE ($ARCHIVE_DIR/exp_logfile-$DATE.log) and exp's log (logfile_$DATE.log) intended to be the same file? If so, they are not guaranteed to be: if the clock has gone over a second between setting $LOGFILE and bash expanding the exp command, they will have different names. It was for this reason that I set a single variable for use both in the echos and on the exp command.
The $LOGFILE and the exp's logfile_$DATE.log are 2 different log files.
That is the reason why I modified the script that way. The reason why I kept 2 log files for this script is that if I used only the $ARCHIVE_DIR/exp_logfile-$DATE.log I didn't get the expected output, the tee recorded only part of the exp's log (not the entire log) and also without the last two rows of the script (echo..., date)
Quote:
Originally Posted by catkin
From the output you posted it looks as if exp starts something that continues to write to its log file after it returns. In this case a solution might be for the script to sleep for a second (or 5, to be on the safe side) after running exp and before echoing the final messages to the log file.
Yes, that is correct: exp writes to its log file, and if I add a sleep of x seconds, I think x should be greater than the time the exp command takes to execute.
I will test this last part.
I modified the original script to sleep 15 seconds (because the exp command takes approximately 12 sec to complete), and the results are the same as my post #10 (http://www.linuxquestions.org/questi...5/#post4330483).
So for now I'm keeping the script with the redirection-and-append (>, >>) modification.
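For reference, that redirect-and-append approach might look like the sketch below. The /tmp paths and the placeholder standing in for exp are made up so the skeleton can be run anywhere; the real script keeps the /home/oracle paths, the .profile sourcing, and the real exp call shown in the comment.

```shell
#!/bin/sh
# Sketch: write the log with plain > / >> instead of piping through tee,
# so the final echo and date always reach the file in order.
ARCHIVE_DIR=/tmp/arch_demo          # real script: /home/oracle/arch
mkdir -p "$ARCHIVE_DIR"
DATE=`date +%d%m%Y-%H%M%S`
LOGFILE=$ARCHIVE_DIR/exp_logfile-$DATE.log

echo "Starting database dump ..." >  "$LOGFILE"
date                              >> "$LOGFILE"
cd "$ARCHIVE_DIR"
# real script:
#   exp $USER/$PASS FILE=filename_$DATE.dmp log=logfile_$DATE.log >> "$LOGFILE" 2>&1
echo "exp placeholder ran"        >> "$LOGFILE"
echo "End of database dump ..."   >> "$LOGFILE"
date                              >> "$LOGFILE"
```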
The strange part is that the log always stops recording at this point/table:
Code:
. . exporting table T_BATCH_ERROR_LOGEnd of database dump ...
Wed Apr 20 14:38:04 CEST 2011
and for that reason I don't get anything below it.