[SOLVED] how to trap or capture an error from a hash check
Linux - General: This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
I have a script, working nicely, where I do an md5sum check on a file and print the result to a log:

md5sum -c mypath/myFile.mp4 | tee -a myLogPath/mylog.txt

My actual command, with my vars, is this:

md5sum -c "$pathdeliver/$f" | tee -a "$mypathdeliver/hashCheck_${ClientDrv}_${nTexDrv}.txt"

(Note the braces: without them, bash would look for a variable literally named ClientDrv_.) As you can see, I pipe into tee to save the output into a file, and as you know, one expects this to print to the terminal and into the txt file something like:

myFile.mp4: OK
Here is what I wish to do, and I am stumped as to how to do it.

IF the check is NOT ok, where the output would be:

myFile.mp4: FAILED
md5sum: WARNING: 1 computed checksum did NOT match

I would like to write a different txt file, call it an error log. So if FAILED, then I print something like the filename and "failed" into an error_Hash.txt. This error_Hash.txt would only be created if there is a fail.
#md5sum -c /media/C_001/UNTA_AR0776-686057-LT630-25.mkv.md5 | tee result.txt
md5sum -c UNTA_AR0776-686057-LT630-25.mkv.md5 | tee result.txt
#md5sum -c /media/C_001/UNTA_AR0776-686057-LT630-26.mkv.md5 | tee result.txt
#md5sum -c UNTA_AR0776-686057-LT630-26.mkv.md5 | tee result.txt
if [ $? -eq 0 ]; then
    echo "all is well"
else
    echo "something went wrong"
fi
That is not working, because $? after a pipeline is the exit status of the last command in the pipe (tee, which almost always succeeds), not of md5sum.
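One way to keep the tee and still test md5sum's own status is bash's PIPESTATUS array, which holds the exit status of each command in the most recent pipeline. A minimal sketch, with throwaway file names invented for the demo:

```shell
#!/bin/bash
# Sketch only: sample.mp4 and the temp dir are invented for illustration.
tmpdir=$(mktemp -d)
cd "$tmpdir" || exit 1

echo "hello" > sample.mp4
md5sum sample.mp4 > sample.mp4.md5   # record a good checksum
echo "tampered" > sample.mp4         # now corrupt the file

md5sum -c sample.mp4.md5 | tee -a result.txt
rc=${PIPESTATUS[0]}                  # md5sum's status; plain $? would be tee's

if [ "$rc" -ne 0 ]; then
    echo "sample.mp4 Fail" >> error_Hash.txt
fi
```

PIPESTATUS is a bash-ism; in a plain POSIX sh you would need to capture the output in a variable and test the assignment's exit status instead, as the reply below does.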
What I am presently doing, and I think it works well enough, is to utilize the file that I am tee-ing into: read its last line and check it for FAILED, like so:
#!/bin/bash
cd /media/C_001 || exit 1
for f in *.md5; do
    echo "working on $f"
    md5sum -c /media/C_001/"$f" | tee -a /media/C_001/result.txt
    lastLn=$( tail -n 1 /media/C_001/result.txt )
    echo "my last line was $lastLn"
    if [[ "$lastLn" =~ "FAILED" ]]; then
        echo "the hash is bad"
        echo "$f Fail" >> /media/C_001/err_log.txt
    else
        echo "all is well"
    fi
done
The classic method:
save the md5sum output in a variable,
check exit status,
print the variable.
Once you have the output in a variable you can print it twice instead of duplicating it with tee, and each print can be efficiently redirected to an output stream that is associated with a file just once, when the loop starts.
Code:
for f in *.md5; do
    echo "working on $f"
    if output=$(md5sum -c /media/C_001/"$f"); then
        : good
        printf "%s\n" "$output" >&3
    else
        : bad
        printf "%s\n" "$output" >&4
    fi
    printf "%s\n" "$output"
done 3>/media/C_001/result.txt 4>/media/C_001/err_log.txt
You mean that stderr (descriptor 2) is not captured? No problem, simply redirect it to the error-log descriptor:
Code:
output=$(md5sum -c /media/C_001/"$f") 2>&4
Or
Code:
output=$(md5sum -c /media/C_001/"$f" 2>&4)
Explanation for the latter: the (subshell) inside the command substitution inherits the file descriptors in the same way as it inherits the variables.
Of course you can also merge stderr into the output variable with 2>&1.
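As a sketch of that last variant, again with invented file names: putting 2>&1 inside the $( ) folds md5sum's WARNING line (stderr) into the variable along with the FAILED line (stdout), so both land in the same log entry.

```shell
#!/bin/bash
# Demo files are invented; the point is the 2>&1 inside $( ).
tmp=$(mktemp -d)
cd "$tmp" || exit 1

echo "data" > f.mkv
md5sum f.mkv > f.mkv.md5
echo "corrupt" > f.mkv               # force the check to fail

if output=$(md5sum -c f.mkv.md5 2>&1); then
    printf '%s\n' "$output" >> result.txt
else
    printf '%s\n' "$output" >> err_log.txt   # FAILED line plus WARNING line
fi
```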