Old 07-18-2020, 05:04 AM   #16
orangepeel190
Member
 
Registered: Aug 2016
Posts: 69

Original Poster
Rep: Reputation: Disabled

Turbo:

I've created a script and called it "webchecker".
I've placed the following into the Bash script, and the results were as follows:


Code:
if /usr/bin/true; then
        echo "OK"
else
        echo "Not OK"
fi
#
if /usr/bin/false; then
        echo "OK"
else
        echo "Not OK"
fi
#
if curl --silent --head $url/$file | grep -q -c 1 -P '^HTTP/\w\.\w\s200\sOK'; then
        echo "OK"
else
        echo "Try later"
fi
Results were:
Quote:
./webchecker: line 205: /usr/bin/true: No such file or directory
Not OK
./webchecker: line 211: /usr/bin/false: No such file or directory
Not OK
grep: ^HTTP/\w\.\w\s200\sOK: No such file or directory
(23) Failed writing body
Try later
I have tried this on known working URLs and filenames, as well as bogus filenames (same URL), and the results were the same...
Any suggestions for making this work without needing to download the file every time?

Shouldn't make too much difference - this is running on a Pi
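
A side note on the grep error above: -c takes no argument, so grep reads the "1" as the pattern and the regex as a file name, and curl's "(23) Failed writing body" follows from grep exiting early. With the extra arguments dropped, that last check would look something like this (untested sketch, same $url and $file placeholders; the pattern still assumes an HTTP/1.x status line ending in "OK"):

Code:
# sketch: -q alone is enough here; the stray "-c 1" was being read
# as the pattern, pushing the regex into the file-name position
if curl --silent --head "$url/$file" | grep -qP '^HTTP/\w\.\w\s200\sOK'; then
        echo "OK"
else
        echo "Try later"
fi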
 
Old 07-18-2020, 05:13 AM   #17
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,333
Blog Entries: 3

Rep: Reputation: 3730
You've posted only an excerpt from the shell script. According to the error messages there are hundreds of lines. Start small: make a small separate script and then, when you have that working, port it into the larger monstrosity. Note that the paths might be different on various distros...
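
A minimal standalone test along those lines might look like this (a sketch; true and false are shell builtins, so no hard-coded /usr/bin path is needed, which sidesteps the "No such file or directory" errors):

Code:
#!/bin/bash
# sketch: exercise each branch with the shell builtins first,
# then swap the condition for the real curl check once this works
if true; then
        echo "OK"
else
        echo "Not OK"
fi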

See also: https://www.shellcheck.net/
 
1 member found this post helpful.
Old 07-18-2020, 05:16 AM   #18
pan64
LQ Addict
 
Registered: Mar 2012
Location: Hungary
Distribution: debian/ubuntu/suse ...
Posts: 21,970

Rep: Reputation: 7334
Quote:
Originally Posted by orangepeel190 View Post
pan64: that appears to download the file rather than checking whether it's available... is there an alternative to downloading, checking via a curl command or its result (potentially in the header)?
Why curl? There are other tools (wget was mentioned, but python/perl/whatever are OK too) that can handle this much better.
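
For example, wget can do the availability check without saving anything (a sketch, untested, using the same $url and $file placeholders; --spider asks the server about the resource instead of downloading it, and the exit status says whether it was found):

Code:
#!/bin/sh
# sketch: --spider checks the URL without fetching the body;
# exit status is 0 when the server says the file exists
if wget --quiet --spider "$url/$file"; then
  echo "File IS here!"
else
  echo "File NOT here"
fi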
 
Old 07-18-2020, 07:43 AM   #19
orangepeel190
Member
 
Registered: Aug 2016
Posts: 69

Original Poster
Rep: Reputation: Disabled
Thanks Turbo.

The code in the previous post was the only thing contained in the script... except for the #!/bin/bash line.

After bashing the keyboard, I have this working (it only creates a small file for comparison use):

Code:
curl -sIo check $url/$file
compare=$(grep "404" "check")
if [[ -n $compare ]]; then
        echo "NOPE = $(grep "404" "check")"
        echo "File NOT here"
else
        echo "YEES! = $(grep "200" "check")"
        echo "File IS here!"
fi
A touch rough, but it seems to do the trick without downloading the entire file (just the header), and it leaves the option of adding a command in the relevant if branch of the script if desired.

Thoughts or ways to better the script?
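
One possible refinement, keeping the header in a shell variable instead of the temporary "check" file (a rough sketch, untested, same $url/$file placeholders as above):

Code:
# sketch: capture the header once, then test the variable,
# so nothing is written to disk at all
header=$(curl -sI "$url/$file")
if printf '%s\n' "$header" | grep -q "404"; then
        echo "File NOT here"
else
        echo "YES = $(printf '%s\n' "$header" | grep "200")"
        echo "File IS here!"
fi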
 
Old 07-18-2020, 07:52 AM   #20
shruggy
Senior Member
 
Registered: Mar 2020
Posts: 3,677

Rep: Reputation: Disabled
See Turbocapitalist's answer in #13.
Code:
#!/bin/sh
if curl -sI $url/$file |grep -wq 404; then
  echo "File NOT here"
else
  echo "File IS here!"
fi
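
A variant that tests the status code itself, rather than grepping the headers for "404" (a sketch; curl's --write-out '%{http_code}' prints just the numeric code of the response):

Code:
#!/bin/sh
# sketch: -I asks for the header only, -o /dev/null discards it,
# and -w prints the numeric status code on stdout
status=$(curl -sI -o /dev/null -w '%{http_code}' "$url/$file")
if [ "$status" = 200 ]; then
  echo "File IS here!"
else
  echo "File NOT here (HTTP $status)"
fi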
 
1 member found this post helpful.
Old 07-18-2020, 01:45 PM   #21
scasey
LQ Veteran
 
Registered: Feb 2013
Location: Tucson, AZ, USA
Distribution: CentOS 7.9.2009
Posts: 5,735

Rep: Reputation: 2212
Quote:
Originally Posted by orangepeel190 View Post
Thanks scasey,

I was simply having an attempt at some scripting, rather than being "one of those people" who simply ask someone else to do all the work for them.

Yes, I am aware that -f checks whether the file is available on the local disk; maybe that "-f" test was not the best choice, given that the aim is to see whether the file is simply available on the website. I was interested to see where the scripting in #5 would best go, so I could give it a go.

The issue that I am seeing is that the script could download an error message posing as $file, which the system will see as a pass.

A classic example was this morning: I ran a script thinking it was downloading the file, yet when I explored deeper, the file only looked like the mp3 file; cat $file showed the message below:



The -f resulted in a "Yes, the file was downloaded" when it clearly was not the audio file. I am now having to add an extra conditional statement to the script to ensure the file is larger than, say, 1 MB. If the file is downloaded and larger than 1 MB, then "Success, the file was downloaded"; if not, delete the small file and error out, and try again later.

It would be good not to have to download the file if it is not the correct file, or not even available... this has grown into a somewhat bigger problem than a simple download script.

Happy to try as many options as possible to get a functioning script and learn in the process.
I appreciate the feedback and assistance with my steep learning curve...
I think you're missing my point. The variable is not dependent on the results of the curl/wget in any way, as coded. It merely contains the name of the file for which you are searching.

You need to capture the results of the web query into another variable and test the contents of that.

But you are correct that any query is probably going to give you a result. The only time I think that wouldn't happen is if the server is not found at all. I'm not familiar with curl/wget, but I think you'll need to search within the resulting response to see if it contains $file... or something like that. Another use for the new variable.
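
Something along those lines, perhaps (a sketch, untested; it assumes the server reports a Content-Length header in its HEAD response, which not every server does):

Code:
# sketch: capture the whole header response in a variable, then test
# the status line and the reported size before downloading anything
response=$(curl -sI "$url/$file")
length=$(printf '%s\n' "$response" | awk 'tolower($1)=="content-length:" {print $2}' | tr -d '\r')
if printf '%s\n' "$response" | head -n1 | grep -q " 200" && [ "${length:-0}" -gt 1000000 ]; then
    echo "File IS here and is larger than 1 MB"
else
    echo "File missing or too small - try later"
fi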
 
Old 07-19-2020, 01:47 AM   #22
orangepeel190
Member
 
Registered: Aug 2016
Posts: 69

Original Poster
Rep: Reputation: Disabled
A big thanks to those who assisted with this venture.
 
  

