Linux - Newbie

This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
I am trying to figure out how to monitor a website through a cron job with wget or curl.

What I want:
Let my Linux Mint VM (at home) automatically open or fetch something from a self-signed website on the company network, to simulate an out-of-company-network visit to the site as closely as possible.
What I have:
- the URL (of course)
- a self-signed user certificate (.p12), already converted with openssl into *crt.pem and *key.pem
This seemed to work for someone who searched and tried a lot: link
But do I need a server cert as well? The "--ca-cert=/etc/ssl/certs/winhostname.pem" - what is this third certificate?
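For reference, the .p12-to-PEM conversion mentioned above can be sketched like this. Every filename and the password are placeholders, and the first two commands only create a throwaway bundle so the sketch is self-contained; with a real company-issued user.p12 you would skip them:

```shell
# Demo setup only: create a throwaway self-signed key/cert pair and bundle
# it into a .p12 so the extraction commands below have something to work on.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo-user" \
    -keyout demo-key.pem -out demo-crt.pem -days 1
openssl pkcs12 -export -in demo-crt.pem -inkey demo-key.pem \
    -out user.p12 -passout pass:secret

# The actual conversion: pull the client certificate and the private key
# out of the PKCS#12 bundle (-nodes leaves the key unencrypted on disk).
openssl pkcs12 -in user.p12 -passin pass:secret -clcerts -nokeys -out user-crt.pem
openssl pkcs12 -in user.p12 -passin pass:secret -nocerts -nodes -out user-key.pem
```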
In the end it does not need to be wget or curl. I just need timestamped results showing whether the site is reachable or not, which I can turn into a chart. Pinging the web server does not help: the server itself is always running, but the horrible tool reached through this site is what I actually want to monitor.
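A minimal sketch of such a timestamped check; the URL, log path and timeout are assumptions, not from the thread:

```shell
#!/bin/sh
# check_site.sh - minimal reachability logger; URL and LOG are placeholders.
# Run it from cron, e.g.:  */5 * * * * /home/me/check_site.sh
# Keeping the date format inside a script also sidesteps crontab's special
# handling of the % character.
URL="https://subdomain.domain.tld/index.php"
LOG="./sitewatch.log"

# -s: quiet, -k: accept the self-signed cert, --max-time: don't hang forever.
if curl -sk --max-time 10 -o /dev/null "$URL"; then
    status=UP
else
    status=DOWN
fi
echo "$(date '+%Y-%m-%d %H:%M:%S') $status" >> "$LOG"
```

One line per run, "2024-01-01 12:00:00 UP" style, is trivial to chart later.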
The "--ca-cert=/etc/ssl/certs/winhostname.pem" what is this third certificate?
It’s the certificate that identifies the Certificate Authority that issued the web site’s certificate. (To obtain an RHCE certificate, not to be confused with a CA or RHCA certificate, you must be able to say that sentence three times in a row without stuttering.)
My totally unauthoritative explanation: a certificate is like an ID card or passport. It is issued by some authority, and you also need a copy of that authority’s passport for things to work. Certificates of well-known CAs like Comodo or Thawte are deployed when you install Mint, but wget and curl don’t know your company’s CA. Hence the need to provide the CA’s certificate explicitly on the command line.
If you are just interested in whether or not the site is responding at the application level, use curl -k or wget --no-check-certificate. Either will ignore the self-signed cert and connect anyway.
Can I open it through the browser with the imported cert? Yes.
=/
You still need whatever other arguments you want to supply, including the target host. I meant that you just add -k or --no-check-certificate to the existing command (whatever one you are using) in your cron.
You still need whatever other arguments you want to supply, including the target host. I meant that you just add -k or --no-check-certificate to the existing command (whatever one you are using) in your cron.
Yes, correct, that is what I meant. I'm not sure why you would get resolver errors then. Since DNS resolution works for you in a browser, what's left is a mistyped command or hostname.
Try something like this from a shell prompt just to see if it works.
Code:
curl -I -k https://subdomain.domain.tld/index.php
The -I makes curl send a HEAD request, so the response is limited to the HTTP headers and your screen won't fill with web content.
Well, we get sent the client cert, import it into our browser, and only then can we open that website. As I understand it, there is no authority?!
To close this: The browser keeps a list of CA certificates. When you import a web site's certificate, the browser will check the issuing authority against this list. If the result is negative, you get the usual warning that something is wrong with the web site's security.
With curl's --cacert option, you effectively add that CA cert to curl's list.
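Putting the pieces together, a hedged sketch (host and file paths are placeholders; curl's actual option spellings are --cert, --key and --cacert, while wget uses --certificate, --private-key and --ca-certificate):

```shell
# The placeholder host is unreachable outside the company network, so this
# sketch only reports curl's exit status instead of failing outright.
status="site not reachable"
if curl -sI --max-time 5 \
        --cert /home/me/user-crt.pem \
        --key /home/me/user-key.pem \
        --cacert /etc/ssl/certs/winhostname.pem \
        https://subdomain.domain.tld/index.php >/dev/null 2>&1; then
    status="site reachable"
fi
# Record the result; a real cron job would append to a log instead.
echo "$status" | tee ./curl_status.txt
```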
Or so I understand it.
In case the web site also requires a client certificate, I would guess that the web site would also check it against a CA list.
The command works just fine when I copy-paste it into a terminal, with or without quotes. After fiddling around, I found out that it works fine without the ts "#%Y-%m-%d %H:%M:%S" part.
I also tried to debug cron by enabling this line in "/etc/rsyslog.d/50-default.conf":
Code:
cron.* /var/log/cron.log
and then looked at "/var/log/cron.log".
I found that there is another file called "/var/log/cron.log.1", but both are empty.
Any advice?
I am also searching for an easy way to add an entry when "grep Login" gets no result. But a missing line would do as well.
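One way to write an explicit entry instead of a missing line when "grep Login" matches nothing, sketched with hypothetical filenames:

```shell
# page.html is a stand-in for whatever curl/wget fetched; here we fake one
# so the sketch is self-contained.
printf '<title>Login</title>\n' > page.html

# If the fetched page contains "Login", the login form was served; if not,
# log an explicit failure marker rather than writing nothing.
if grep -q "Login" page.html; then
    echo "$(date '+%Y-%m-%d %H:%M:%S') OK" >> result.log
else
    echo "$(date '+%Y-%m-%d %H:%M:%S') NO-LOGIN" >> result.log
fi
```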
I'm not sure what 'ts' does (generates a timestamp? If it's the one from moreutils, it prefixes each input line with a timestamp); if it is a script you have written, make sure you use the full path to it in the cron entry, since a non-standard path won't be in $PATH in that environment. Also note that '%' is special in crontab entries: per crontab(5), an unescaped percent sign is turned into a newline, so a format string like "%Y-%m-%d %H:%M:%S" must be written with \% (e.g. \%Y-\%m-\%d) inside a crontab. That would explain why the command works in a terminal but fails under cron.
Also, in terms of debugging cron entries: you can have cron send you an email with the output (if any) after a command runs. Put this at the top of your crontab:
Code:
MAILTO=me@example.com
or you can capture output in a logfile by adding something like this to your crontab entry:
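Code (a hypothetical crontab entry; the schedule, script name and log path are placeholders):

```shell
# >> appends stdout to the logfile; 2>&1 sends stderr there as well.
# Remember to escape any % in the command part as \% inside a crontab.
*/5 * * * * /home/me/yourscript.sh >> /home/me/cron-output.log 2>&1
```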