Linux - Security: This forum is for all security-related questions. Questions, tips, system compromises, firewalls, etc. are all included here.
12-09-2005, 04:42 AM | #1
LQ Newbie | Registered: Dec 2005 | Location: Hyderabad, India | Distribution: Red Hat | Posts: 1
Inbound web URL filtering
Hi all,
Could anyone tell me how I can filter the inbound URLs of our server? I mean the URL requests sent by people trying to access our server. How can I use a firewall for that?
thanks and regards,
-anu
12-09-2005, 08:01 AM | #2
Senior Member | Registered: Mar 2003 | Distribution: Fedora | Posts: 3,658
There are a number of different ways of doing this. With the firewall you can use iptables string matching to match certain URLs and reject those packets; however, Apache's mod_rewrite module is a much better way to do any kind of URL filtering or modification. There is a good guide to using it in the documentation at www.apache.org.
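To make the two approaches concrete, here is a minimal sketch of each. The blocked path /admin is only a placeholder example, not anything from the original posts:

```shell
# iptables string match: drop HTTP packets whose payload contains
# a given URL fragment (requires the xt_string match extension;
# --algo selects the search algorithm and is mandatory)
iptables -A INPUT -p tcp --dport 80 -m string --string "GET /admin" --algo bm -j DROP
```

```apache
# mod_rewrite equivalent: return 403 Forbidden for the same path
RewriteEngine On
RewriteRule ^/admin - [F]
```

Note that the iptables string match only inspects individual packets, so a URL split across packet boundaries can slip past it; mod_rewrite operates on the fully reassembled HTTP request, which is one reason it is the better tool here.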
12-10-2005, 04:45 AM | #3
LQ Newbie | Registered: Dec 2005 | Location: Hyderabad, India | Distribution: Red Hat | Posts: 5
Thanks Boss,
But I want to reject all URLs which are not on my web server, in other words, URLs which my web server can't serve. Or is there any way I can maintain a database of my web server's URLs, so that when a user requests a page the server checks it against the database and serves the request only if it is found there?
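The database idea described here can be sketched with mod_rewrite's RewriteMap directive, checking each request against a plain-text file of allowed URLs. The file path and entries below are hypothetical:

```apache
# /etc/httpd/conf/allowed-urls.txt contains one "key value" pair
# per line, e.g.:
#   /index.html ok
#   /products.html ok
RewriteEngine On
RewriteMap allowed txt:/etc/httpd/conf/allowed-urls.txt
# look the request URI up in the map; fall back to "deny" if absent
RewriteCond ${allowed:%{REQUEST_URI}|deny} =deny
RewriteRule .* - [F]
```

RewriteMap must be declared in the server (or virtual host) configuration, not in .htaccess, and Apache caches the text file, so lookups stay cheap even with many entries.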
Thanks and regards,
ANU
12-11-2005, 10:30 AM | #4
Senior Member | Registered: Mar 2003 | Distribution: Fedora | Posts: 3,658
Using iptables is probably not a good way of doing it, then. An application-level firewall/proxy like Zorp, or even Squid, might work better. mod_rewrite has a -U test that can be used to match URLs that don't exist, but it can use a lot of resources doing so (it makes an internal subrequest for each URL). I also don't see why the standard HTTP 404 status code isn't OK here. It might be a bit clearer if you explain what you are trying to do (or prevent). For example, if you are trying to block malicious HTTP requests, then something like mod_security may be more appropriate.
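As a rough sketch, the -U test and a couple of mod_security rules might look like the following. The mod_security directives use the 1.x syntax current at the time of this thread, and the filter patterns are only illustrative examples:

```apache
# mod_rewrite: forbid any URL that an internal subrequest says
# does not exist (expensive: one extra subrequest per hit)
RewriteEngine On
RewriteCond %{REQUEST_URI} !-U
RewriteRule .* - [F]
```

```apache
# mod_security (1.x syntax): reject requests with suspicious content
SecFilterEngine On
SecFilter "\.\./"                  # directory traversal attempts
SecFilterSelective ARGS "/bin/sh"  # shell references in query arguments
```

The mod_security approach scales better for blocking malicious requests, since it inspects the full request (headers, arguments, body) without issuing subrequests.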
12-11-2005, 11:45 PM | #5
LQ Newbie | Registered: Dec 2005 | Location: Hyderabad, India | Distribution: Red Hat | Posts: 5
|
Thanks Boss,
I was also thinking of mod_security; I just wanted confirmation from you. Thanks once again.
Regards,
ANU