Old 12-09-2005, 04:42 AM   #1
anoop_sweet
LQ Newbie
 
Registered: Dec 2005
Location: Hyderabad, India
Distribution: Red Hat
Posts: 1

Rep: Reputation: 0
Inbound web URL filtering


Hi all,
Could anyone tell me how I can filter the inbound URLs of our server? I mean the URL requests sent by people trying to access our server. How can I use firewalls for that?

thanks and regards,
-anu
 
Old 12-09-2005, 08:01 AM   #2
Capt_Caveman
Senior Member
 
Registered: Mar 2003
Distribution: Fedora
Posts: 3,658

Rep: Reputation: 69
There are a number of different ways of doing this. With the firewall, you can use iptables string matching to match certain URLs and then reject those packets. However, Apache's mod_rewrite module is a much better way to do any kind of URL filtering or modification. There is a good guide to using it in the www.apache.org documentation.
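For example, the iptables approach might look something like this (untested sketch; "/badpath" is just a placeholder, and depending on your kernel version the string match module may or may not want the --algo option):

Code:
# Drop inbound HTTP requests whose payload contains a given URL fragment
# (requires the iptables string match module)
iptables -A INPUT -p tcp --dport 80 -m string --string "GET /badpath" --algo bm -j DROP

The mod_rewrite equivalent would be roughly this in httpd.conf (again just a sketch with a made-up path):

Code:
# Return 403 Forbidden for anything under /badpath
RewriteEngine On
RewriteRule ^/badpath(/.*)?$ - [F]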
 
Old 12-10-2005, 04:45 AM   #3
ANU16
LQ Newbie
 
Registered: Dec 2005
Location: Hyderabad, India
Distribution: Red Hat
Posts: 5

Rep: Reputation: 0
Thanks Boss,
But I want to reject all the URLs which are not on my web server... in other words, URLs which my web server doesn't actually host... Or is there any possibility by which I can maintain a database of my web server's URLs, so that when a user requests a page, my server checks it against the database and serves the request only if the URL is found to be legal?
Thanks n regards,
ANU
 
Old 12-11-2005, 10:30 AM   #4
Capt_Caveman
Senior Member
 
Registered: Mar 2003
Distribution: Fedora
Posts: 3,658

Rep: Reputation: 69
Using iptables probably is not a good way of doing it, then. An application-level firewall/proxy like Zorp or even Squid might work better. mod_rewrite has a -U test that can be used in a RewriteCond to match URLs that don't exist on the server; however, it can use a lot of resources in doing that (it makes an internal sub-request for each URL). I also don't see why the standard HTTP 404 status code isn't OK here. It might be a bit more clear if you explain what you are trying to do (or prevent). For example, if you are trying to block malicious HTTP requests, then something like mod_security may be more appropriate.
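For reference, a -U based rule would look something like this (untested sketch; remember that every request triggers the internal sub-request, which is the expensive part):

Code:
# Forbid any request whose URL the server would not actually serve
RewriteEngine On
RewriteCond %{REQUEST_URI} !-U
RewriteRule .* - [F]

And a mod_security filter looks roughly like this (1.x-era syntax, so check it against your installed version):

Code:
# Reject requests containing a directory-traversal attempt
SecFilterEngine On
SecFilter "\.\./"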
 
Old 12-11-2005, 11:45 PM   #5
ANU16
LQ Newbie
 
Registered: Dec 2005
Location: Hyderabad, India
Distribution: Red Hat
Posts: 5

Rep: Reputation: 0
Thanks Boss,
I was also thinking of mod_security... just got confirmation from you... Thanks once again...

Regards,
ANU
 
  

