Old 09-19-2014, 08:37 AM   #1
khuongdp
LQ Newbie
 
Registered: Aug 2005
Posts: 16

Rep: Reputation: 0
scan for non-secure contents on https pages


Does anybody know of a script or tool that can scan a specific https page for non-secure content? It should be callable from the command line. It would be good if it supported recursion.

Last edited by khuongdp; 09-19-2014 at 08:40 AM.
 
Old 09-19-2014, 10:56 AM   #2
vmccord
Member
 
Registered: Jun 2012
Location: Topeka, KS
Distribution: Mostly AWS
Posts: 71
Blog Entries: 31

Rep: Reputation: Disabled
In what context? Pages that you serve or pages that are being served to you?
 
Old 09-19-2014, 01:30 PM   #3
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
https://www.rfxn.com/lmd-1-4-1-deliv...your-requests/ for 'local' content?
 
Old 09-19-2014, 03:22 PM   #4
khuongdp
LQ Newbie
 
Registered: Aug 2005
Posts: 16

Original Poster
Rep: Reputation: 0
Before our site goes to production we want to verify that it does not contain non-secure content when running over https. The script/tool should be run by our monitoring tool (Sensu), which would execute it e.g. every minute. Something like the Nagios check_http plugin, e.g.

check_non-secure_contents.sh -h https://somesite.com -w 1 -c 2
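
A minimal sketch of what such a check's command-line wrapper could look like, assuming the -h/-w/-c options from the example above and the usual Nagios/Sensu exit-code convention (0 OK, 1 WARNING, 2 CRITICAL, 3 UNKNOWN). How the count of insecure references is produced is left as a placeholder here (see the wget/grep idea further down the thread):

#!/bin/bash
# check_non-secure_contents.sh -- illustrative skeleton only.
# Parses -h (URL), -w (warning threshold), -c (critical threshold) and
# maps a count of insecure references onto Nagios/Sensu exit codes.

URL=""; WARN=1; CRIT=2
while getopts "h:w:c:" opt; do
    case "$opt" in
        h) URL="$OPTARG" ;;
        w) WARN="$OPTARG" ;;
        c) CRIT="$OPTARG" ;;
    esac
done
[ -z "$URL" ] && { echo "UNKNOWN: no URL given"; exit 3; }

# Placeholder: replace with whatever scan produces the number of
# non-secure references found on $URL.
COUNT=0

if [ "$COUNT" -ge "$CRIT" ]; then
    echo "CRITICAL: $COUNT insecure references on $URL"; exit 2
elif [ "$COUNT" -ge "$WARN" ]; then
    echo "WARNING: $COUNT insecure references on $URL"; exit 1
fi
echo "OK: no insecure references on $URL"; exit 0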
 
Old 09-19-2014, 07:58 PM   #5
jefro
Moderator
 
Registered: Mar 2008
Posts: 22,001

Rep: Reputation: 3629
Don't browsers warn of potential issues on an https page?
 
Old 09-20-2014, 12:51 PM   #6
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
A Nagios plugin for this doesn't exist as far as I know. I'd say use a spider; you could even 'wget' recursively, then grep the stored files for any 'http://' strings.
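
A rough sketch of that approach (somesite.com is the placeholder host from the earlier post; the depth limit is an assumption to keep the crawl short):

# Mirror the site over https into a scratch directory, then search
# the saved files for explicit http:// references.
wget --recursive --level=2 --no-verbose --directory-prefix=/tmp/sitecheck https://somesite.com
grep -rIn 'http://' /tmp/sitecheck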
 
Old 09-21-2014, 02:26 PM   #7
khuongdp
LQ Newbie
 
Registered: Aug 2005
Posts: 16

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by jefro View Post
Don't browsers warn of potential issues on an https page?
Yes, they do. But I need an automatic way to check the site and report back to me when it contains non-secure content. I don't want to open a browser 100 times a day, and with recursive pages it would be a nightmare.
 
Old 09-21-2014, 02:33 PM   #8
khuongdp
LQ Newbie
 
Registered: Aug 2005
Posts: 16

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by unSpawn View Post
A Nagios plugin for this doesn't exist as far as I know. I'd say use a spider; you could even 'wget' recursively, then grep the stored files for any 'http://' strings.
That looks like the best solution right now: using the --spider option for wget.
 
Old 09-22-2014, 01:52 PM   #9
vmccord
Member
 
Registered: Jun 2012
Location: Topeka, KS
Distribution: Mostly AWS
Posts: 71
Blog Entries: 31

Rep: Reputation: Disabled
Not to make it more complicated, but grepping for 'http://' isn't enough, because one can serve (not that one should) embedded content without explicitly specifying the protocol. For example, an iframe with src='domain.com/some-content.html'.
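
One way to widen the net, sketched under the assumption that the pages were already mirrored to /tmp/sitecheck as in the earlier wget example: pull every src/href attribute out of the saved files and drop the ones that already use https, so scheme-less and http:// references are left over for review.

# Extract all src= and href= attributes, then filter out the ones that
# explicitly use https:// -- whatever remains needs a manual look.
grep -rIohE "(src|href)=[\"'][^\"']*" /tmp/sitecheck | grep -vE "=[\"']https://" | sort -u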
 
Old 09-22-2014, 03:31 PM   #10
khuongdp
LQ Newbie
 
Registered: Aug 2005
Posts: 16

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by vmccord View Post
Not to make it more complicated, but grepping for 'http://' isn't enough, because one can serve (not that one should) embedded content without explicitly specifying the protocol. For example, an iframe with src='domain.com/some-content.html'.
You are right. Is it possible to start a firefox/chrome session with a URL as an argument and then somehow extract the console information about non-secure content?
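
Headless browser automation can get at exactly that console output. A rough sketch with a Chrome build that supports headless mode (somesite.com is again the placeholder host; the --enable-logging route and the exact 'Mixed Content' wording of the warning are assumptions to verify against the browser version in use):

# Load the page headlessly, send the browser's console/log output to a
# file, then look for the mixed-content warnings the DevTools console shows.
google-chrome --headless --disable-gpu --enable-logging=stderr --v=1 \
    --dump-dom https://somesite.com > /dev/null 2> /tmp/console.log
grep -i 'mixed content' /tmp/console.log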
 
  

