check spamhaus lasso

Marc Powell marc at ena.com
Fri Jan 6 16:30:38 CET 2006



> -----Original Message-----
> From: nagios-users-admin at lists.sourceforge.net [mailto:nagios-users-
> admin at lists.sourceforge.net] On Behalf Of Greg Martin
> Sent: Friday, January 06, 2006 9:06 AM
> To: nagios-users at lists.sourceforge.net
> Subject: [Nagios-users] check spamhaus lasso
> 
> I would like to use check_http or a similar program to query the
> Spamhaus Lasso tool and report whether there are any open issues
> (this would benefit a lot of ISPs/HSPs)...
> 
> Using aol for a generic example:  www.spamhaus.org
> -u /sbl/listings.lasso?isp=aol.com
> 
> If there are no SBL issues you receive the following text:
> 
> "There are no current SBL listings for aol.com <http://aol.com> "
> 
> If there are, you receive the following text:
> 
> "Found x SBL listings for IPs under the responsibility of aol.com"
> 
> 
> 
> So I would imagine my regular check command would be:
> 
> ./check_http -H www.spamhaus.org -u /sbl/listings.lasso?isp=aol.com -f follow -s no current SBL
> 
> or
> 
> 
> ./check_http -H www.spamhaus.org -u /sbl/listings.lasso?isp=aol.com -f follow -s "no current SBL"
> 
> 
> But neither seems to work...
> 
> 
> ./check_http -H www.spamhaus.org -u /sbl/listings.lasso?isp=aol.com -f follow -s no
> 
> works
> 
> but not:
> 
> ./check_http -H www.spamhaus.org -u /sbl/listings.lasso?isp=aol.com -f follow -s current
> 
> 
> I am at a loss now. Can anyone make a suggestion on what I'm doing
> wrong?


It's not you directly. check_http isn't seeing the same page you see in
your browser, because Spamhaus uses cookies to track database queries
(to limit automated checks, I presume). You have already accepted the
cookie in your browser, so you no longer see that page. check_http,
however, does not support cookies, and therefore sees the 'cookie
acceptance' page they present instead of the page you're expecting. You
can see this yourself using wget or the LWP GET alias, if you have that
installed --

$ GET "http://www.spamhaus.org/sbl/listings.lasso?isp=aol.com"

<html>

        <head>
                <title>The Spamhaus Project - Security</title>
                <link rel="stylesheet" href="../styles/sh.css"
type="text/css">

[chop]

                                        <td><span class="body">Spamhaus
uses Cookies to track database queries. We do not collect any
information from you, and any tests our servers perform are purely for
site security only. To accept our security measures, please press
"YES, I Agree".</span></td>

[chop]

It looks like wget can load a cookie from a pre-existing file, so you
_may_ be able to write a wrapper that fetches the page and parses it for
the text you want. That's not very efficient, though, and you may be
using the service in a manner that is undesirable to them. My suggestion
would be to contact them, propose what you want to do, and see if they
have an alternative way of accessing the information.
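If you do go the wrapper route, a rough sketch might look like the
following. Caveats: this is untested against the live service; it
assumes the cookie-acceptance flow works by simply replaying a saved
session cookie (which may not be the case), and the matched strings are
taken from the responses quoted above. The function names and cookie
file path are made up for illustration. wget's --save-cookies,
--load-cookies and --keep-session-cookies options are standard.

```shell
#!/bin/sh
# Sketch of a Nagios-style wrapper for the Spamhaus Lasso lookup.

# Map a page body to a Nagios state using the response strings
# quoted earlier in this thread.
classify() {
    case "$1" in
        *"no current SBL"*) echo "OK"; return 0 ;;
        *"SBL listings"*)   echo "CRITICAL"; return 2 ;;
        *)                  echo "UNKNOWN"; return 3 ;;
    esac
}

check_sbl() {
    isp="$1"
    cookies="/tmp/spamhaus_cookies.txt"   # hypothetical location
    url="http://www.spamhaus.org/sbl/listings.lasso?isp=${isp}"

    # First request: store the session cookie (assumes this is enough
    # to get past the acceptance page -- unverified).
    wget -q --save-cookies "$cookies" --keep-session-cookies \
        -O /dev/null "$url"

    # Second request: replay the cookie and capture the listing page.
    page=$(wget -q --load-cookies "$cookies" -O - "$url")
    classify "$page"
}
```

The exit codes (0/2/3) follow the usual Nagios OK/CRITICAL/UNKNOWN
convention, so the script could be dropped in as a check command.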

--
Marc


_______________________________________________
Nagios-users mailing list
Nagios-users at lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/nagios-users
::: Please include Nagios version, plugin version (-v) and OS when reporting any issue. 
::: Messages without supporting info will risk being sent to /dev/null




