Nagios-users Digest, Vol 1, Issue 3212

Stanley.Hopcroft at Dest.gov.au
Thu Jun 8 01:38:31 CEST 2006


Dear Folks,

I am writing to thank you for your letter and say,

>-----Original Message-----
>Message: 1
>Date: Wed, 7 Jun 2006 14:27:36 +0200
>From: Rene Fertig <me at renux.de>
>Subject: Re: [Nagios-users] How to monitor complex websites?
>To: nagios-users at lists.sourceforge.net
>Message-ID: <200606071427.37695.me at renux.de>
>Content-Type: text/plain;  charset="iso-8859-1"
>
>check_http version 1.89 (which comes with nagios-plugins 
>1.4.3) can set a 
>User-Agent-String:
>
> -A, --useragent=STRING
>   String to be sent in http header as "User Agent"
>

  ... snip

>But probably you should make your own plugin if you need 
>special cookie 
>support.
>
>bye, Rene
>

You may want to revisit writing your own, since there's a new
CPAN module, FEAR::API, for fearless programming of web clients.

From http://www.perl.com/lpt/a/2006/06/01/fear-api.html:

'
FEAR::API's documentation says:

FEAR::API is a tool that helps reduce your time creating site scraping
scripts and helps you do it in a much more elegant way. FEAR::API
combines many strong and powerful features from various CPAN modules,
such as LWP::UserAgent, WWW::Mechanize, Template::Extract, Encode,
HTML::Parser, etc., and digests them into a deeper Zen.

(Here's an example that

 fetches CPAN's homepage,
 extracts data with a template,
 processes links using a control structure,
 prints fetched content to STDOUT,
 dumps the links in the page, and
 uses YAML to print the extraction results.
)

It might be best to introduce FEAR::API by rewriting the previous
example:

    use FEAR::API -base;
    url("search.cpan.org");              # start from CPAN's homepage
    fetch >> [                           # process links using a control structure
      qr(foo) => _feedback,
      qr(bar) => \my @link,
      qr()    => sub { 'do something here' }
    ];
    fetch while has_more_links;          # keep fetching while links remain
    extmethod('Template::Extract');      # extract data with a template ...
    extract($template);                  # ... ($template is defined elsewhere)
    print Dumper extresult;              # dump the extraction results
    print document->as_string;           # print the fetched content to STDOUT
    print Dumper \@link;                 # dump the links in the page
    invoke_handler('YAML');              # use YAML to print the extraction results
'

The article compares FEAR::API with the existing standard, WWW::Mechanize.
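
For comparison, a rough WWW::Mechanize sketch of the same fetch-and-dump-links
steps might look like the following (the agent string is made up for
illustration):

    use strict;
    use warnings;
    use WWW::Mechanize;

    # Fetch CPAN's homepage, print the content, and dump every link.
    my $mech = WWW::Mechanize->new( agent => 'site-check/0.1' );
    $mech->get('http://search.cpan.org/');

    print $mech->content;                  # the fetched page
    print $_->url, "\n" for $mech->links;  # every link in the page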

Even if you decide against FEAR::API, the standard Perl HTTP modules can

handle cookies
parse HTML (in particular, extract links)
fill out and submit forms

as sketched below.
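
A minimal sketch along those lines, using LWP::UserAgent with HTTP::Cookies
and HTML::LinkExtor (the URL, form field names, and credentials are invented
for illustration):

    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTTP::Cookies;
    use HTML::LinkExtor;

    # Cookies set by the site are stored and sent back automatically.
    my $ua = LWP::UserAgent->new(
        agent      => 'check_mysite/0.1',
        cookie_jar => HTTP::Cookies->new,
    );

    # "Fill out" a login form by POSTing its fields (names are hypothetical).
    my $resp = $ua->post('http://www.example.com/login',
        { username => 'monitor', password => 'secret' });
    die 'login failed: ', $resp->status_line unless $resp->is_success;

    # Parse the returned HTML and extract the links.
    my @links;
    my $parser = HTML::LinkExtor->new(sub {
        my ($tag, %attr) = @_;
        push @links, $attr{href} if $tag eq 'a' && defined $attr{href};
    });
    $parser->parse($resp->decoded_content);
    print "$_\n" for @links;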

HTH,

Yours sincerely.


_______________________________________________
Nagios-users mailing list
Nagios-users at lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/nagios-users
::: Please include Nagios version, plugin version (-v) and OS when reporting any issue. 
::: Messages without supporting info will risk being sent to /dev/null




