Good day!
Is there a way to specify the Net::Telnet timeout with WWW::Mechanize::Firefox? At the moment my internet connection is very slow and sometimes I get an error with $mech->get():
command timed-out at /usr/local/share/perl/5.12.3/MozRepl/Client.pm line 186
So I tried:
$mech->repl->repl->timeout(100000);
Unfortunately, it does not work: Can’t locate object method “timeout” via package “MozRepl”
The documentation says this should work:
$mech->repl->repl->setup_client( { extra_client_args => { timeout => 180 } } );
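For context, a minimal sketch of where I would place that call, with the attribute names taken verbatim from the documentation quoted above. Whether calling setup_client() after new() is already too late (the repl connection exists by then) is exactly the part I am unsure about:

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize::Firefox;

my $mech = WWW::Mechanize::Firefox->new();

# Assumption: the extra client args only take effect if setup_client()
# runs before the first command goes over the repl connection.
$mech->repl->repl->setup_client(
    { extra_client_args => { timeout => 180 } }    # seconds
);

$mech->get('http://www.google.com');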
**Problem:** I have a list of 2,500 websites and need to grab a thumbnail screenshot of each of them. How do I do that? I could parse the sites with Perl; Mechanize would be a good fit. Note: I only need the results as thumbnails with a maximum of 240 pixels in the long dimension. At the moment I have a solution which is slow and does not give back thumbnails. How do I make the script run faster, with less overhead, spitting out the thumbnails?
**Prerequisites:** the mozrepl add-on, the module WWW::Mechanize::Firefox, and the module Imager.
What I have tried already, here it is:
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize::Firefox;

my $mech = WWW::Mechanize::Firefox->new();

open( my $input, '<', 'urls.txt' ) or die $!;
while (<$input>) {
    chomp;
    print "$_\n";
    $mech->get($_);
    my $png  = $mech->content_as_png();    # full-size screenshot
    my $name = $_;
    $name =~ s/^www\.//;
    $name .= '.png';
    open( my $output, '>', $name ) or die "Cannot write $name: $!";
    binmode $output;                       # PNG data is binary
    print $output $png;
    close $output;
    sleep 5;                               # be polite between requests
}
close $input;
Besides all this, this little handy script does not care about the size: I only need thumbnails, therefore I need a little module like Imager; a sketch follows below.
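Here is a minimal sketch of the scaling step I have in mind, assuming Imager was built with PNG support; it takes the raw bytes from content_as_png() and constrains the long dimension to 240 pixels:

use Imager;

# Assumption: $png holds the raw PNG bytes from content_as_png(),
# and $name is the output filename from the loop above.
my $img = Imager->new;
$img->read( data => $png, type => 'png' )
    or die "Cannot read screenshot: " . $img->errstr;

# type => 'min' uses the smaller scale factor, so the whole image
# fits into a 240x240 box and the long dimension ends up at 240px.
my $thumb = $img->scale(
    xpixels => 240,
    ypixels => 240,
    type    => 'min',
);

$thumb->write( file => $name, type => 'png' )
    or die "Cannot write thumbnail: " . $thumb->errstr;

Scaling in memory like this also avoids writing the full-size screenshot to disk first, which should cut some of the overhead.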
Putting this aside, I think with that many URLs we have to expect that some will fail, and we should handle that. For example, we can put the failed ones in an array or hash and retry them X times, as in the sketch after this paragraph.
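Something along these lines, where @urls and the helper save_thumbnail() are hypothetical placeholders for the loop body above:

# Sketch of one retry policy: wrap get() in eval {} so the
# "command timed-out" die does not kill the whole run, collect
# the failures, and retry them up to $max_tries times.
my $max_tries = 3;
my @queue     = @urls;    # hypothetical: read from urls.txt

for my $try ( 1 .. $max_tries ) {
    my @failed;
    for my $url (@queue) {
        my $ok = eval { $mech->get($url); 1 };
        if ($ok) {
            save_thumbnail( $mech, $url );    # hypothetical helper
        }
        else {
            warn "try $try failed for $url: $@";
            push @failed, $url;
        }
    }
    @queue = @failed;
    last unless @queue;
}
warn "giving up on: @queue\n" if @queue;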
See the command-line output:
linux-vi17:/home/martin/perl # perl mecha_test_1.pl
www.google.com
www.cnn.com
www.msnbc.com
command timed-out at /usr/lib/perl5/site_perl/5.12.3/MozRepl/Client.pm line 186
linux-vi17:/home/martin/perl #
This is my source … see a snippet (example) of the sites I have in the URL list.
urls.txt (the list of sources):
www.google.com
www.cnn.com
www.msnbc.com
news.bbc.co.uk
www.bing.com
www.yahoo.com
Question: how do I extend the solution so that it does not stop on a timeout, and so that it only stores little thumbnails? Note again: I only need the results as thumbnails with a maximum of 240 pixels in the long dimension. As a prerequisite, I have already installed the module Imager. How do I make the script run faster, with less overhead, spitting out the thumbnails?
**Update:** in addition, there is a Monks thread: perlmonks.org/?node_id=901572
**Again**, the question is: how to work around the timeout? Is there a way to specify the Net::Telnet timeout with WWW::Mechanize::Firefox? At the moment my internet connection is very slow and sometimes I get an error with $mech->get():
command timed-out at /usr/local/share/perl/5.10.1/MozRepl/Client.pm line 186
Perhaps I have to look into the mozrepl timeout configuration!? But after all, this is weird and I don’t know where that timeout comes from. Maybe it really is Firefox timing out, as it is busy synchronously fetching some result.
Well, Net::Telnet belongs to the Perl core, doesn’t it?
$ corelist Net::Telnet
Net::Telnet was not in CORE
If it really is Net::Telnet, then you’ll have to dive down:
$mech->repl->repl->client->{telnet}->timeout($new_timeout);
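In other words, something like this, with the caveat that {telnet} is an undocumented internal of the MozRepl client, so this sketch may break with other versions:

my $mech = WWW::Mechanize::Firefox->new();

# Assumption: MozRepl::Client keeps its Net::Telnet connection in
# $client->{telnet}. Net::Telnet's timeout() itself is documented
# and takes seconds (or undef for no timeout at all).
$mech->repl->repl->client->{telnet}->timeout(600);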
Love to hear from you! Greetings.