On Mon, Feb 23, 2009 at 6:36 AM, samira khansha
<samira.khansha at gmail.com> wrote:
> I'm using WWW::Mechanize to download a list of URLs. When I run this
> program and it hits a URL that doesn't exist, it stops fetching the
> rest of the URLs. Would you please tell me how I can solve this problem?
>
> use strict;
> use WWW::Mechanize;
> use File::Basename;
> use HTML::Parser;
> use Crypt::SSLeay;
> #use LWP::Debug qw(+);
> open F,"urlList.txt" or die "I can not open it:$!";
> my $mech = WWW::Mechanize->new( autocheck => 1 );
> my @array = <F>;
> foreach my $link(@array){
> my $url = $link;
> $link =~ s/\///g;
> my $filename = $link;
> $mech->get( $url);
> $mech->save_content( $filename );
>
> print $filename."\n";
> }
I think you've got the wrong list. This is the Ruby mechanize list,
and you want the Perl mechanize list.
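That said, if it helps before you find the right list: in the Perl
WWW::Mechanize API, autocheck => 1 makes get() die on any error, which
is why one bad URL kills the whole loop. An untested sketch of a fix
(assuming the same urlList.txt input as your original script) is to
turn autocheck off and test each fetch yourself:

    use strict;
    use warnings;
    use WWW::Mechanize;

    # autocheck => 0 keeps get() from dying on an error, so one dead
    # URL no longer stops the loop.
    my $mech = WWW::Mechanize->new( autocheck => 0 );

    open my $fh, '<', 'urlList.txt' or die "Cannot open urlList.txt: $!";
    while ( my $url = <$fh> ) {
        chomp $url;
        ( my $filename = $url ) =~ s/\///g;   # slashes are illegal in filenames

        $mech->get( $url );
        unless ( $mech->success ) {           # skip failures, keep going
            warn "Skipping $url: " . $mech->status . "\n";
            next;
        }
        $mech->save_content( $filename );
        print "$filename\n";
    }
    close $fh;

Alternatively, keep autocheck => 1 and wrap each get() in an
eval { ... } block, skipping to the next URL when $@ is set.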
--
Aaron Patterson
http://tenderlovemaking.com/