I'm trying to use Perl's Mojo::UserAgent to crawl a website, but the server checks whether the browser supports JavaScript. Is it possible to claim (i.e. lie) that my client supports JavaScript?

Now I have only this code:

use strict;
use warnings;

use Mojo::DOM;
use Mojo::UserAgent;

my $ua = Mojo::UserAgent->new;
# Set the User-Agent header to look like a desktop Firefox
$ua->transactor->name('Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:21.0) Gecko/20100101 Firefox/21.0');
my $tx = $ua->post('http://...' => form => {
    login_username => '...',
    login_password => '...',
});

Is it possible to do this? If not, does there exist any other Perl-based user agent that can?

netdjw
  • Odds are that the server is doing no such check. It probably just returns a webpage which says you don't support JS and some JS which gets rid of that message. The reason you see the message is because you aren't parsing it. – Quentin Sep 21 '15 at 15:58
  • @Quentin So there is no way to see the same content that I see in my desktop Google Chrome or Firefox? – netdjw Sep 21 '15 at 16:00

1 Answer

There are two basic approaches to scraping data from a website that depends on JavaScript.

  1. Reverse engineer the website. Figure out what the JavaScript is doing, then replicate that functionality in your own code (see the first sketch after this list).
  2. Drive a web browser which supports JavaScript (e.g. via Selenium or PhantomJS, both of which have CPAN modules) and extract the data from the browser (see the second sketch after this list).
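
For the first approach, here is a minimal sketch with Mojo::UserAgent. It assumes the login form carries hidden fields (a token, for example) that the page's JavaScript would normally submit along with the credentials; the field names and the hidden-field assumption are illustrative, not details from the question:

use strict;
use warnings;
use Mojo::UserAgent;

my $ua = Mojo::UserAgent->new;
$ua->transactor->name('Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:21.0) Gecko/20100101 Firefox/21.0');

# Fetch the login page first, as a real browser would
my $dom = $ua->get('http://...')->res->dom;

# Copy every hidden field so the POST matches what the JavaScript would send
my %form = (login_username => '...', login_password => '...');
$dom->find('form input[type=hidden]')->each(sub {
    $form{ $_->attr('name') } = $_->attr('value') // '';
});

my $tx = $ua->post('http://...' => form => \%form);
print $tx->res->body;

If the site signs or builds the request entirely in JavaScript, you will need to read that JavaScript (or watch the request in the browser's network inspector) to see exactly what to replicate.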
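
For the second approach, here is a sketch using Selenium::Remote::Driver from CPAN. It assumes a Selenium server is already running on localhost:4444 with Firefox available, and it reuses the field names from the question:

use strict;
use warnings;
use Selenium::Remote::Driver;

# Assumes a Selenium server is listening on localhost:4444
my $driver = Selenium::Remote::Driver->new(
    remote_server_addr => 'localhost',
    port               => 4444,
    browser_name       => 'firefox',
);

# Load the login page; the browser runs any JavaScript for us
$driver->get('http://...');

# Fill in and submit the login form
$driver->find_element('login_username', 'name')->send_keys('...');
$driver->find_element('login_password', 'name')->send_keys('...');
$driver->find_element('//form', 'xpath')->submit;

# The page source now reflects whatever the site's JavaScript has done
print $driver->get_page_source;

$driver->quit;

WWW::Mechanize::PhantomJS provides a similar, Mechanize-style interface on top of PhantomJS if you would rather not run a Selenium server.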
Quentin