I am learning Symfony Crawler and want to implement the following scenario:
- navigate to a certain page
- fill out a form with data
- submit the form
- if the submission redirects you to example.org/error, try submitting the form again with different data; if it redirects to any other page, download that page's content (a rough sketch of the retry branch follows my code below).
My code is quite simple (it runs as a Symfony Command):
use Symfony\Component\BrowserKit\HttpBrowser;
use Symfony\Component\Console\Style\SymfonyStyle;
use Symfony\Component\DomCrawler\Crawler;
use Symfony\Component\HttpClient\HttpClient;

$io = new SymfonyStyle($input, $output);
$url = "https://example.org";

$browser = new HttpBrowser(HttpClient::create());
$crawler = $browser->request('GET', $url);

// Fill the form attached to the "submitForm" button and submit it
$form = $crawler->selectButton('submitForm')->form();
$form['field1'] = 'foo'; // temporary data
$form['field2'] = 'bar';
$form['field3'] = 'baz';
$form['field4'] = '12345';
$crawler = $browser->submit($form);

// Only read the content if we were not redirected to the error page
if ($crawler->getBaseHref() !== "https://example.org/error") {
    $nodeValues = $crawler->filter('.section')->each(function (Crawler $node, $i) {
        return $node->text();
    });
    $io->info("Important message: " . $nodeValues[0]);
}
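The retry-with-different-data branch is not implemented yet; what I have in mind is roughly the sketch below (the alternative values and the attempt limit are just placeholders):

// Rough sketch of the planned retry logic, not yet part of the command above.
// The alternative data set and the attempt limit are placeholders.
$attempts = 1;
$redirectedToError = ($crawler->getUri() === "https://example.org/error");

while ($redirectedToError && $attempts < 3) {
    // Refill the form with a different data set and submit again
    $form['field1'] = 'foo-' . $attempts;
    $form['field2'] = 'bar-' . $attempts;
    $crawler = $browser->submit($form);
    $redirectedToError = ($crawler->getUri() === "https://example.org/error");
    $attempts++;
}

if (!$redirectedToError) {
    // Landed on another page: save its content
    file_put_contents('result.html', $browser->getResponse()->getContent());
}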
After running it, 95% of the time I get the following error message on the command line:
In ErrorChunk.php line 62: OpenSSL SSL_read: Connection was reset, errno 10054 for "https://example.org".
In CurlResponse.php line 337: OpenSSL SSL_read: Connection was reset, errno 10054 for "https://example.org".
In the remaining 5% of cases, however, I get the correct data. Is this some kind of protection against crawlers, or do I need to set something else up?
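If it is a configuration issue, would creating the client with explicit headers make any difference? Something like the following (just a guess on my part, the User-Agent string is arbitrary):

// Just a guess: pass a browser-like User-Agent when creating the client,
// in case the server resets connections from unidentified clients.
$client = HttpClient::create([
    'headers' => [
        'User-Agent' => 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
    ],
]);
$browser = new HttpBrowser($client);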