
I am learning Symfony Crawler and want to implement the following scenario:

  1. navigate to a certain page
  2. fill out a form with data
  3. submit the form
  4. if you are redirected to example.org/error after submitting, try submitting the form again with different data; if you land on any other page, download its content.

My code is quite simple (it runs as a Symfony Command):

    $io = new SymfonyStyle($input, $output);

    $url = "https://example.org";
    $browser = new HttpBrowser(HttpClient::create());
    $crawler = $browser->request('GET', $url);
    $form = $crawler->selectButton('submitForm')->form();

    $form['field1'] = 'foo'; //temporary data
    $form['field2'] = 'bar';
    $form['field3'] = 'baz';
    $form['field4'] = '12345';
    $crawler = $browser->submit($form);
    if ($crawler->getBaseHref() !== "https://example.org/error")
    {
        $nodeValues = $crawler->filter('.section')->each(function (Crawler $node, $i) {
            return $node->text();
        });

        $io->info("Important message: ".$nodeValues[0]);

    }
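As an aside, the retry logic from step 4 is not in the snippet yet. A minimal sketch of how it might look, assuming the same `submitForm` button and a hypothetical list of fallback data sets (the field values below are placeholders, not from the original question):

```php
<?php
// Sketch only: assumes symfony/browser-kit, symfony/dom-crawler and
// symfony/http-client are installed via Composer.
use Symfony\Component\BrowserKit\HttpBrowser;
use Symfony\Component\DomCrawler\Crawler;
use Symfony\Component\HttpClient\HttpClient;

$browser = new HttpBrowser(HttpClient::create());
$crawler = $browser->request('GET', 'https://example.org');

// Hypothetical alternative data sets to cycle through on error.
$attempts = [
    ['field1' => 'foo',  'field2' => 'bar',  'field3' => 'baz',  'field4' => '12345'],
    ['field1' => 'foo2', 'field2' => 'bar2', 'field3' => 'baz2', 'field4' => '67890'],
];

foreach ($attempts as $data) {
    $form = $crawler->selectButton('submitForm')->form();
    $crawler = $browser->submit($form, $data);

    // HttpBrowser follows redirects by default, so compare the final URI.
    if ($browser->getRequest()->getUri() !== 'https://example.org/error') {
        $nodeValues = $crawler->filter('.section')->each(
            fn (Crawler $node) => $node->text()
        );
        // Success: use / download the page content here.
        break;
    }

    // Redirected to the error page: reload the form page and retry.
    $crawler = $browser->request('GET', 'https://example.org');
}
```

`AbstractBrowser::submit()` accepts the field values as its second argument, which avoids re-assigning `$form['…']` on every attempt.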

After running it, 95% of the time I get the following error message in commandline:

    In ErrorChunk.php line 62:
      OpenSSL SSL_read: Connection was reset, errno 10054 for "https://example.org".

    In CurlResponse.php line 337:
      OpenSSL SSL_read: Connection was reset, errno 10054 for "https://example.org".

On the other hand, in the remaining 5% of cases I get correct data. Is this some kind of protection against using the crawler? Or do I need to set something else?


1 Answer


"Connection was reset, errno 10054" means the server closed the TCP connection, typically during the TLS handshake, so this is usually not bot protection. You might have an outdated OpenSSL build: try updating the package or compiling a newer version from source. Most servers now require TLS 1.2 or newer, since that is the current standard.
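You can also pin the TLS version and add automatic retries on the client side. A sketch, assuming symfony/http-client is installed (the `crypto_method` option maps to PHP's `STREAM_CRYPTO_METHOD_*` constants and requires a recent HttpClient version; `RetryableHttpClient` has been available since 5.2):

```php
<?php
// Sketch: assumes symfony/http-client and symfony/browser-kit are installed.
use Symfony\Component\BrowserKit\HttpBrowser;
use Symfony\Component\HttpClient\HttpClient;
use Symfony\Component\HttpClient\RetryableHttpClient;

$client = HttpClient::create([
    // Force TLS 1.2 in case the default negotiation fails.
    'crypto_method' => STREAM_CRYPTO_METHOD_TLSv1_2_CLIENT,
]);

// RetryableHttpClient retries failed requests (3 times by default),
// which helps with intermittent connection resets.
$browser = new HttpBrowser(new RetryableHttpClient($client));
```

If pinning TLS 1.2 makes the failures deterministic (always succeed or always fail), that points at a TLS negotiation problem rather than rate limiting.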

Natsuki.HX
  • 21
  • 5