
I'm quite new to docker and I'm trying to create a scraper using Selenium Remote and Cloud Run Services based on this site.

I mainly have three problems (for now):

  1. The scraper is WAY slower compared to running it locally. Is this normal? Is remote Selenium just slower, or am I doing something wrong?

  2. I frequently get an error related to session timeout. How can I increase the session timeout?

  3. Every time the script crashes, I then get this error:

    selenium.common.exceptions.WebDriverException: Message:

    401 Unauthorized

    Your client does not have permission to the requested URL /wd/hub/session

Or I get an error saying that the session with ID xxxxxxxxxxxxxxxx does not exist, even though it shouldn't exist anymore because it previously crashed. Why am I facing these errors, and how can I solve them?
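For context, this is roughly how I deploy the Selenium container (service name, region, and resource settings are placeholders for my actual setup). Regarding question 2, I assumed `SE_NODE_SESSION_TIMEOUT` is the right knob for the session timeout, since the image is based on Selenium Grid, but I'm not sure:

```shell
# Deploying the standard standalone Chrome image to Cloud Run.
# Service name and region are placeholders; SE_NODE_SESSION_TIMEOUT
# (in seconds) is my guess at how to raise the session timeout.
gcloud run deploy selenium-chrome \
  --image selenium/standalone-chrome:latest \
  --region us-central1 \
  --memory 2Gi \
  --set-env-vars SE_NODE_SESSION_TIMEOUT=600
```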

I can post my code if needed, but even with just getting example.com in the browser I face these problems.

