I installed Chromium, its driver, and Selenium in a Colab notebook so I could run scrapers:
# install chromium, its driver, and selenium
!apt update
!apt install chromium-chromedriver
!pip install selenium
# set options to run headless, etc.
from selenium import webdriver
options = webdriver.ChromeOptions()
options.add_argument('--headless')
options.add_argument('--no-sandbox')
options.add_argument('--disable-dev-shm-usage')
# open it, go to a website, and get results
wd = webdriver.Chrome(options=options)
wd.get("https://www.website.com")
print(wd.page_source) # results
This worked fine a week ago, but now I'm getting this error:

Message: Service chromedriver unexpectedly exited. Status code was: 1
WebDriverException                        Traceback (most recent call last)
<ipython-input-1-b4e39b015b78> in <module>
     10 options.add_argument('--disable-dev-shm-usage')
     11 # open it, go to a website, and get results
---> 12 wd = webdriver.Chrome(options=options)
     13 wd.get("https://www.website.com")
     14 print(wd.page_source) # results

3 frames
/usr/local/lib/python3.8/dist-packages/selenium/webdriver/common/service.py in assert_process_still_running(self)
    117         return_code = self.process.poll()
    118         if return_code:
--> 119             raise WebDriverException(f"Service {self.path} unexpectedly exited. Status code was: {return_code}")
    120
    121     def is_connectable(self) -> bool:

WebDriverException: Message: Service chromedriver unexpectedly exited. Status code was: 1
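Since the traceback only says chromedriver exited with status 1, I also checked whether the browser and driver binaries are actually on PATH (the executable names below are my assumption for a Debian/Ubuntu-based Colab image; a missing or broken install is one plausible cause of this exit code):

```python
import shutil

def locate(executables):
    # map each executable name to its resolved PATH location, or None if absent
    return {name: shutil.which(name) for name in executables}

# binary names assumed for the Colab image; adjust if yours differ
print(locate(["chromium-browser", "chromedriver"]))
```

If either entry prints None, the apt install didn't put a usable binary on PATH, even though the install commands themselves succeed.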