I'm building a scraper on Google Colab using Selenium, but it has stopped working. It worked in the past, and I don't know what changed. The code is:
# dependencies
!pip install selenium fake-useragent
!apt-get update
!apt-get install -y chromium-chromedriver
from selenium import webdriver
from fake_useragent import UserAgent
# chromedriver options
ua = UserAgent()
user_agent = ua.random
chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument('--headless')
chrome_options.add_argument('--no-sandbox')
chrome_options.add_argument('--disable-dev-shm-usage')
# the user-agent flag must be added BEFORE the driver is created
chrome_options.add_argument(f'--user-agent={user_agent}')
driver = webdriver.Chrome(options=chrome_options)
When I run this code, Colab shows the following error:
"Message: Service chromedriver unexpectedly exited. Status code was: 1"
Does anyone know a solution? I've looked for an answer in other related topics, but nothing has worked for me.
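Since "Status code was: 1" means the chromedriver binary itself exited immediately on launch, before Selenium could even connect to it, a useful first step is to check whether the binary is on the PATH and can start at all. A minimal diagnostic sketch (the `chromedriver_status` helper is my own hypothetical name, not part of Selenium):

```python
import shutil
import subprocess

def chromedriver_status():
    """Return chromedriver's version string, or None if the binary
    is missing or fails to launch (the 'status code 1' situation)."""
    # The Ubuntu package installs the binary under
    # /usr/lib/chromium-browser/ and symlinks it into /usr/bin.
    path = shutil.which("chromedriver")
    if path is None:
        return None
    try:
        result = subprocess.run([path, "--version"],
                                capture_output=True, text=True, timeout=10)
    except OSError:
        return None
    # A non-zero exit here means the binary cannot start on its own,
    # which is exactly what Selenium reports as "Status code was: 1".
    return result.stdout.strip() if result.returncode == 0 else None

print(chromedriver_status())
```

If this prints `None` even though the apt install reported success, the problem is the Chromium/chromedriver installation on the Colab image rather than the Selenium code itself; on newer Ubuntu images the `chromium-chromedriver` package is a transitional package backed by snap, which often does not run inside Colab.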
Run Selenium on Google Colab and scrape organic results