This question follows up on my earlier post, Scrapy encounters DEBUG: Crawled (400).
I'm not sure whether these WARNING messages are blocking my crawler.
I'm using Python 3.7.7 on macOS 10.13 with Scrapy 2.1.0 and scrapy-user-agents 0.1.1.
I've already added DOWNLOADER_MIDDLEWARES and DEFAULT_REQUEST_HEADERS to my settings:
DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,
    'scrapy_user_agents.middlewares.RandomUserAgentMiddleware': 400,
}

DEFAULT_REQUEST_HEADERS = {
    'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.116 Safari/537.36',
}
I ran this command in the terminal and got the list of available Scrapy objects, which suggests that Scrapy and the middleware are working:
scrapy shell 'https://dictionary.cambridge.org/us/dictionary/english/grammar'
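If it helps, a quick sanity check I can run inside that shell session is to look at the User-Agent header that actually went out; this is just the shell's built-in request object, nothing specific to scrapy-user-agents:

>>> request.headers.get('User-Agent')  # shows the User-Agent the middleware set on this request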
At the same time, the log showed several warnings:
2020-07-01 13:27:48 [scrapy_user_agents.user_agent_picker] WARNING: [UnsupportedBrowserType] Family: Android
2020-07-01 13:27:49 [scrapy_user_agents.user_agent_picker] WARNING: [UnsupportedDeviceType] Family: Other, Brand: None, Model: None
...
2020-07-01 13:27:49 [scrapy_user_agents.user_agent_picker] WARNING: [UnsupportedDeviceType] Family: LG Web0S SmartTV, Brand: LG, Model: Web0S SmartTV
...
2020-07-01 13:27:51 [scrapy_user_agents.user_agent_picker] WARNING: [UnsupportedBrowserType] Family: SMTBot
2020-07-01 13:27:51 [scrapy_user_agents.user_agent_picker] WARNING: [UnsupportedBrowserType] Family: PhantomJS
...
What do these WARNINGs mean?
If they are not fatal, how do I silence them?
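For reference, the only workaround I can think of is to raise the log level of that specific logger (assuming its name is exactly the scrapy_user_agents.user_agent_picker shown in the log lines above), but I don't know whether simply hiding the messages is the right fix:

import logging

# the logger name is taken from the warning lines above;
# setting it to ERROR hides the UnsupportedBrowserType / UnsupportedDeviceType warnings
logging.getLogger('scrapy_user_agents.user_agent_picker').setLevel(logging.ERROR)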