I have a small Scrapy extension which looks into the stats object of a crawler and sends me an email if the crawler has logged messages of certain levels (e.g. WARNING, ERROR, CRITICAL).
These stats are accessible via the crawler's stats object (crawler.stats.get_stats()), e.g.:
crawler.stats.get_stats().items()
[..]
'log_count/DEBUG': 9,
'log_count/ERROR': 2,
'log_count/INFO': 4,
[..]
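To illustrate what the extension does with these entries, here is a minimal, hedged sketch of the filtering step: it takes a stats dict like the one above and returns the counts for the log levels I care about. The function name and the level tuple are my own choices, not part of Scrapy's API.

```python
def counts_for_levels(stats, levels=("WARNING", "ERROR", "CRITICAL")):
    """Pick the log_count/* entries for the given levels out of a
    Scrapy stats dict, keeping only levels that actually occurred."""
    return {
        level: stats["log_count/%s" % level]
        for level in levels
        if stats.get("log_count/%s" % level, 0) > 0
    }

# Example with the stats shown above:
stats = {
    "log_count/DEBUG": 9,
    "log_count/ERROR": 2,
    "log_count/INFO": 4,
}
print(counts_for_levels(stats))  # → {'ERROR': 2}
```

If the returned dict is non-empty, the extension sends the email. This logic works locally because the `log_count/*` keys are present in `crawler.stats.get_stats()` there.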
If I run the spider on Scrapinghub, the log stats are not there. There are a lot of other entries (e.g. exception count, etc.), but the log counts are missing. Does anyone know how to get them there, or how to access them on Scrapinghub?
I've also checked the "Dumping Scrapy stats" output after the spider closes: if I run it on my machine the log counts are there; if I run it on Scrapinghub they are missing.