
I am facing a problem with crawler processes dying unexpectedly.

I am using Scrapy 0.14; the problem existed in 0.12 as well.

The scrapyd log shows entries like `Process died: exitstatus=None`. The spider logs don't show any "spider closed" information, and my database status confirms the spiders never finished.

Has anybody else faced a similar situation? How can I trace the reason for these processes vanishing? Any ideas or suggestions?

Umar
  • You should [report this to the devs](https://github.com/scrapy/scrapy/issues) if there is no proper logging for such a case. – DrColossos Apr 13 '12 at 15:25

1 Answer


I think I had a similar situation.

The reason the processes were dying was that the spiders were raising an unhandled exception, which caused the process to stop.
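As a minimal sketch (the spider name, URL, and callback below are hypothetical, and Scrapy 0.14 used `from scrapy.spider import BaseSpider` rather than `scrapy.Spider`), logging the traceback inside the callback before re-raising makes this kind of failure visible in the job log instead of the process just disappearing:

```
import logging
import traceback

import scrapy  # note: Scrapy 0.14 used `from scrapy.spider import BaseSpider`


class ExampleSpider(scrapy.Spider):
    """Hypothetical spider, used only to illustrate logging unhandled errors."""

    name = "example"
    start_urls = ["http://example.com"]

    def parse(self, response):
        try:
            # Real extraction logic would go here; this just yields the URL.
            yield {"url": response.url}
        except Exception:
            # Write the full traceback to the job log before re-raising,
            # so the log records why the process stopped.
            logging.error("Unhandled error in parse():\n%s", traceback.format_exc())
            raise
```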

To find the exception, look at the log files (somewhere in the `.scrapy` folder). For each crawler process it starts, scrapyd creates a log file with the job ID in its name; see the sketch below for one way to search them.
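As a rough sketch (the `logs` directory and its `<project>/<spider>/<jobid>.log` layout are assumptions that depend on the `logs_dir` setting in your scrapyd configuration), you can scan those per-job logs for tracebacks:

```
import glob
import os

# Assumed log location -- adjust to match the `logs_dir` value in your
# scrapyd configuration; scrapyd writes one log per job under
# <logs_dir>/<project>/<spider>/<jobid>.log.
LOG_DIR = "logs"

for path in glob.glob(os.path.join(LOG_DIR, "*", "*", "*.log")):
    with open(path) as f:
        text = f.read()
    if "Traceback (most recent call last)" in text:
        print("Exception found in", path)
```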

warvariuc