
I made changes to the spider to use some methods of the Scrapinghub API and tried re-deploying it to Scrapy Cloud using `shub deploy`. I'm getting an error: ImportError: No module named scrapinghub

It points to the import line in the spider:

from scrapinghub import Connection

shub version: 2.5.0, scrapinghub: 1.9.0

I'm able to run the spider locally.

Any ideas what the problem is?

  • are you using any kind of virtual environment? – eLRuLL Dec 28 '16 at 16:08
  • yes, I'm using virtualenv – Zaky Dec 28 '16 at 17:17
  • check that `shub` is inside your environment, check the path when using `which shub` in terminal – eLRuLL Dec 28 '16 at 17:18
  • shub and scrapinghub are inside my environment. I'm able to run the spider locally using the current virtual environment. – Zaky Dec 28 '16 at 17:58
  • 1
    try `shub deploy-reqs` if you have a `requirements.txt` file ready. – eLRuLL Dec 28 '16 at 18:04
  • Adding requirements.txt helped, but I got a warning from Scrapy Cloud about deploying that way. So I kept the requirements.txt but deployed with the standard `shub deploy` command, and that worked. It's actually strange, since according to the docs scrapinghub is part of the hworker stack. Thank you! – Zaky Dec 28 '16 at 18:20

1 Answer


Adding a requirements.txt file with

# Scrapinghub
scrapinghub==1.9.0

and updating scrapinghub.yml solved the issue. Thanks to Carlos Peña.
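For reference, a minimal sketch of what the updated scrapinghub.yml might look like. The project ID 12345 is a placeholder, and the exact key for pointing at the requirements file can differ between shub versions, so check `shub --help` / the shub docs for your version:

```yaml
# scrapinghub.yml (sketch; 12345 is a placeholder project ID)
projects:
  default: 12345
# Tell shub which requirements file to install in Scrapy Cloud
# on deploy, so the scrapinghub package is available to the spider.
requirements:
  file: requirements.txt
```

With this in place, a plain `shub deploy` should install the packages listed in requirements.txt alongside the spider.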
