
I'm currently using Scrapinghub's Scrapy Cloud to host my 12 spiders (across 12 different projects).

I'd like to have one folder of functions shared by all 12 spiders, but I'm not sure how best to implement this without duplicating a functions folder in each spider's project.

I'm considering three options: hosting all spiders under the same project, creating a private package in the cloud that the spiders depend on, or hosting Scrapyd myself so I can reference the shared modules directly. A sketch of the single-project option follows below.
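For the single-project route, the shared code can live in an ordinary package inside the Scrapy project, importable by every spider. A minimal sketch under that assumption; `myproject`, `common`, `clean_price`, and the spider are all hypothetical names for illustration, not from the question:

```python
# Hypothetical single-project layout:
#
# myproject/
#     common/
#         __init__.py
#         parsing.py    # functions shared by every spider
#     spiders/
#         spider_one.py
#         ...
#     settings.py

# myproject/common/parsing.py
def clean_price(raw):
    """Strip currency symbols and separators, returning a float or None."""
    if raw is None:
        return None
    return float(raw.replace("$", "").replace(",", "").strip())


# myproject/spiders/spider_one.py
import scrapy

from myproject.common.parsing import clean_price


class SpiderOne(scrapy.Spider):
    name = "spider_one"
    start_urls = ["https://example.com/products"]

    def parse(self, response):
        # Every spider in the project can import the same helpers.
        for product in response.css("div.product"):
            yield {
                "title": product.css("h2::text").get(),
                "price": clean_price(product.css(".price::text").get()),
            }
```

The trade-off is that all 12 spiders then deploy together as one project, which may or may not fit how the spiders are scheduled and maintained.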

Has anyone run into this, and if so, what was your solution?

And just when one asks the question, one finds the answer - https://support.scrapinghub.com/support/solutions/articles/22000200417-deploying-private-dependencies – Axel Eriksson Apr 02 '18 at 12:17
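For anyone following that route: the linked article covers deploying private dependencies to Scrapy Cloud. The general shape, as a sketch with hypothetical names (the package name and metadata below are illustrative, not from the article), is to package the shared functions with a standard `setup.py`:

```python
# setup.py for a hypothetical shared package; "spider-common" is an
# illustrative name, not from the question or the linked article.
from setuptools import find_packages, setup

setup(
    name="spider-common",
    version="0.1.0",
    description="Functions shared by all twelve spiders",
    packages=find_packages(),
)
```

Each of the 12 projects would then declare that package in the requirements file referenced from its `scrapinghub.yml`, so Scrapy Cloud installs it at deploy time; the exact mechanics are described in the linked article.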

0 Answers