I'm trying to use scrapy to deploy my crawler project to a Scrapyd instance, but running the deploy command returns the following error:
Server response (200): {"status": "error", "message": "AttributeError: 'NoneType' object has no attribute 'module_name'"}
Here's my setup.py, which builds the Python egg submitted during deploy:
from setuptools import setup, find_packages
setup(
    name='mycrawler',
    version='0.1',
    packages=find_packages(),
    install_requires=[
        'scrapy',
        'PyMongo',
        'simplejson',
        'queue'
    ]
)
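My guess (an assumption on my part, not something I've confirmed) is that one of these requirements doesn't resolve as an installed distribution on the Scrapyd host. In particular, `queue` is a standard-library module in Python 3, not a PyPI package, so pkg_resources has no distribution metadata for it. A quick sketch to check which entries pkg_resources can actually resolve:

```python
import importlib.util
import pkg_resources

# The same list as install_requires in setup.py
requirements = ['scrapy', 'PyMongo', 'simplejson', 'queue']

for req in requirements:
    try:
        dist = pkg_resources.get_distribution(req)
        print(f"{req}: found distribution {dist.project_name} {dist.version}")
    except pkg_resources.DistributionNotFound:
        # Importable as a module (e.g. stdlib) is not the same thing as
        # being an installed distribution with egg-info/metadata.
        importable = importlib.util.find_spec(req) is not None
        print(f"{req}: no distribution metadata (importable as module: {importable})")
```

On my machine, `queue` is the only entry that is importable yet has no distribution metadata, which is why I suspect the egg's declared dependencies.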
My scrapy.cfg:
[settings]
default = mycrawler.settings

[deploy:scrapyd_home_vm]
url = http://192.168.1.2:6800/
project = mycrawler

[deploy:scrapyd_local_vm]
url = http://192.168.38.131:6800/
project = mycrawler
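In case it's relevant: scrapy.cfg is plain INI, so I sanity-checked the deploy targets with configparser (just a sketch for inspecting the file, not part of scrapy's own deploy machinery):

```python
import configparser

# Inline copy of the scrapy.cfg shown above
cfg = """\
[settings]
default = mycrawler.settings

[deploy:scrapyd_home_vm]
url = http://192.168.1.2:6800/
project = mycrawler

[deploy:scrapyd_local_vm]
url = http://192.168.38.131:6800/
project = mycrawler
"""

parser = configparser.ConfigParser()
parser.read_string(cfg)

# Collect every [deploy:*] section and its settings
targets = {
    section.split(":", 1)[1]: dict(parser[section])
    for section in parser.sections()
    if section.startswith("deploy:")
}
print(targets)
```

Both targets parse cleanly with the expected url and project values, so the config itself looks fine to me.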
I get the feeling this has to do with the way the egg is being built, but I'm not sure. I know Python raises an AttributeError like this when you access an attribute on something that should be an object but is actually None. Nothing in my own code defines or references a module_name attribute. Running the crawler locally with scrapy works just fine; only deploying the egg fails.