
My `scrapy crawl` command works well, but when I try to deploy with scrapyd I run into problems:

scrapyd-deploy <target> -p <project>

I tried it on my Mac and on a remote server (CentOS), but got the same error on both:

Deploying to project "start" in http://localhost:6800/addversion.json
Server response (200):
{"status": "error", "message": "ImportError:  No module named project.models ", "node_name": "MacBook-Air.local"}

I think it's because scrapyd can't find the Django project path.

I use Django==1.7.10 and Scrapy==1.0.3.

Here is my structure:

mysite
├── manage.py
├── project
│   ├── __init__.py
│   ├── models.py
│   ├── tests.py
│   └── views.py
├── mysite
│   ├── __init__.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
└── scrapypjt
    └── things
        ├── scrapy.cfg
        ├── setup.py
        └── things
            ├── __init__.py
            ├── settings.py
            ├── items.py
            ├── pipelines.py
            └── spiders

Here is my Scrapy settings file:

import sys, os

# Walk up three levels from this settings.py to reach the Django project root
django_path = os.path.join(os.path.dirname(__file__), "../../../")
sys.path.append(os.path.abspath(django_path))
os.environ['DJANGO_SETTINGS_MODULE'] = 'mysite.settings'
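For reference, here is how that relative path resolves, as a standalone sanity check. The `/srv/mysite` prefix below is a hypothetical install location, not from the question; the point is that the result depends entirely on where `settings.py` actually lives, which changes when scrapyd unpacks the project into an egg:

```python
import os

# Simulate how the relative path in the settings file resolves,
# assuming settings.py lives at mysite/scrapypjt/things/things/settings.py
# under a hypothetical /srv prefix.
settings_file = '/srv/mysite/scrapypjt/things/things/settings.py'
django_path = os.path.join(os.path.dirname(settings_file), '../../../')
print(os.path.abspath(django_path))  # -> /srv/mysite
```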

What else do I need to set?


1 Answer


Go to your Scrapy project's settings.py and, after the imports, add your full Django project path: sys.path.append('Your full django project PATH')

For example mine was this one: sys.path.append('/home/sserb/scrapy-project/django_project')
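A minimal sketch of the resulting Scrapy settings.py, assuming the Django project sits at the hypothetical path /home/user/mysite (replace it with your own absolute path). Note that since Django 1.7 the app registry must also be initialized with django.setup() before any models are imported:

```python
import sys, os

# Hypothetical absolute path to the Django project root -- replace with yours.
DJANGO_PROJECT_PATH = '/home/user/mysite'
if DJANGO_PROJECT_PATH not in sys.path:
    sys.path.append(DJANGO_PROJECT_PATH)

os.environ['DJANGO_SETTINGS_MODULE'] = 'mysite.settings'

# On Django 1.7+, initialize the app registry before importing models,
# e.g. in the pipeline/spider module that does `from project.models import ...`:
#   import django
#   django.setup()
```

The key difference from the question's version is that the path is absolute, so it still resolves correctly when scrapyd runs the project from its own working directory instead of the source tree.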