
When attempting to deploy my project to scrapyd using the following command:

scrapyd-deploy test2 -p NOAA

or, when attempting to run my spider that I created:

curl http://localhost:6800/schedule.json -d project=test -d spider=myspider

I get the following output:

Packing version 1478644677
Deploying to project "test" in http://localhost:6800/addversion.json
Server response (200):
{"status": "error", "message": "Use \"scrapy\" to see available commands", "node_name": "osboxes"}

The following files appear empty when I check them:

/var/log/scrapyd/scrapyd.log 
/var/log/scrapyd/scrapyd.out
/var/log/scrapyd/scrapyd.err

I referenced this answer: on deploying egg file in scrapyd server then {"status": "error", "message": "IndexError: list index out of range"}

But that is a different error; he had an index problem.

With my error, I think something is wrong with the spider itself? It appears that the project deploys, but it can't start a spider.
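
If it helps, a quick local sanity check (assuming scrapy itself is on the PATH; the path below is just a placeholder for the directory containing scrapy.cfg) is:

cd /path/to/NOAA   # the directory that holds scrapy.cfg
scrapy list        # should print the spider names if the project loads cleanly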

I installed scrapyd as instructed. This is my scrapy.cfg file:

[settings]
default = NOAA.settings

[deploy:test2] 
url = http://localhost:6800/ 
project = NOAA
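
For reference, the deployed state can also be checked against the standard scrapyd endpoints (NOAA is the project name from the scrapy.cfg above):

curl http://localhost:6800/listprojects.json
curl "http://localhost:6800/listspiders.json?project=NOAA"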

Running Ubuntu 14.04.

  • I use: cd myprojectdir/myspider/ && sudo scrapyd-deploy default -p NOAA – Dan-Dev Nov 08 '16 at 22:57
  • Even when I navigate to my spider folder and run that command, I get the same error. I did change the [deploy:test2] in my scrapy.cfg to just [deploy] so that the default will work. Same error. – Slug Nov 08 '16 at 23:02
  • My scrapy.cfg file looks like this but on 5 lines: [settings] default = myspider.settings [deploy] url = http://localhost:6800/ project = myspider – Dan-Dev Nov 08 '16 at 23:21
  • You are correct. Sorry about that. My scrapy.cfg file is similar. I edited my original question to have that information. – Slug Nov 08 '16 at 23:27

1 Answer


Found the solution!

My /etc/scrapyd/conf.d/000-default (the configuration file) was empty for some reason...
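
A quick way to confirm this, assuming the same package layout, is to look at the file directly:

ls -l /etc/scrapyd/conf.d/
cat /etc/scrapyd/conf.d/000-default   # zero bytes in my case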

1) I filled the file with this information from: http://scrapyd.readthedocs.io/en/stable/config.html#config

[scrapyd]
eggs_dir    = eggs
logs_dir    = logs
items_dir   =
jobs_to_keep = 5
dbs_dir     = dbs
max_proc    = 0
max_proc_per_cpu = 4
finished_to_keep = 100
poll_interval = 5
http_port   = 6800
debug       = off
runner      = scrapyd.runner
application = scrapyd.app.application
launcher    = scrapyd.launcher.Launcher

[services]
schedule.json     = scrapyd.webservice.Schedule
cancel.json       = scrapyd.webservice.Cancel
addversion.json   = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json  = scrapyd.webservice.ListSpiders
delproject.json   = scrapyd.webservice.DeleteProject
delversion.json   = scrapyd.webservice.DeleteVersion
listjobs.json     = scrapyd.webservice.ListJobs

2) Restarted the service by pressing Ctrl+C in the terminal window running scrapyd, then typing the command scrapyd again.
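
In shell terms (assuming scrapyd was running in the foreground of that terminal):

# Ctrl+C stops the running instance; starting it again picks up the new config
scrapyd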

3) Re-ran the previously stated command (sudo is important):

sudo scrapyd-deploy test -p NOAA

4) Success!

{"status": "ok", "project": "NOAA", "version": "1478702964", "spiders": 3, "node_name": "osboxes"}