
When trying to execute this command:

scrapyd-deploy test -p project=myProject

I get the following error:

Traceback (most recent call last):
      File "/usr/bin/scrapyd-deploy", line 269, in <module>
        main()
      File "/usr/bin/scrapyd-deploy", line 95, in main
        egg, tmpdir = _build_egg()
      File "/usr/bin/scrapyd-deploy", line 236, in _build_egg
        retry_on_eintr(check_call, [sys.executable, 'setup.py', 'clean', '-a', 'bdist_egg', '-d', d], stdout=o, stderr=e)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/python.py", line 331, in retry_on_eintr
        return function(*args, **kw)
      File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
        raise CalledProcessError(retcode, cmd)
    subprocess.CalledProcessError: Command '['/usr/bin/python', 'setup.py', 'clean', '-a', 'bdist_egg', '-d', '/tmp/scrapydeploy-wV3h4k']' returned non-zero exit status 1

I have installed scrapyd-deploy and scrapyd-client, as well as setuptools and, of course, scrapyd.

Running this command directly:

python setup.py clean -a bdist_egg

produces the following output:

running clean
removing 'build/lib.linux-x86_64-2.7' (and everything under it)
removing 'build/bdist.linux-x86_64' (and everything under it)
'build/scripts-2.7' does not exist -- can't clean it
removing 'build'
running bdist_egg
running egg_info
writing NOAA.egg-info/PKG-INFO
writing top-level names to NOAA.egg-info/top_level.txt
writing dependency_links to NOAA.egg-info/dependency_links.txt
writing entry points to NOAA.egg-info/entry_points.txt
reading manifest file 'NOAA.egg-info/SOURCES.txt'
writing manifest file 'NOAA.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib.linux-x86_64-2.7
creating build/lib.linux-x86_64-2.7/NOAA
copying NOAA/pipelines.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/__init__.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/noaa-template.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/noaa-original.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/settings-template.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/tika-python.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/settings.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/mysqldb.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/noaa.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/items.py -> build/lib.linux-x86_64-2.7/NOAA
creating build/lib.linux-x86_64-2.7/NOAA/spiders
copying NOAA/spiders/spider1.py -> build/lib.linux-x86_64-2.7/NOAA/spiders
copying NOAA/spiders/spider2.py -> build/lib.linux-x86_64-2.7/NOAA/spiders
(output omitted... There are a lot of spiders)

creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/NOAA
copying build/lib.linux-x86_64-2.7/NOAA/pipelines.py -> build/bdist.linux-x86_64/egg/NOAA
creating build/bdist.linux-x86_64/egg/NOAA/spiders
copying build/lib.linux-x86_64-2.7/NOAA/spiders/spider1.py -> build/bdist.linux-x86_64/egg/NOAA/spiders
copying build/lib.linux-x86_64-2.7/NOAA/spiders/spider2.py -> build/bdist.linux-x86_64/egg/NOAA/spiders
(output omitted... There are a lot of spiders)


byte-compiling build/bdist.linux-x86_64/egg/NOAA/pipelines.py to pipelines.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/spiders/spider1.py to spider1.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/spiders/spider2.py to spider2.pyc
(output omitted... There are a lot of spiders)


byte-compiling build/bdist.linux-x86_64/egg/NOAA/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/noaa-template.py to noaa-template.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/noaa-original.py to noaa-original.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/settings-template.py to settings-template.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/tika-python.py to tika-python.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/settings.py to settings.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/mysqldb.py to mysqldb.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/noaa.py to noaa.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/items.py to items.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/entry_points.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
creating 'dist/NOAA-1.0-py2.7.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
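
Judging by the stdout=o, stderr=e arguments in the traceback, scrapyd-deploy redirects the build's output, so whatever setup.py actually complains about never reaches the console when it fails through scrapyd-deploy. A rough sketch to replay the same call from the project root and dump the captured stderr (the retry-on-EINTR handling is omitted, and the temp-directory prefix just mirrors the /tmp/scrapydeploy-... path from the traceback):

import subprocess
import sys
import tempfile

# Replay the check_call from the scrapyd-deploy traceback, run from the
# project root (the directory holding scrapy.cfg and setup.py).
tmpdir = tempfile.mkdtemp(prefix='scrapydeploy-')
cmd = [sys.executable, 'setup.py', 'clean', '-a', 'bdist_egg', '-d', tmpdir]
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()
print('exit status: %d' % proc.returncode)
print(err)  # the part scrapyd-deploy captures instead of showing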

When trying to schedule a spider with this command:

curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider1.py

I get this error:

Traceback (most recent call last):
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            req.requestReceived(command, path, version)
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            self.process()
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            self.render(resrc)
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            body = resrc.render(self)
        --- <exception caught here> ---
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/web
            return JsonResource.render(self, txrequest)
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/uti
            r = resource.Resource.render(self, txrequest)
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            return m(request)
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/web
            spiders = get_spider_list(project)
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/uti
            runner = Config().get('runner')
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/con
            self.cp.read(sources)
          File "/usr/lib/python2.7/ConfigParser.py", line 305, in
            self._read(fp, filename)
          File "/usr/lib/python2.7/ConfigParser.py", line 512, in
            raise MissingSectionHeaderError(fpname, lineno, line)
        ConfigParser.MissingSectionHeaderError: File contains no s
        file: /etc/scrapyd/conf.d/twistd.pid, line: 1
        '24262'

Funny thing is, if I add what I believe to be a section header ([section header]) to that twistd.pid file, I get an error saying that the file contains something other than the numerical pid of the twistd process.
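
For what it's worth, that ConfigParser error is easy to reproduce on any readable file whose first line is a bare value rather than a [section] header, which is exactly what a twistd.pid file looks like (the /tmp path below is just an illustration):

import ConfigParser

# Fake pid file: one bare number, no [section] header, just like twistd.pid.
with open('/tmp/fake-twistd.pid', 'w') as f:
    f.write('24262\n')

cp = ConfigParser.SafeConfigParser()
# read() silently skips files it cannot open, but a readable file with no
# section header raises MissingSectionHeaderError -- the same error scrapyd
# hits when it tries to parse /etc/scrapyd/conf.d/twistd.pid as a config file.
cp.read(['/tmp/fake-twistd.pid'])

So the parser appears to be choking on a stray pid file that ended up in scrapyd's conf.d directory, rather than on one of the actual config files.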

Are these issues interrelated?

Slug
  • What do you get as console output when you run `python setup.py clean -a bdist_egg` directly? – paul trmbrth Nov 22 '16 at 09:30
  • OK, I edited my post to include the output of "python setup.py clean -a bdist_egg". – Slug Nov 22 '16 at 14:44
  • So it seems to work. I don't know why scrapyd-deploy has trouble, then. – paul trmbrth Nov 22 '16 at 14:46
  • Yeah, weird. I just added some more content: when I try to post a spider, it tells me I need section headers in the twistd.pid file, and if I add them, it screams at me. Super confusing. – Slug Nov 22 '16 at 14:56

1 Answer


I was getting the same error; what worked for me was running the command with sudo.

The error might be because the command is not being run with sufficient permissions.

sudo scrapyd-deploy test -p project=myProject
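
If sudo does the trick, one possible explanation is that a previous root-owned run left behind build artifacts the regular user can no longer overwrite. A quick, hypothetical check from the project root (the directory names are taken from the question's build output):

import os
import pwd

# Show who owns the files and directories that bdist_egg needs to rewrite.
for path in ('scrapy.cfg', 'setup.py', 'build', 'dist', 'NOAA.egg-info'):
    if os.path.exists(path):
        owner = pwd.getpwuid(os.stat(path).st_uid).pw_name
        print('%-15s owned by %s' % (path, owner))
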
Nandesh