
When I use scrapy shell on PyPy, it throws an exception. What does this error mean? Here is the error output:

    % /usr/local/share/pypy/scrapy shell http://www.baidu.com             
    zsh: correct 'shell' to 'shells' [nyae]? n
    2012-11-09 16:40:06+0800 [scrapy] INFO: Scrapy 0.16.1 started (bot: scrapybot)
    2012-11-09 16:40:06+0800 [scrapy] DEBUG: Enabled extensions: TelnetConsole, WebService, CloseSpider, CoreStats, SpiderState
    2012-11-09 16:40:06+0800 [scrapy] DEBUG: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, RedirectMiddleware, CookiesMiddleware, HttpCompressionMiddleware, ChunkedTransferMiddleware, DownloaderStats
    2012-11-09 16:40:06+0800 [scrapy] DEBUG: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
    2012-11-09 16:40:06+0800 [scrapy] DEBUG: Enabled item pipelines: 
    2012-11-09 16:40:06+0800 [scrapy] ERROR: Error caught on signal handler: <bound method instance.start_listening of <scrapy.telnet.TelnetConsole instance at 0x00000001063f0bc0>>
        Traceback (most recent call last):
          File "/usr/local/Cellar/pypy/1.9/site-packages/twisted/internet/defer.py", line 1045, in _inlineCallbacks
            result = g.send(result)
          File "/usr/local/Cellar/pypy/1.9/site-packages/scrapy/core/engine.py", line 75, in start
            yield self.signals.send_catch_log_deferred(signal=signals.engine_started)
          File "/usr/local/Cellar/pypy/1.9/site-packages/scrapy/signalmanager.py", line 23, in send_catch_log_deferred
            return signal.send_catch_log_deferred(*a, **kw)
          File "/usr/local/Cellar/pypy/1.9/site-packages/scrapy/utils/signal.py", line 53, in send_catch_log_deferred
            *arguments, **named)
        --- <exception caught here> ---
          File "/usr/local/Cellar/pypy/1.9/site-packages/twisted/internet/defer.py", line 134, in maybeDeferred
            result = f(*args, **kw)
          File "/usr/local/Cellar/pypy/1.9/site-packages/scrapy/xlib/pydispatch/robustapply.py", line 47, in robustApply
            return receiver(*arguments, **named)
        exceptions.TypeError: start_listening() got 2 unexpected keyword arguments
royisme

1 Answer

As far as I can see, Scrapy uses lxml, and only a very recent lxml works on PyPy (and only on PyPy trunk, not PyPy 1.9). I would suggest trying that instead.
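
As a minimal sketch (the `pypy` and `python` command names are assumptions; adjust them to the interpreters you are comparing), one way to check which lxml release each interpreter actually imports, rather than relying on `pip freeze`:

    # Print the lxml version each interpreter picks up (interpreter names are illustrative)
    pypy -c "import lxml.etree; print(lxml.etree.__version__)"
    python -c "import lxml.etree; print(lxml.etree.__version__)"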

fijal
  • Thanks very much, but Scrapy works fine when I run it under CPython. I checked pip freeze, and both PyPy and Python use lxml 3.0.1. But I will try it. – royisme Nov 09 '12 at 16:17