
I have run into an issue in which my Lua script refuses to execute. The response returned from the SplashRequest call appears to be a full HTML body, while I'm expecting just the document title. I am assuming the Lua script is never being called, since it seems to have no apparent effect on the response. I have dug through the documentation and can't quite figure out what is missing here. Does anyone have any suggestions?

from urlparse import urljoin

import scrapy
from scrapy_splash import SplashRequest


GOOGLE_BASE_URL = 'https://www.google.com/'
GOOGLE_QUERY_PARAMETERS = '#q={query}'
GOOGLE_SEARCH_URL = urljoin(GOOGLE_BASE_URL, GOOGLE_QUERY_PARAMETERS)

GOOGLE_SEARCH_QUERY = 'example search query'


LUA_SCRIPT = """
function main(splash)
    assert(splash:go(splash.args.url))
    return splash:evaljs("document.title")
end
"""

SCRAPY_CRAWLER_NAME = 'google_crawler'
SCRAPY_SPLASH_ENDPOINT = 'render.html'
SCRAPY_ARGS = {
    'lua_source': LUA_SCRIPT
}


def get_search_url(query):
    return GOOGLE_SEARCH_URL.format(query=query)


class GoogleCrawler(scrapy.Spider):
    name = SCRAPY_CRAWLER_NAME
    search_url = get_search_url(GOOGLE_SEARCH_QUERY)

    def start_requests(self):

        request = SplashRequest(self.search_url,
            self.parse, endpoint=SCRAPY_SPLASH_ENDPOINT, args=SCRAPY_ARGS)

        yield request


    def parse(self, response):
        doc_title = response.body_as_unicode()
        print doc_title
Gallaecio

2 Answers


The 'endpoint' argument of SplashRequest must be 'execute' in order to run a Lua script; in the example it is 'render.html', which simply returns the rendered page and ignores lua_source. That is why you get an HTML body back instead of the title.
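As a minimal sketch, keeping the constants from the question's code, only the endpoint needs to change:

    def start_requests(self):
        # Sketch only: same request as in the question, but sent to the
        # 'execute' endpoint so that Splash actually runs LUA_SCRIPT.
        yield SplashRequest(
            self.search_url,
            self.parse,
            endpoint='execute',   # 'render.html' ignores lua_source
            args=SCRAPY_ARGS,     # {'lua_source': LUA_SCRIPT}
        )

Because the script returns a plain string, the response body will then be the document title rather than the full page.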

Mikhail Korobov
 LUA_SCRIPT = """
    function main(splash)
      assert(splash:go(splash.args.url))
      return title = splash:evaljs("document.title")
    end
    """

def start_requests(self):
    yield SplashRequest(self.search_url, self.parse,
                        endpoint='execute', args=SCRAPY_ARGS)

You can then recover the value in the parse callback with response.data['title'].
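For instance, a matching callback might look like this (sketch only; response.data is the JSON-decoded result that scrapy-splash exposes when the script returns a table):

    def parse(self, response):
        # response.data holds the decoded JSON returned by main(),
        # so the title is available under the 'title' key.
        title = response.data['title']
        self.logger.info('Page title: %s', title)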