
I have tried sending headers, cookies, form data and a raw body, but I keep getting 401 and 500 status codes. On this site the first page is fetched with GET and returns an HTML response, while further pages are fetched with POST and return JSON. These status codes suggest an authorization failure, but I searched the page and its headers and couldn't find any CSRF token or auth token.

import scrapy
from SouthShore.items import Product
from scrapy.http import Request, FormRequest

class OjcommerceDSpider(scrapy.Spider):
    handle_httpstatus_list = [401,500]
    name = "ojcommerce_d"
    allowed_domains = ["ojcommerce.com"]
    #start_urls = ['http://www.ojcommerce.com/search?k=south%20shore%20furniture']


    def start_requests(self):
        return [FormRequest('http://www.ojcommerce.com/ajax/search.aspx/FetchDataforPaging',
                        method ="POST",
                        body = '''{"searchTitle" : "south shore furniture","pageIndex" : '2',"sortBy":"1"}''',
                        headers={'Content-Type': 'application/json; charset=UTF-8', 'Accept' : 'application/json, text/javascript, */*; q=0.01',
                                 'Cookie' :'''vid=eAZZP6XwbmybjpTWQCLS+g==;
                                              _ga=GA1.2.1154881264.1480509732;
                                              ASP.NET_SessionId=rkklowbpaxzpp50btpira1yp'''},callback=self.parse)]

    def parse(self,response):
        with open("ojcommerce.json","wb") as f:
            f.write(response.body)
Vimal Annamalai

1 Answer


I got it working with the following code:

import json

from scrapy import Request, Spider


class OjcommerceDSpider(Spider):
    name = "ojcommerce"
    allowed_domains = ["ojcommerce.com"]
    custom_settings = {
        'LOG_LEVEL': 'DEBUG',
        'COOKIES_DEBUG': True,
        'DEFAULT_REQUEST_HEADERS': {
            'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.82 Safari/537.36',
        },
    }

    def start_requests(self):
        yield Request(
            url='http://www.ojcommerce.com/search?k=furniture',
            callback=self.parse_search_page,
        )

    def parse_search_page(self, response):
        yield Request(
            url='http://www.ojcommerce.com/ajax/search.aspx/FetchDataforPaging',
            method='POST',
            body=json.dumps({'searchTitle': 'furniture', 'pageIndex': '2', 'sortBy': '1'}),
            callback=self.parse_json_page,
            headers={
                'Content-Type': 'application/json; charset=UTF-8',
                'Accept': 'application/json, text/javascript, */*; q=0.01',
                'X-Requested-With': 'XMLHttpRequest',
            },
        )

    def parse_json_page(self, response):
        data = json.loads(response.body)
        # Open in text mode: json.dump writes str, not bytes, on Python 3
        with open('ojcommerce.json', 'w') as f:
            json.dump(data, f, indent=4)

Two observations:

  • a previous request to another page of the site is needed to get a "fresh" ASP.NET_SessionId cookie
  • I couldn't make it work with FormRequest; use Request instead.
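One more aside (my own observation, not part of the original fix): the body in the question is built by hand as `{"pageIndex" : '2', ...}`, and single quotes are not valid JSON, so a strict server-side parser would reject that payload before authorization is even considered. Building the body with `json.dumps`, as the working code does, avoids this class of error. A minimal sketch:

```python
import json

# The question's hand-written payload: '2' uses single quotes,
# which is not valid JSON.
raw_body = '''{"searchTitle" : "south shore furniture","pageIndex" : '2',"sortBy":"1"}'''
try:
    json.loads(raw_body)
    print("parsed")
except json.JSONDecodeError:
    print("invalid JSON")  # this branch is taken

# Serializing a dict with json.dumps always yields valid JSON.
safe_body = json.dumps(
    {'searchTitle': 'south shore furniture', 'pageIndex': '2', 'sortBy': '1'}
)
print(json.loads(safe_body)['pageIndex'])  # round-trips cleanly: '2'
```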
elacuesta