I am trying to learn how Scrapy's FormRequest works on a website. I have the following Scrapy code:
import scrapy
import json
from scrapy.utils.response import open_in_browser

class Test(scrapy.Spider):
    name = 'go2'

    def start_requests(self):
        url = 'http://www.webscrapingfordatascience.com/jsonajax/results2.php'
        payload = {'api_code': 'C123456'}
        yield scrapy.FormRequest(url, formdata=json.dumps(payload), headers={'Content-Type': 'application/json'})
        #yield scrapy.FormRequest(url, formdata=payload)  # dict object not allowed?

    def parse(self, response):
        #print(response.text)
        open_in_browser(response)
I can't seem to get the right response. I first tried passing the dictionary directly, but that didn't work, so I then tested with requests as follows, and both of these attempts work:
import requests
import json

url = 'http://www.webscrapingfordatascience.com/jsonajax/results2.php'
payload = {'api_code': 'C123456'}

res = requests.post(url, json=payload)
res2 = requests.post(url, data=json.dumps(payload))
#res3 = requests.post(url, data=payload)  # doesn't work
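If I understand correctly, the working and failing calls put very different bytes on the wire. Here is a stdlib-only sketch (my own reasoning, using the same payload as above) of the body I believe each style sends:

```python
import json
from urllib.parse import urlencode

payload = {'api_code': 'C123456'}

# data=payload sends a form-encoded body
# (Content-Type: application/x-www-form-urlencoded)
form_body = urlencode(payload)

# json=payload (or data=json.dumps(payload)) sends a JSON body
# (Content-Type: application/json)
json_body = json.dumps(payload)

print(form_body)  # api_code=C123456
print(json_body)  # {"api_code": "C123456"}
```

So my guess is the server only accepts the JSON body, which would explain why res3 fails.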
FormRequest takes key/value pairs, not a string, which is why json.dumps() is throwing an error. My question is: how can I get FormRequest (or any other Scrapy method) to work on this example, i.e. get the same results as requests?
I believe res3 = requests.post(url, data=payload) is equivalent to FormRequest(url, formdata=payload), which is why that version does not work either.