
Consider the following "works for me" Curl command:

curl http://192.168.2.131:6800/schedule.json -d project=a -d spider=b

I have no idea how to execute this POST in Paw. The curl importer in Paw transforms it into a single body parameter, project=a&spider=b, which the curl exporter in turn translates to:

curl -X "POST" "http://192.168.2.131:6800/schedule.json" \
  -H "Content-Type: text/plain" \
  -d "project=a&spider=b"

Unfortunately, the server accepts neither the Paw request nor the exported curl command; it needs the two separate -d parameters. In case you are wondering what accepts these commands: it is the Scrapy daemon, scrapyd.

So the question is: how can I send two (or more) -d parameters in Paw?

Pullie

1 Answer


If you set separate -d parameters with curl, you'll automatically instruct curl to add the correct Content-Type: application/x-www-form-urlencoded header. In the second "combined" curl command that doesn't work for you, you explicitly set the content type to plain text, which I assume is not what scrapyd is expecting.
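For comparison, here is a sketch of the exported command with only the content type corrected (untested against your scrapyd instance, but application/x-www-form-urlencoded is the header curl sends implicitly for -d, and the combined body is exactly what the two separate -d options produce):

curl -X "POST" "http://192.168.2.131:6800/schedule.json" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "project=a&spider=b"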

In Paw, make sure you use Form URL-encoded and add both parameters separately in the "Body" construction panel.

(screenshot: Paw's "Body" panel with Form URL-Encoded selected and the project and spider parameters added separately)

Ivo Janssen
  • Darn. You are absolutely right. It works. Curl magic has once more got me. – Pullie Apr 03 '16 at 16:55
  • Thanks for reporting this, Pullie, and thanks for providing the right workaround, Ivo. After some tests, I see that our curl importer was wrong, as it converts -d/--data curl options into plain text in Paw, when it should directly set a "Form URL-Encoded" body. I've made a ticket on our repo so we will fix that asap https://github.com/luckymarmot/Paw-cURLImporter/issues/15 – Micha Mazaheri Apr 03 '16 at 17:37
  • @MichaMazaheri: Great that it will be fixed! – Pullie Apr 03 '16 at 18:53