
The site that I'm trying to scrape uses JavaScript to create a cookie. What I was thinking is that I could create the cookie in Python myself and then use it to scrape the site. However, I don't know of any way to do that. Does anybody have any ideas?

John Jiang
  • Won't this fail in general, because a site may embed a digital signature in the cookie to verify it came from the site? – Tom Jul 13 '09 at 02:25
  • It uses JavaScript to create the cookie, so I know how it's created. – John Jiang Jul 13 '09 at 02:31

2 Answers


Please see Python httplib2 - Handling Cookies in HTTP Form Posts for an example of adding a cookie to a request.

I often need to automate tasks in web-based applications, and I like to do this at the protocol level by simulating a real user's interactions over HTTP. Python comes with two built-in modules for this: urllib (a higher-level web interface) and httplib (a lower-level HTTP interface).
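To make this concrete, here is a minimal sketch of attaching a hand-built cookie to a request. The answer names the Python 2 modules; in Python 3 they became urllib.request and http.client, which is what this sketch uses. The URL and cookie values are placeholders, not anything from the question.

```python
import urllib.request

# Build the request and attach the cookie by hand, reproducing whatever
# value the site's JavaScript would have computed. Placeholder values.
req = urllib.request.Request("http://example.com/page")
req.add_header("Cookie", "session_token=abc123; tracking_id=xyz")

# The header now travels with the request.
print(req.get_header("Cookie"))
# urllib.request.urlopen(req) would then send it; omitted here so the
# sketch has no network dependency.
```

The same idea works with http.client by passing the `Cookie` header in the `headers` dict of `HTTPConnection.request`.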

Andrew Hare

If you want to do more involved browser emulation (including setting cookies), take a look at mechanize. Its simulation capabilities are almost complete (no JavaScript support, unfortunately); I've used it to build several scrapers with much success.
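mechanize shares its cookie handling with cookielib (http.cookiejar in Python 3), so one way to set a cookie is to pre-seed a jar with a cookie you construct by hand and give the jar to the browser. A sketch using only the standard-library jar, with placeholder cookie values:

```python
import http.cookiejar

# Build the cookie the site's JavaScript would have created. The Cookie
# constructor takes the full attribute list explicitly; values here are
# placeholders for illustration.
cookie = http.cookiejar.Cookie(
    version=0, name="session_token", value="abc123",
    port=None, port_specified=False,
    domain="example.com", domain_specified=True, domain_initial_dot=False,
    path="/", path_specified=True,
    secure=False, expires=None, discard=True,
    comment=None, comment_url=None, rest={},
)

jar = http.cookiejar.CookieJar()
jar.set_cookie(cookie)

print([c.name for c in jar])  # the jar now holds our hand-made cookie
# With mechanize you would then do:
#   br = mechanize.Browser()
#   br.set_cookiejar(jar)
# and the cookie is sent on matching requests.
```

The jar then attaches the cookie to any request whose domain and path match, just as if the site had set it.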

jkp
  • There are a few remote-control browser solutions. I like Selenium, particularly since I can run it in a virtual X framebuffer (screenshots still work just fine). I don't know much about the others, though. – Anders Eurenius Jul 13 '09 at 07:34
  • mechanize is not a browser automator; it emulates a browser at the level of HTTP requests and responses. – jkp Jul 13 '09 at 10:10
  • This looks interesting; I'll look into it. – John Jiang Jul 14 '09 at 18:30