42

I'm building an "API API": basically a wrapper for an in-house REST web service that the web app will be making a lot of requests to. Some of the web service calls need to be GET rather than POST, but they still need to pass parameters.

Is there a "best practice" way to encode a dictionary into a query string? e.g.: ?foo=bar&bla=blah

I'm looking at the urllib2 docs, and it looks like it decides by itself whether to use POST or GET based on whether you pass params or not, but maybe someone knows how to make it turn the params dictionary into a GET request instead.
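
In other words, something along these lines is what I'm trying to avoid hand-rolling everywhere (the example.com endpoint and the params are just placeholders):

import urllib
import urllib2

# urllib2 only POSTs when you pass a data argument, so for a GET the
# parameters have to be encoded into the URL by hand.
params = {'foo': 'bar', 'bla': 'blah'}  # placeholder parameters
url = 'http://example.com/api?' + urllib.urlencode(params)  # placeholder endpoint
response = urllib2.urlopen(url)  # no data argument, so this is a GET
print response.read()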

Maybe there's a package out there for something like this? It would be great if it supported keep-alive, since the web server will be constantly requesting things from the REST service.

Ideally something that would also transform the XML into some kind of traversable Python object.

Thanks!

adamJLev

3 Answers

34

Is urllib.urlencode() not enough?

>>> import urllib
>>> urllib.urlencode({'foo': 'bar', 'bla': 'blah'})
'foo=bar&bla=blah'

EDIT:

You can also update an existing URL:

  >>> import urlparse, urllib
  >>> url_dict = urlparse.parse_qs('a=b&c=d')
  >>> url_dict
  {'a': ['b'], 'c': ['d']}
  >>> url_dict['a'].append('x')
  >>> url_dict
  {'a': ['b', 'x'], 'c': ['d']}
  >>> urllib.urlencode(url_dict, True)
  'a=b&a=x&c=d'

Note that the parse_qs function lived in the cgi module before Python 2.6.
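
If you need to support older interpreters, a small compatibility shim along these lines (just a sketch) keeps the code above working:

# parse_qs moved from cgi to urlparse in Python 2.6; fall back if needed.
try:
    from urlparse import parse_qs
except ImportError:  # Python < 2.6
    from cgi import parse_qs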

EDIT 23/04/2012:

You can also take a look at python-requests - it should kill urllibs eventually :)
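
For example, the same GET-with-parameters looks roughly like this with requests, and a Session gives you the keep-alive behaviour mentioned in the question (the endpoint is again a placeholder):

import requests

session = requests.Session()  # a Session reuses connections (keep-alive)
response = session.get('http://example.com/api',
                       params={'foo': 'bar', 'bla': 'blah'})
print response.url   # e.g. http://example.com/api?foo=bar&bla=blah
print response.text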

Tomasz Zieliński
  • I was hoping there would be something smart that knows how to add variables to a URL that already has some query string params, e.g. given a URL like `example.com?foo=bar`. If you do the string appending yourself, you have to worry about not duplicating the question mark and all that. But I guess that's left as an exercise for the developer (see the sketch after these comments). – adamJLev May 12 '10 at 13:47
  • +1 for mentioning python-requests. Thank you, you just saved my day!! Requests is super awesome, so much better than the complicated urllib and urllib2. I hope everyone stops using urllib and urllib2 in favour of requests... – theQuestionMan May 19 '21 at 01:37
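
Regarding the first comment above: one way to add parameters to a URL that may already carry a query string is to round-trip it through urlparse. A rough sketch (Python 2), not a polished utility:

import urllib
import urlparse

def add_params(url, extra):
    # Parse the URL, merge the new params into whatever query is already
    # there, then put the pieces back together.
    parts = list(urlparse.urlparse(url))
    query = urlparse.parse_qs(parts[4])
    for key, value in extra.items():
        query.setdefault(key, []).append(value)
    parts[4] = urllib.urlencode(query, True)  # doseq=True keeps repeated keys
    return urlparse.urlunparse(parts)

print add_params('http://example.com?foo=bar', {'bla': 'blah'})
# e.g. http://example.com?foo=bar&bla=blah
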
18

urllib.urlencode

And yes, the urllib / urllib2 division of labor is a little confusing in Python 2.x.

msw
  • Yes it is... do you have any suggestions on what to use for the XML parsing? There seem to be a variety of built-in packages for XML; I'm not sure which one is most appropriate. Lightweight wins in this case. Thx – adamJLev May 11 '10 at 22:48
  • I've only ever used BeautifulStoneSoup for parsing XML because it just works (once you learn the model). http://www.crummy.com/software/BeautifulSoup/ – msw May 11 '10 at 22:50
  • There is minidom, but from my limited experience I can say that lxml is the way to go (a short lxml sketch follows these comments) – Tomasz Zieliński May 11 '10 at 22:51
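
To make the lxml suggestion concrete, here is a minimal sketch of traversing a response; the XML is invented, and the standard library's xml.etree.ElementTree exposes much the same interface if lightweight matters most:

from lxml import etree

# Invented XML payload, just to show the traversal.
xml = """<items>
  <item id="1"><name>foo</name></item>
  <item id="2"><name>bar</name></item>
</items>"""

root = etree.fromstring(xml)
for item in root.findall('item'):
    print item.get('id'), item.findtext('name')
# 1 foo
# 2 bar
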
2
import urllib

# Encode the parameters and append them to the base URL as a query string.
data = {"val1": "VALUE1", "val2": "VALUE2", "val3": "VALUE3"}
url_values = urllib.urlencode(data)
url = "https://www.python.org"
print url + "?" + url_values

url_values is the URL-encoded form of the dictionary; appended to the URL after a "?" (url + "?" + url_values), it becomes the query string sent to the server.

roschach
  • You should try to explain how this is an answer. If it truly answers the question, how does it do so? What did you do here that makes it valid and resolves the issue? – Brandon Buck Apr 16 '15 at 05:13