Questions tagged [python-requests]

USE ONLY FOR THE PYTHON REQUESTS LIBRARY. Requests is a full-featured Python HTTP library with an easy-to-use, logical API.

Official web site

Requests is an HTTP library written in Python under the Apache2 license.

It's meant to simplify HTTP/1.1 related tasks and offers a wide range of functionality out of the box. For example, there's no need to manually add query strings to your URLs or to form-encode your POST data. Keep-alive and HTTP connection pooling are 100% automatic. (A short sketch of these basics follows the feature list below.)

Features:

  • International Domains and URLs
  • Keep-Alive & Connection Pooling
  • Sessions with Cookie Persistence
  • Browser-style SSL Verification
  • Automatic Content Decoding
  • Basic/Digest Authentication
  • Elegant Key/Value Cookies
  • Automatic Decompression
  • Unicode Response Bodies
  • Multipart File Uploads
  • Streaming Downloads
  • Connection Timeouts
  • .netrc support
  • Chunked Requests
  • Python 2.6–3.8
  • Thread-safe
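
A minimal sketch of the query-string and form-encoding points above, using httpbin.org as a stand-in endpoint: `params=` builds the query string for you and a dict passed via `data=` is form-encoded automatically.

```python
import requests

# The query string is built for you: https://httpbin.org/get?q=python&page=2
r = requests.get("https://httpbin.org/get", params={"q": "python", "page": 2})
print(r.url)          # final URL, with the encoded query string
print(r.status_code)  # 200 on success

# A dict passed via data= is form-encoded automatically
r = requests.post("https://httpbin.org/post", data={"user": "alice", "lang": "py"})
print(r.json()["form"])  # httpbin echoes the form fields back
```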
21751 questions
336
votes
6 answers

Can I set max_retries for requests.request?

The Python requests module is simple and elegant, but one thing bugs me. It is possible to get a requests.exceptions.ConnectionError with a message like: Max retries exceeded with url: ... This implies that requests can attempt to access the data…
Kirill Zaitsev
  • 4,511
  • 3
  • 21
  • 29
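
One common way to get retries with requests (a sketch, not the only approach) is to mount an HTTPAdapter whose max_retries is set on a Session; example.com and the retry count below are placeholders.

```python
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
# An integer max_retries retries failed *connections* (not bad status codes)
adapter = HTTPAdapter(max_retries=3)
session.mount("http://", adapter)
session.mount("https://", adapter)

try:
    r = session.get("https://example.com", timeout=5)
    r.raise_for_status()
except requests.exceptions.ConnectionError as exc:
    print("Still failing after retries:", exc)
```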
322
votes
4 answers

Using headers with the Python requests library's get method

So I recently stumbled upon this great library for handling HTTP requests in Python; found here http://docs.python-requests.org/en/latest/index.html. I love working with it, but I can't figure out how to add headers to my get requests. Help?
Breedly
  • 12,838
  • 13
  • 59
  • 83
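
For the headers question above, a short sketch: custom headers are passed as a plain dict via the headers= keyword (the GitHub URL is just an example endpoint).

```python
import requests

headers = {
    "User-Agent": "my-app/0.0.1",
    "Accept": "application/json",
}
r = requests.get("https://api.github.com/events", headers=headers)
print(r.status_code)
print(r.request.headers["User-Agent"])  # the headers that were actually sent
```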
321
votes
22 answers

Timeout for python requests.get entire response

I'm gathering statistics on a list of websites and I'm using requests for it for simplicity. Here is my code: data=[] websites=['http://google.com', 'http://bbc.co.uk'] for w in websites: r= requests.get(w, verify=False) data.append( (r.url,…
Kiarash
  • 7,378
  • 10
  • 44
  • 69
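
A hedged sketch for the timeout question: `timeout` bounds the connect and per-read times, not the total download, so it is close to (but not exactly) a cap on the entire response; the website list is taken from the question.

```python
import requests

data = []
websites = ["http://google.com", "http://bbc.co.uk"]

for w in websites:
    try:
        # timeout=(connect, read): 3 s to connect, 10 s between received chunks.
        # Note this is not a hard limit on the *total* response time.
        r = requests.get(w, timeout=(3, 10))
        data.append((r.url, r.status_code))
    except requests.exceptions.Timeout:
        data.append((w, "timed out"))

print(data)
```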
304
votes
13 answers

How to install packages offline?

What's the best way to download a Python package and its dependencies from PyPI for offline installation on another machine? Is there any easy way to do this with pip or easy_install? I'm trying to install the requests library on a FreeBSD box that…
Chris Drantz
  • 3,043
  • 3
  • 14
  • 4
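
For offline installation, one common recipe (assuming a reasonably recent pip; the directory name is arbitrary) is to download the packages on a connected machine and point pip at that directory on the target box.

```bash
# On a machine with internet access:
pip download requests -d ./offline-packages

# Copy ./offline-packages to the offline machine, then:
pip install --no-index --find-links=./offline-packages requests
```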
294
votes
19 answers

Max retries exceeded with URL in requests

I'm trying to get the content of App Store > Business: import requests from lxml import html page = requests.get("https://itunes.apple.com/in/genre/ios-business/id6000?mt=8") tree = html.fromstring(page.text) flist = [] plist = [] for i in…
user3446000
  • 2,985
  • 3
  • 13
  • 9
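
A sketch of one way to soften "Max retries exceeded" errors: give the adapter a urllib3 Retry policy with a back-off, so transient connection failures and throttling responses are retried instead of raising immediately (the URL is the one from the question; the retry numbers are arbitrary).

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retries = Retry(
    total=5,                                     # at most 5 attempts overall
    backoff_factor=0.5,                          # sleep 0.5 s, 1 s, 2 s, ... between attempts
    status_forcelist=[429, 500, 502, 503, 504],  # also retry on these statuses
)
session.mount("https://", HTTPAdapter(max_retries=retries))

page = session.get(
    "https://itunes.apple.com/in/genre/ios-business/id6000?mt=8", timeout=10
)
print(page.status_code)
```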
262
votes
12 answers

Proxies with Python 'Requests' module

Just a short, simple one about the excellent Requests module for Python. I can't seem to find in the documentation what the variable 'proxies' should contain. When I send it a dict with a standard "IP:PORT" value, it rejects it, asking for 2…
user1064306
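
The shape requests expects for `proxies` is a dict mapping URL scheme to a full proxy URL (scheme included); the addresses below are placeholders in the style of the requests documentation.

```python
import requests

proxies = {
    "http": "http://10.10.1.10:3128",
    "https": "http://10.10.1.10:1080",
}
r = requests.get("http://example.org", proxies=proxies)
print(r.status_code)

# For an authenticated proxy, embed the credentials in the URL:
# "http://user:password@10.10.1.10:3128"
```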
256
votes
3 answers

Python Request Post with param data

This is the raw request for an API call: POST http://192.168.3.45:8080/api/v2/event/log?sessionKey=b299d17b896417a7b18f46544d40adb734240cc2&format=json HTTP/1.1 Accept-Encoding: gzip,deflate Content-Type: application/json Content-Length: 86 Host:…
slysid
  • 5,236
  • 7
  • 36
  • 59
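
For the POST-with-params question, a hedged sketch: `params=` becomes the query string, while `json=` serializes the body and sets the Content-Type header. Only the URL shape comes from the question; the key and payload values are placeholders.

```python
import requests

url = "http://192.168.3.45:8080/api/v2/event/log"

r = requests.post(
    url,
    params={"sessionKey": "...", "format": "json"},  # query-string part
    json={"example": "payload"},                     # JSON body (placeholder)
)
print(r.status_code, r.text)
```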
246
votes
9 answers

How to upload file with python requests?

I'm performing a simple task of uploading a file using the Python requests library. I searched Stack Overflow and no one seemed to have the same problem, namely, that the file is not received by the server: import…
scichris
  • 2,617
  • 2
  • 13
  • 7
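
A minimal upload sketch, posting to httpbin.org as a stand-in server: the dict key ("file" here) must match the form field name the receiving server expects.

```python
import requests

url = "https://httpbin.org/post"  # placeholder endpoint

with open("report.xls", "rb") as fh:
    # (filename, file object, content type) - the tuple form is optional
    files = {"file": ("report.xls", fh, "application/vnd.ms-excel")}
    r = requests.post(url, files=files)

print(r.status_code)
print(list(r.json()["files"]))  # httpbin echoes the uploaded field names
```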
237
votes
16 answers

SSL InsecurePlatform error when using Requests package

I'm using Python 2.7.3 and Requests. I installed Requests via pip. I believe it's the latest version. I'm running on Debian Wheezy. I've used Requests lots of times in the past and never faced this issue, but it seems that when making https requests…
Luke Peckham
  • 2,375
  • 2
  • 11
  • 11
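
Context for the InsecurePlatform warning: urllib3 raises it on old Pythons (before 2.7.9) whose ssl module lacks SNI support. The real fix is `pip install "requests[security]"` (pulling in pyOpenSSL, ndg-httpsclient and pyasn1) or a newer Python; if neither is possible, the warning can at least be silenced on such an old setup, as in this sketch.

```python
import requests

# Only silences the warning; the underlying SSL limitation remains.
requests.packages.urllib3.disable_warnings(
    requests.packages.urllib3.exceptions.InsecurePlatformWarning
)

r = requests.get("https://example.com")  # placeholder URL
print(r.status_code)
```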
236
votes
2 answers

Requests -- how to tell if you're getting a 404

I'm using the Requests library and accessing a website to gather data from it with the following code: r = requests.get(url) I want to add error testing for when an improper URL is entered and a 404 error is returned. If I intentionally enter an…
user1427661
  • 11,158
  • 28
  • 90
  • 132
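
Checking for a 404 is a status-code test; a short sketch against a placeholder URL, also showing raise_for_status() for turning 4xx/5xx responses into exceptions.

```python
import requests

r = requests.get("https://httpbin.org/status/404")  # placeholder URL

if r.status_code == 404:
    print("not found")

try:
    r.raise_for_status()  # raises HTTPError for 4xx/5xx statuses
except requests.exceptions.HTTPError as exc:
    print("request failed:", exc)

print(r.ok)  # True only for status codes below 400
```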
223
votes
13 answers

Asynchronous Requests with Python requests

I tried the sample provided within the documentation of the requests library for Python. With async.map(rs), I get the response codes, but I want to get the content of each page requested. This, for example, does not work: out = async.map(rs) print…
trbck
  • 5,187
  • 6
  • 26
  • 29
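
The async module from the old requests docs was split out into the separate grequests package; a dependency-free alternative is to run the blocking calls in a thread pool and read the response bodies, as in this sketch (the URL list is a placeholder).

```python
import concurrent.futures

import requests

urls = ["http://google.com", "http://bbc.co.uk"]  # placeholder list

def fetch(url):
    r = requests.get(url, timeout=10)
    return url, r.status_code, len(r.content)  # r.text / r.content hold the page body

# Each worker thread runs one blocking requests call.
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    for url, status, size in pool.map(fetch, urls):
        print(url, status, size)
```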
199
votes
7 answers

How could I use requests in asyncio?

I want to run HTTP request tasks in parallel with asyncio, but I find that python-requests blocks the event loop of asyncio. I've found aiohttp, but it couldn't make HTTP requests through an HTTP proxy. So I want to know if there's a way…
flyer
  • 9,280
  • 11
  • 46
  • 62
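
Because requests is blocking, inside asyncio it has to be pushed onto a worker thread (run_in_executor, or asyncio.to_thread on Python 3.9+) so the event loop stays free; requests' normal proxies= support still works there. A minimal sketch with placeholder URLs:

```python
import asyncio
import functools

import requests

async def fetch(url):
    loop = asyncio.get_running_loop()
    # Run the blocking call in the default thread pool executor.
    return await loop.run_in_executor(
        None, functools.partial(requests.get, url, timeout=10)
    )

async def main():
    responses = await asyncio.gather(
        fetch("http://google.com"), fetch("http://bbc.co.uk")
    )
    for r in responses:
        print(r.url, r.status_code)

asyncio.run(main())
```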
194
votes
2 answers

What is the difference between 'content' and 'text'?

I am using the terrific Python Requests library. I notice that the fine documentation has many examples of how to do something without explaining the why. For instance, both r.text and r.content are shown as examples of how to get the server…
dotancohen
  • 30,064
  • 36
  • 138
  • 197
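
The short version: r.content is the raw bytes of the body, while r.text is those bytes decoded to a str using r.encoding. A sketch against a placeholder URL:

```python
import requests

r = requests.get("https://httpbin.org/get")  # placeholder URL

print(type(r.content))  # <class 'bytes'> - the body exactly as received
print(type(r.text))     # <class 'str'>  - r.content decoded to text
print(r.encoding)       # codec used for the decoding (from the headers; may be None)

# For JSON responses, r.json() parses the body for you
print(r.json()["url"])
```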
186
votes
3 answers

How to send cookies in a post request with the Python Requests library?

I'm trying to use the Requests library to send cookies with a post request, but I'm not sure how to actually set up the cookies based on its documentation. The script is for use on Wikipedia, and the cookie(s) that need to be sent are of this…
Ricardo Altamirano
  • 14,650
  • 21
  • 72
  • 105
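
Cookies can be passed per request as a plain dict via cookies=, or managed automatically by a Session that persists cookies between requests; the names and values below are placeholders, not the Wikipedia cookies from the question.

```python
import requests

url = "https://httpbin.org/post"  # placeholder endpoint

# One-off: send cookies explicitly with this request
r = requests.post(url, cookies={"session_id": "abc123"}, data={"key": "value"})
print(r.status_code)

# Or let a Session collect cookies from earlier responses and
# resend them automatically on later requests:
s = requests.Session()
s.get("https://httpbin.org/cookies/set?session_id=abc123")
r = s.post(url, data={"key": "value"})
print(s.cookies.get_dict())
```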
185
votes
9 answers

How to use Python requests to fake a browser visit, i.e., generate a User-Agent?

I want to get the content from this website. If I use a browser like Firefox or Chrome, I can get the real website page I want, but if I use the Python requests package (or the wget command) to get it, it returns a totally different HTML page. I…
user1726366
  • 2,256
  • 4
  • 15
  • 17
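
Sending a browser-like User-Agent is usually enough for the simple cases (some sites also check cookies or run JavaScript, which requests alone won't satisfy); the UA string below is only an example.

```python
import requests

headers = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/120.0 Safari/537.36"
    )
}
r = requests.get("https://example.com", headers=headers)  # placeholder URL
print(r.status_code)
print(r.request.headers["User-Agent"])  # confirm what was actually sent
```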