
I'm using Crawlera's sample code for a GET request through the proxy:

import requests

url = "http://httpbin.org/ip"
proxy_host = "proxy.crawlera.com"
proxy_port = "8010"
proxy_auth = "<APIKEY>:" # Make sure to include ':' at the end
proxies = {
    "https": "https://{}@{}:{}/".format(proxy_auth, proxy_host, proxy_port),
    "http": "http://{}@{}:{}/".format(proxy_auth, proxy_host, proxy_port)
}

r = requests.get(url, proxies=proxies, verify=False)

I get a 407 Bad Proxy Auth error. I've triple-checked that the API key is correct.

Response headers:

{
   'Proxy-Connection': 'close',
   'Proxy-Authenticate': 'Basic realm="Crawlera"',
   'Transfer-Encoding': 'chunked',
   'Connection': 'close',
   'Date': 'Mon, 26 Mar 2018 11:18:05 GMT',
   'X-Crawlera-Error': 'bad_proxy_auth',
   'X-Crawlera-Version': '1.32.0-07c786'
}
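
For reference, the same error code can be read straight off the response object from the snippet above:

print(r.status_code)                      # 407
print(r.headers.get("X-Crawlera-Error"))  # 'bad_proxy_auth'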

Requests is already updated.

$ pip freeze |grep requests
requests==2.8.1
Joseph D.
2 Answers


I managed to get it working by adding a `Proxy-Authorization` header and removing the credentials from the proxy URLs:

import base64

proxy_auth = "<APIKEY>:"
headers = {
    # other headers ...
    "Proxy-Authorization": 'Basic ' + base64.b64encode(proxy_auth)
}

proxies = {
    "https": "https://{}:{}/".format(proxy_host, proxy_port),
    "http": "http://{}:{}/".format(proxy_host, proxy_port)
}

r = requests.get(url, headers=headers, proxies=proxies, verify=False)
Joseph D.
    Doesn't work, gives the error message `TypeError: a bytes-like object is required, not 'str'` for the `base64.b64encode(proxy_auth)` code bit – robertspierre Jun 08 '20 at 15:24
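
On Python 3, `base64.b64encode` expects bytes and returns bytes, so the header construction above needs an encode/decode step. A minimal sketch of the same idea for Python 3:

import base64

proxy_auth = "<APIKEY>:"
headers = {
    # other headers ...
    # encode the credentials to bytes for b64encode, then decode the result back to a str header value
    "Proxy-Authorization": "Basic " + base64.b64encode(proxy_auth.encode()).decode()
}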

If you want to keep using Crawlera's sample script as-is, you can try upgrading your requests client:

pip install requests --upgrade

I had the same problem, and the Proxy-Authorization workaround above did work, but after searching further I found this solution:

Upgrading the requests client to 2.19... worked for me, and I could keep using the Crawlera sample script.
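
To confirm which version is actually in use after the upgrade, a quick check:

import requests
print(requests.__version__)  # expecting 2.19 or later after the upgrade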

Raad A.