Recently, I ran into the same problem. I was behind a proxy server, and because my company had configured the proxy in the browser, everything worked there but not in my script. What I did was pass the proxies to requests.get():
import requests

http_proxy = "proxy_server_url"
https_proxy = "proxy_server_url"  # can be the same URL as the http proxy

# requests expects a dict mapping the URL scheme to the proxy URL
proxies = {"http": http_proxy, "https": https_proxy}

url = "http://example.com"
response = requests.get(url, proxies=proxies)
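If you make many requests, you can also attach the same proxies to a requests.Session once instead of passing them on every call. A minimal sketch (the proxy URLs are placeholders):

import requests

session = requests.Session()
# every request made through this session will use these proxies
session.proxies.update({"http": "proxy_server_url", "https": "proxy_server_url"})

response = session.get("http://example.com")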
This works when you are behind a proxy, but it fails when you are not. To make the same code work in both cases, check the proxy first:
import requests
import urllib.request

def check_proxy(http_proxy, https_proxy):
    is_bad_proxy = False
    try:
        # route a test request through the proxy using urllib
        proxy_handler = urllib.request.ProxyHandler({"http": http_proxy, "https": https_proxy})
        opener = urllib.request.build_opener(proxy_handler)
        opener.addheaders = [('User-agent', 'Mozilla/5.0')]
        urllib.request.install_opener(opener)
        req = urllib.request.Request("https://google.com")  # change the URL to test here
        urllib.request.urlopen(req)
    except Exception:
        # the proxy is unreachable (or you are not behind one)
        is_bad_proxy = True
    if is_bad_proxy:
        proxies = None
    else:
        proxies = {"http": http_proxy, "https": https_proxy}
    return proxies
http_proxy = "proxy_server_url"
https_proxy = "proxy_server_url"

# proxies is None when the proxy check fails, so requests connects directly
proxies = check_proxy(http_proxy, https_proxy)

url = "http://example.com"
response = requests.get(url, proxies=proxies)
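Alternatively, you could do the same check with requests itself instead of urllib, which keeps everything in one library. This is only a minimal sketch: the helper name check_proxy_with_requests is mine, the test URL and timeout are placeholders you may want to change.

import requests

def check_proxy_with_requests(http_proxy, https_proxy, test_url="https://google.com", timeout=5):
    proxies = {"http": http_proxy, "https": https_proxy}
    try:
        # any request exception (connection error, timeout, proxy error) means the proxy is unusable
        requests.get(test_url, proxies=proxies, timeout=timeout)
        return proxies
    except requests.exceptions.RequestException:
        return None

proxies = check_proxy_with_requests("proxy_server_url", "proxy_server_url")
response = requests.get("http://example.com", proxies=proxies)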