
I am using Fiddler as a proxy and trying to open a connection to an application. Here are the steps:

Proxy Code:

import socket
import ssl
import struct

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

# 3-second receive timeout (struct timeval: seconds, microseconds)
timeval = struct.pack('ll', 3, 0)
s.setsockopt(socket.SOL_SOCKET, socket.SO_RCVTIMEO, timeval)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.connect((ip_addr, port))  # Connect to the proxy

# Ask the proxy to open a tunnel to the target host
request_header = "CONNECT {0}:{1} HTTP/1.1\r\n".format(HOST, 443)
s.send(request_header)

request_header = "Host: {0}:{1}\r\n".format(HOST, 443)
s.send(request_header)

request_header = "Proxy-Connection: keep-alive\r\n"
s.send(request_header)

# A blank line terminates the CONNECT request
request_header = "\r\n"
s.send(request_header)

recvd = s.recv(1024)

Response:

HTTP/1.1 200 Connection Established
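Before going further, it may be worth checking the status line the proxy actually returned instead of assuming success; a minimal sketch, assuming the whole status line arrives in that first recv:

# Fail early if the proxy refused the CONNECT request
status_line = recvd.split("\r\n", 1)[0]
if "200" not in status_line:
    raise RuntimeError("Tunnel not established: " + status_line)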

Now I am using this socket to connect to the application:

Code:

header_format = ('GET /db/cert/im/d HTTP/1.1\r\n'
                 'Host: {0}\r\n'
                 'Content-Type: text/html\r\n'
                 'Accept: <relevant details>\r\n'
                 '\r\n')

# Wrap the tunneled socket in TLS and present the client certificate/key
s_sock = ssl.wrap_socket(s, keyfile="test_cert.key",
                         certfile="test_cert.pem", server_side=False,
                         do_handshake_on_connect=True)
request_header = header_format.format(HOST)
s_sock.write(request_header)
data = s_sock.read()
print data

DATA:

HTTP/1.0 401 Unauthorized
WWW-Authenticate: Basic realm=""
Server: SomeServer
Connection: Keep-Alive
Content-Length: 35

How can I access a site that uses PKI (client-certificate) authentication through a proxy?
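One way this could be combined in a single place is sketched below, assuming Python 2.7, where httplib.HTTPSConnection.set_tunnel() is available (the traceback further down shows Python 2.6, so this is only a sketch); ip_addr, port, and HOST are the same placeholders used above, and the certificate paths are the ones from the question:

import httplib

# Connect to the proxy, then have httplib issue the CONNECT tunnel to the
# target before the TLS handshake, so the client certificate is presented
# to HOST rather than to the proxy.
conn = httplib.HTTPSConnection(ip_addr, port,
                               key_file="test_cert.key",
                               cert_file="test_cert.pem",
                               timeout=300)
conn.set_tunnel(HOST, 443)
conn.request("GET", "/db/cert/im/d", headers={"Accept": "<relevant details>"})
resp = conn.getresponse()
print resp.status, resp.reason
print resp.read()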

UPDATE: Here is the progress I have made so far, trying httplib and urllib2:

import httplib
import urllib2

proxy = urllib2.ProxyHandler({
    'http': ip_port,
    'https': ip_port
})

The implementation below is copied from an existing answer:

class HTTPSClientAuthHandler(urllib2.HTTPSHandler):
    def __init__(self, key, cert):
        urllib2.HTTPSHandler.__init__(self)
        self.key = key
        self.cert = cert

    def https_open(self, req):
        return self.do_open(self.getConnection, req)

    def getConnection(self, host, timeout=300):
        # Present the client certificate/key during the TLS handshake
        return httplib.HTTPSConnection(host,
                                       key_file=self.key,
                                       cert_file=self.cert,
                                       timeout=timeout)



def open_url(url, key, cert):
    opener = urllib2.build_opener(HTTPSClientAuthHandler(key, cert))
    urllib2.install_opener(opener)
    opener.addheaders = [
        ("User-Agent", "<custom>"),
        ("Accept", "<custom>"),
    ]
    response = urllib2.urlopen(url)
    print response.read()

The above implementation works properly; as soon as I add the proxy handler, I get a 401 error:

opener = urllib2.build_opener(proxy, HTTPSClientAuthHandler(key, cert))

File "test_proxy.py", line 54, in open_url
    response = urllib2.urlopen(url)
  File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.6/urllib2.py", line 397, in open
    response = meth(req, response)
  File "/usr/lib/python2.6/urllib2.py", line 510, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.6/urllib2.py", line 435, in error
    return self._call_chain(*args)
  File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.6/urllib2.py", line 518, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 401: Unauthorized
    *"I want to know how to do cert, key authentication here?"* - you are already doing this correctly. But it looks like the server is (additionally?) expecting basic authentication which you can see in the `WWW-Authenticate: Basic realm=""`. Basic authentication has nothing to do with certificate based authentication but is based on username and password. – Steffen Ullrich Feb 20 '19 at 04:39
  • What I want is to avoid username and password based authentication and force it to do cert/key based one. – Nabz C Feb 20 '19 at 05:19
  • A client cannot force the server to accept certificate-based authentication. It is fully up to the server which methods of authentication get accepted. – Steffen Ullrich Feb 20 '19 at 06:09
  • I am able to access that site following the example given here https://stackoverflow.com/a/30937625/4268132, but I need to do that using a proxy. – Nabz C Feb 20 '19 at 07:27
  • It might then be that this specific server does not like your specific HTTP request, which is likely different from the one sent with httplib (for example, I'm pretty sure that httplib will not send the `Content-Type` header you use, which does not make any sense for a GET request anyway), and that's why it asks for additional authentication. I suggest that you use [requests](http://docs.python-requests.org/en/master/), which can both deal with client certificates and deal properly with proxies on its own (which httplib does not seem to do). – Steffen Ullrich Feb 20 '19 at 07:41
  • @SteffenUllrich Tried with requests. `response = requests.get(url, headers=headers, proxies=proxies, cert=(cert,key))`, still getting 401. As soon as I remove the proxy option, it works well. – Nabz C Feb 20 '19 at 08:44
  • That's pretty strange. Could it be that there is some additional authentication by IP address and that the server is only accepting requests from a specific IP address and not from the proxies IP address? – Steffen Ullrich Feb 20 '19 at 08:50

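For reference, a slightly fuller sketch of the requests-based attempt mentioned in the comments above; ip_port, HOST, and the certificate paths are the same placeholders used earlier, and whether verification needs a custom CA bundle depends on whether Fiddler is decrypting HTTPS:

import requests

proxies = {
    "http": "http://" + ip_port,   # e.g. 127.0.0.1:8888 for Fiddler's default listener (assumption)
    "https": "http://" + ip_port,
}
response = requests.get("https://{0}/db/cert/im/d".format(HOST),
                        proxies=proxies,
                        cert=("test_cert.pem", "test_cert.key"),  # (certfile, keyfile)
                        verify=False)  # or a CA bundle path, if the proxy re-signs TLS
print response.status_code
print response.text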