I have a working Nginx setup with OCSP stapling configured. Now I want to add client certificate authentication for a number of URLs.
So I added an `ssl_client_certificate` directive pointing to the CA certificate we use for the restricted URLs (it's a private CA certificate, not signed by any publicly known CA), and because most of the server should remain publicly accessible, I set `ssl_verify_client` to `optional`.
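
For reference, here is a simplified sketch of the relevant part of the configuration (paths, server name and location names are placeholders, not the real ones):

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate         /etc/nginx/ssl/server.crt;
    ssl_certificate_key     /etc/nginx/ssl/server.key;

    # OCSP stapling (already working; trusted chain and resolver omitted here)
    ssl_stapling            on;
    ssl_stapling_verify     on;

    # Private CA used to verify client certificates
    ssl_client_certificate  /etc/nginx/ssl/private-ca.crt;
    # "optional" so public URLs keep working without a certificate
    ssl_verify_client       optional;

    # Public content: no client certificate required
    location / {
        try_files $uri $uri/ =404;
    }

    # Restricted URLs: reject requests without a verified client certificate
    location /restricted/ {
        if ($ssl_client_verify != SUCCESS) {
            return 403;
        }
        try_files $uri $uri/ =404;
    }
}
```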
This works only half-way: I can run requests with cURL(*), both with and without passing a certificate, and receive the expected responses for both the public URLs and the protected ones that check for the presence of the certificate.
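
The cURL calls look roughly like this (host name, path and certificate file names are placeholders):

```bash
# Without a client certificate: public URL, works as expected
curl -sS https://example.com/

# With the client certificate and key: restricted URL, works as expected
curl -sS --cert client.crt --key client.key https://example.com/restricted/
```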
But now for my question: when accessing the same URLs with a browser (not presenting a certificate), Nginx responds with error 400. What baffles me is that when I use the Firefox Developer Tools to create a cURL request out of any of the failing requests and run it from the command line, it works flawlessly. What could be the problem?
Also, even cURL's `-v` and `--trace-ascii` don't show anything that would explain to me why it could fail from within a browser. I'm not pasting the whole (long) configuration here; if you think something elementary is missing, feel free to comment.
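
The debugging output I mentioned comes from invocations along these lines (placeholder URL):

```bash
# Verbose output including the TLS handshake details
curl -v https://example.com/restricted/

# Full ASCII dump of the request/response conversation written to a file
curl --trace-ascii trace.txt https://example.com/restricted/
```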
Edit: I checked and confirmed that cURL sends the `Host`, `User-Agent`, `Accept`, `Accept-Language`, `Referer` and `DNT` headers as well as the cookies and session ID, and enables compression just like Firefox would.
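
For illustration, the command produced by "Copy as cURL" has roughly this shape (all header and cookie values here are placeholders, not the real ones):

```bash
# curl derives the Host header from the URL automatically
curl 'https://example.com/restricted/' \
  -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:68.0) Gecko/20100101 Firefox/68.0' \
  -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8' \
  -H 'Accept-Language: en-US,en;q=0.5' \
  -H 'Referer: https://example.com/' \
  -H 'DNT: 1' \
  -H 'Cookie: sessionid=PLACEHOLDER' \
  --compressed
```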
Also, neither Firefox nor cURL has any client certificate it could offer to the server, and Firefox is configured to ask every time rather than select a certificate automatically.
Another edit: after coming back from lunch (no configuration changes in the meantime), Firefox could load the first page and its associated resources once. A few minutes later, with nothing changed apart from trying various requests, it doesn't work anymore. Chrome also reports error 400, and using "Copy as cURL" (which again includes all headers) from its developer tools again produces a command that works in cURL. I also tried all requests multiple times to make sure the behaviour towards a single user agent is consistent. I'm stumped; it all seems very random to me.