I want to provide a git repo with tutorials and examples of WebRTC applications, for people who are just starting to learn and don't have any grasp of the technology involved. These tutorials shouldn't need to change once the code is written: users would just download, serve the test page, and run. WebRTC is peer-to-peer, so these tutorials must be able to run on a LAN without internet access.
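To make the "download, serve, run" flow concrete, the serving step could be as small as this sketch (Python stdlib only; `cert.pem` and `key.pem` are placeholder names for whatever cert the repo would ship, which is exactly the problematic part described below):

```python
# Minimal HTTPS static-file server, roughly what a tutorial's
# "serve the test page" step could look like. The cert/key file
# names are placeholders, not part of any real tutorial repo.
import http.server
import ssl
from functools import partial

def serve(directory: str, certfile: str, keyfile: str, port: int = 8443):
    # Serve `directory` over HTTPS on all interfaces, so other
    # devices on the LAN can reach it at https://<lan-ip>:<port>/
    handler = partial(http.server.SimpleHTTPRequestHandler, directory=directory)
    httpd = http.server.HTTPServer(("0.0.0.0", port), handler)
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile, keyfile)
    httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
    return httpd  # the caller runs httpd.serve_forever()
```

The serving code itself is trivial; everything that follows is about where `cert.pem` comes from and whether the browser will accept it.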
And here is where the problems begin, given the current industry stance on security and self-signed SSL certs:
WebRTC is mandated by browsers to require a Secure Context, i.e. to be served over HTTPS, so self-signed certs necessarily come into play. Browsers offer an exception for `localhost`, but that's far from enough: learners won't be running a web server on their phones just to test against localhost. I want to support running the tutorial's server on a PC or Mac and accessing it from other computers or phones on the same LAN.

The typical solution seems to be including a self-signed cert in the repo and using it to serve the tutorials. But LAN IPs vary wildly, and I cannot assume the IP address of the server where people will run the tutorial:
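For reference, here is a sketch of what minting such a per-IP cert involves (Python shelling out to `openssl`; the `-addext` flag needs OpenSSL 1.1.1+, and the helper names are my own). Note that the SAN must name one literal address, since certificate wildcards are only defined for DNS names, never for IP entries:

```python
import ipaddress
import pathlib
import socket
import subprocess

def lan_ip() -> str:
    """Best-effort guess of this machine's LAN IP (UDP connect() sends no packets)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("192.0.2.1", 80))  # TEST-NET-1 address, only used to pick a route
        return s.getsockname()[0]
    finally:
        s.close()

def make_self_signed(ip: str, out_dir: pathlib.Path) -> None:
    """Mint a throwaway self-signed cert valid only for one literal IP address."""
    ipaddress.ip_address(ip)  # fail fast on garbage input
    subprocess.run(
        ["openssl", "req", "-x509", "-newkey", "rsa:2048", "-nodes",
         "-keyout", str(out_dir / "key.pem"),
         "-out", str(out_dir / "cert.pem"),
         "-days", "90",
         "-subj", "/CN=webrtc-tutorial",
         # The cert is bound to exactly this address; no wildcard form exists.
         "-addext", f"subjectAltName=IP:{ip}"],
        check=True, capture_output=True)
```

The catch is visible in the last argument: the cert is tied to one concrete `IP:` entry, so it would have to be generated per machine, and regenerated whenever DHCP hands out a new address.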
- Multi-level wildcards cannot be used, so it's not possible to issue a self-signed cert for `192.168.*` (or similar for the other private IPv4 ranges).
- Wildcards cannot be used for IP addresses at all (see https://serverfault.com/a/770225), and I don't understand why this limitation should apply to a self-signed cert. So it's not even possible to cover `192.168.0.*`.
Honestly, self-signed certs are simply not a good general solution to begin with. Even after issuing one for the correct IP, the matter of trusted vs. untrusted comes into play, and it is totally browser-dependent: Chrome accepts the cert after an "untrusted cert" warning page, while iOS Safari silently rejects it. So, with self-signed certs, some totally out-of-scope extra steps must be introduced to instruct the user about installing the self-signed Root CA into their device's trust store. That's far from ideal.
After an ever-shrinking validity period, the certs will expire, rendering the tutorial repository unusable if left as-is. We shouldn't have to perpetually update the repo with new certs every N months; this is just a static set of code written to showcase specific functionality, not an exercise in devops and server management.
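Given that expiry is unavoidable, the least a tutorial repo could do is fail loudly instead of letting learners hit an opaque browser error. A start script might run a check like this sketch (shelling out to `openssl x509 -checkend`, which exits non-zero once the cert would already be expired at the given future offset; the function name is my own):

```python
import subprocess

def cert_expires_within(cert_path: str, seconds: int) -> bool:
    """Return True if the cert at cert_path will have expired `seconds` from now.

    `openssl x509 -checkend N` exits 0 while the cert is still valid
    N seconds in the future, and exits 1 once it is not.
    """
    result = subprocess.run(
        ["openssl", "x509", "-checkend", str(seconds), "-noout", "-in", cert_path],
        capture_output=True)
    return result.returncode != 0
```

A run script could call this at startup and print "please regenerate cert.pem" with instructions, rather than leaving learners to decode a browser TLS error.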
The solutions I've seen are all along the lines of:
- "Don't use an IP address, use a domain name." Doesn't apply. This is a tutorial; people will run it on a typical LAN from their laptop, whose IP will be `192.168.1.32` or similar from a private range, and they probably won't even have any knowledge of DNS servers, domain names, etc. Also, remember that the LAN itself might not even have internet access.
- "Build a CI job that automatically renews the certificate files." I understand this is currently the best we can do, but it seems an absurdly over-engineered solution that shouldn't be required in the first place. This is not a public-facing service; it's a .zip file with some example tutorials inside.
In summary, I feel the industry story on SSL certificates is too focused on the "happy path" of internet-facing web servers with well-known domain names, while perfectly valid use cases have been ignored or swept under the rug. The rules for creating self-signed certs are too strict and limit their usefulness. Also, the maximum cert validity is shortened more and more as years pass, with those decisions again blindly driven by the happy-path use case, without any effort to provide solutions for valid but less common cases.
Given the current rules of the game, for this kind of private, single-use program: how can such applications be made to work in a way that causes the least additional effort for end users (learners)?