Note that I posted this question to the StackExchange InfoSec site, but it's not as populated as ServerFault, and this is more on the technical side of network collection for web services.
I've started thinking about how to approach analyzing my network traffic for information relating to SSL and browser usage for my web server farm. I also suspect there are some shadow (legacy, unknown) web servers running on our network and I'd like to try and capture connections to them as well.
Basically I'm looking for a cheap solution that can:
- Record all connections and their protocol, similar to how Wireshark logs src/dst and protocol, but without collecting the payload itself. Just header and protocol-type information (e.g. source/destination IP addresses, protocol, and any protocol version/details, like SSL/TLS version, etc.).
- For HTTP connections, record the browser's "User-Agent" header (so I can passively build up a picture of browser versions/compatibility).
- Being able to do so for non-standard ports would be a plus, and limiting the collection to specific protocols would be ideal, e.g. only collect HTTP, SSL and TLS. (A rough sketch of the kind of collector I have in mind follows this list.)
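To make the requirements concrete, here's a minimal sketch of what I'm imagining, using Python and scapy. The library choice, the `eth1` interface name and the exact field names are my own assumptions rather than a working tool, and scapy binds its TLS/HTTP dissectors to ports 443/80 by default, so non-standard ports would need extra handling:

```python
from collections import Counter

from scapy.all import sniff, load_layer, IP, TCP

load_layer("tls")    # scapy >= 2.4 TLS dissector
load_layer("http")   # scapy >= 2.4.3 HTTP dissector
from scapy.layers.tls.handshake import TLSClientHello
from scapy.layers.http import HTTPRequest

# Collapsed records only: (src, dst, dport, protocol, detail) -> hit count.
seen = Counter()

def record(pkt):
    if not (pkt.haslayer(IP) and pkt.haslayer(TCP)):
        return
    src, dst, dport = pkt[IP].src, pkt[IP].dst, pkt[TCP].dport

    # TLS: note the version offered in the ClientHello, drop the payload.
    if pkt.haslayer(TLSClientHello):
        seen[(src, dst, dport, "TLS", pkt[TLSClientHello].version)] += 1
    # HTTP: keep only the User-Agent request header, not the body.
    elif pkt.haslayer(HTTPRequest):
        seen[(src, dst, dport, "HTTP", pkt[HTTPRequest].User_Agent)] += 1

# store=0 keeps nothing in memory except the collapsed counters above;
# the BPF filter narrows the mirror-port firehose to the traffic of interest.
sniff(iface="eth1", filter="tcp port 80 or tcp port 443", prn=record, store=0)
```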
I would end up plugging a server dedicated to collecting this information into a mirrored port on each switch under my control.
This would help with follow-on activities like:
- Find out which web services are permitting weak SSL/TLS versions, e.g. if I see any SSLv2/3, I would look at finding and re-configuring those web servers to a minimum of TLS v1.1.
- Get a picture of the end-user web browser make and version landscape, so I could decide which minimum security changes could safely be made, e.g. if 15% of browsers are IE8 on Vista(!), which doesn't support TLS v1.1+, I would know ahead of time. I could also review those IPs and determine whether any of them are legacy network resources that should be investigated, rather than just end users with out-of-date software.
- Determine which web service IP addresses are not already known, so I can track those services down. (The post-processing sketch after this list is roughly what I have in mind for these.)
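For the follow-on analysis, I'm picturing something as simple as the pass below over the collapsed records. The `records.csv` file, its column layout and the `known_web_servers` inventory are placeholders I've invented for illustration:

```python
import csv
from collections import Counter

WEAK = {"SSLv2", "SSLv3", "TLS 1.0"}
known_web_servers = {"10.0.1.10", "10.0.1.11"}   # placeholder inventory

weak_servers, browsers, unknown = set(), Counter(), set()

with open("records.csv", newline="") as f:
    for src, dst, dport, proto, detail in csv.reader(f):
        if proto == "TLS" and detail in WEAK:
            weak_servers.add((dst, dport))   # item 1: weak SSL/TLS endpoints
        if proto == "HTTP":
            browsers[detail] += 1            # item 2: User-Agent landscape
        if dst not in known_web_servers:
            unknown.add(dst)                 # item 3: possible shadow web servers

print("Weak SSL/TLS endpoints:", sorted(weak_servers))
print("Top browsers:", browsers.most_common(10))
print("Unknown server IPs:", sorted(unknown))
```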
The reason I'm asking, rather than just plowing ahead and experimenting with tcpdump and OpenDPI, is performance. I'm not sure what might happen when I mirror a Gigabit port. It might prove difficult not only to capture this information (remember, I don't want a packet dump, just header-type info, and it can be collapsed records; I don't need every single connection, just the unique src/dst/protocol combinations) but also to tell whether the capture is incomplete, e.g. how would I know that 50% of connections were ignored? (The drop-counter sketch below is the sort of check I'm thinking of.)
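One check I'm considering, sketched here with a placeholder interface name and snap length, is to run a short test capture and read libpcap's counters from tcpdump's end-of-run summary; the "dropped by kernel" figure should show whether the box is keeping up with the mirrored traffic:

```python
import re
import subprocess

# Short test capture: headers only (-s 96), dump discarded (-w /dev/null),
# stop after a fixed packet count (-c).
proc = subprocess.run(
    ["tcpdump", "-i", "eth1", "-s", "96", "-w", "/dev/null", "-c", "100000",
     "tcp port 80 or tcp port 443"],
    stderr=subprocess.PIPE, text=True,
)

# tcpdump prints something like:
#   100000 packets captured
#   103512 packets received by filter
#   2210 packets dropped by kernel
for line in proc.stderr.splitlines():
    m = re.match(r"(\d+) packets (captured|received by filter|dropped by kernel)", line)
    if m:
        print(m.group(2), "=", m.group(1))
```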
I would likely run this for a week to let a user profile build up.
If anyone has any suggestions or recommendations, I'm all ears. Thank you.