Duplicate acknowledgement: see my answer below for further information
I'm building a simple personal application to learn about binary protocol programming, and for added security I'll be encrypting the protocol using SSL. My context is thus "personal project/experience," not "major enterprise-class operation."
I'm curious how SSL interacts with compression: would I get better performance by encrypting my data stream and then compressing it (so I'm compressing SSL output), or by compressing my data and then encrypting it (so SSL is encrypting already-compressed data)?
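To make the second ordering concrete, here's a minimal sketch of what I mean by "compress, then encrypt," assuming a Python client using the standard `ssl` and `zlib` modules; the host, port, payload, and length-prefix framing are just placeholders, not my actual protocol:

```python
import socket
import ssl
import zlib

HOST, PORT = "example.com", 8443  # placeholder endpoint

# Stand-in payload: mostly binary with a little ASCII text.
payload = b"\x00\x01\x02\x03" + b"event-data " * 100

# Compress first, so the TLS layer encrypts already-compressed bytes.
compressed = zlib.compress(payload, level=6)

context = ssl.create_default_context()
with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        # Length-prefix the compressed frame so the receiver knows how much to read.
        tls_sock.sendall(len(compressed).to_bytes(4, "big") + compressed)
```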
My underlying protocol is part of a general networking library that will be used to transport event messages and data payloads between hosts. At this point in the design phase, I expect most of the data to be binary, with only low to moderate amounts of ASCII text.
My answer in advance to the likely suggestion of "implement both approaches and see what happens": I could make the protocol support both and switch between them to gather performance metrics, but since this will be a general-purpose network library inside a bigger application, it would probably take weeks or months of real-world usage to converge on mathematically optimal settings. I'm asking this question precisely because I don't want to go there and be that OCD. :P
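The most I'd be willing to do is a quick offline comparison of payload size and compression cost, something like the sketch below (the sample payload and compression level are arbitrary assumptions, not representative of my real traffic):

```python
import time
import zlib

def measure(payload: bytes, use_compression: bool) -> tuple[int, float]:
    """Return (bytes handed to the TLS layer, seconds spent preparing them)."""
    start = time.perf_counter()
    data = zlib.compress(payload, level=6) if use_compression else payload
    return len(data), time.perf_counter() - start

# Stand-in payload: hard-to-compress binary plus a moderate amount of ASCII.
sample = bytes(range(256)) * 512 + b"moderate ASCII text " * 50

for flag in (False, True):
    size, cost = measure(sample, flag)
    print(f"compression={flag}: {size} bytes, {cost * 1000:.2f} ms")
```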