Headline: Is there a problem with sending/receiving data to/from a server across multiple connections/sessions?
Background: One of the problems I see running apps coast-to-coast is that maximum throughput drops off dramatically. In the office, I can use ~90% of a line to move data; on a 10 Mbps coast-to-coast connection/session I'll only reach ~1.65 Mbps because of loss and latency. If apps were architected for parallel transfer across multiple connections/sessions, it would be a different story: I could reach roughly n x 1.65 Mbps, where "n" is the number of connections. But all too often, the apps I see use only a single connection.
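To make the parallel-transfer idea concrete, here is a minimal sketch of the kind of thing I mean: split one download into byte ranges and pull each range over its own TCP connection, so each stream gets its own window and the aggregate approaches n times the single-stream ceiling. It assumes a hypothetical server at example.com that supports HTTP Range requests; the URL and part count are placeholders, not a real implementation.

```python
# Sketch: fetch one file over N parallel connections using HTTP Range
# requests, then reassemble the pieces in order.
import concurrent.futures
import urllib.request

URL = "http://example.com/bigfile.bin"   # hypothetical server supporting Range
PARTS = 4                                 # "n" parallel connections

def fetch_range(start, end):
    """Download bytes [start, end] on its own TCP connection."""
    req = urllib.request.Request(URL, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return start, resp.read()

def parallel_download():
    # Ask the server for the total size first.
    head = urllib.request.Request(URL, method="HEAD")
    with urllib.request.urlopen(head) as resp:
        size = int(resp.headers["Content-Length"])

    chunk = size // PARTS
    ranges = [(i * chunk, size - 1 if i == PARTS - 1 else (i + 1) * chunk - 1)
              for i in range(PARTS)]

    # Each worker opens a separate connection, so loss/latency on one
    # stream only caps that stream, not the whole transfer.
    with concurrent.futures.ThreadPoolExecutor(max_workers=PARTS) as pool:
        pieces = dict(pool.map(lambda r: fetch_range(*r), ranges))

    return b"".join(pieces[start] for start, _ in ranges)

if __name__ == "__main__":
    data = parallel_download()
    print(f"downloaded {len(data)} bytes over {PARTS} connections")
```

Download-manager style tools do something like this already; my question is why ordinary apps rarely do.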
So I'm wondering: why don't more apps transfer data over more than one connection? Is it bad practice? Difficult to implement? Resource-intensive? Something else?