I just had a strange conversation with a man who was trying to explain to me that it is impossible for two healthy networks to communicate with each other across the ocean without significant bandwidth loss.
For example: if a machine connected at 100 Mb/s here http://www.hetzner.de/en/hosting/unternehmen/rechenzentrum attempts to communicate with a machine in the US with exactly the same setup, you'd only achieve a fraction of the original connection speed. This would hold no matter how you distributed the load; the total loss over distance would be the same. "Full capacity" between the US and Germany would be less than half of what it would be to a data center a mile from the originator with the same setup.
If this is true, then my entire understanding of how packets work is wrong. I mean, if there's no packet loss, why would there be any issue other than latency? I'm trying to understand his argument but am at a loss. He seems intelligent and 100% sure of his information. It was very difficult to follow because he described data as a river, while I was thinking of it as a series of packets.
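One thing I tried to reason through afterwards is whether latency by itself could cap throughput. Here's a rough sketch of the window-limited bound (throughput ≤ window / RTT) for a single TCP connection. The 64 KiB window and both RTT figures are my own illustrative assumptions, not measurements:

```python
# Back-of-the-envelope: how latency alone can cap a single TCP
# connection's throughput (illustrative numbers, not measurements).
# A sender can have at most one window of unacknowledged data in
# flight per round trip, so throughput <= window_size / RTT.

def max_throughput_mbps(window_bytes: float, rtt_seconds: float) -> float:
    """Window-limited throughput in megabits per second."""
    return window_bytes * 8 / rtt_seconds / 1e6

# Classic 64 KiB TCP window (i.e. no window scaling in use):
window = 64 * 1024  # bytes

# Hypothetical RTTs: ~1 ms to a nearby data center,
# ~100 ms Germany <-> US (assumed, for illustration).
local = max_throughput_mbps(window, 0.001)          # ~524 Mb/s cap
transatlantic = max_throughput_mbps(window, 0.100)  # ~5.2 Mb/s cap

print(f"local cap:         {local:.1f} Mb/s")
print(f"transatlantic cap: {transatlantic:.1f} Mb/s")
```

Under these assumptions, a nearby link's cap is far above the 100 Mb/s line rate, while the transatlantic cap for one connection falls to a few Mb/s even with zero packet loss. But that limit is per connection and depends on the window size, which is why I don't see how the loss could be unavoidable "no matter how you distributed the load."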
Can someone explain what I'm missing, or am I just dealing with a madman in a position of authority and confidence?