Maximum theoretical bandwidth is a function of latency. Does anyone have a formula for this? I'm trying to settle an argument at work. For example: server A has a maximum bandwidth output of 3 Mbps, Client X has a round-trip latency of 60 ms to the server, and Client Z has a round-trip latency of 500 ms. What is the maximum theoretical bandwidth each client will receive? A formula would be great.
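
For context, here's my rough back-of-the-envelope attempt, assuming the classic TCP window-limited model (throughput <= window size / RTT) and a hypothetical 64 KB receive window; the window size is my assumption, everything else is from the scenario above. Is this the right idea?

```python
# Sketch: TCP window-limited throughput, assuming the classic
# bandwidth-delay relationship: throughput <= window_size / RTT.
# The 64 KB window is a hypothetical default; real stacks use
# window scaling and may negotiate much larger windows.

WINDOW_BYTES = 64 * 1024   # assumed receive window (64 KB)
SERVER_CAP_MBPS = 3.0      # server A's output cap, from the question

def max_throughput_mbps(rtt_seconds: float) -> float:
    """Window-limited throughput in Mbps for a given round-trip time."""
    window_limit_mbps = (WINDOW_BYTES * 8) / rtt_seconds / 1_000_000
    # The effective ceiling is whichever is lower: the latency-imposed
    # window limit or the server's own bandwidth cap.
    return min(window_limit_mbps, SERVER_CAP_MBPS)

for name, rtt_ms in [("Client X", 60), ("Client Z", 500)]:
    print(f"{name}: {max_throughput_mbps(rtt_ms / 1000):.2f} Mbps")
```

Under those assumptions, Client X's window limit works out to roughly 8.7 Mbps, so the server's 3 Mbps cap is the binding constraint, while Client Z would be latency-limited to about 1.05 Mbps. But I'd like someone to confirm the formula and whether a 64 KB window is a reasonable default.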