- Mar 15, 2001
- 3,091
- 0
- 0
Can someone explain the exact relationship between the overall latency of a connection (let's use microwave-based broadband as the example) and its rated speed (1 Mbps)? We have this setup at my dad's office. When browsing the web and downloading random files, the _speed_ seems decent (pulling 200 KB/sec), but when I try to tie in from home on my cable modem via PC Anywhere, it is slow as crap because of the latency (~250 ms). What is the exact relationship between the two? How can the downloads be fine but the latency so high? To add another question to this post: will a VPN connection to the office be just as slow as connecting in through PC Anywhere?
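(Not part of the original post, but a rough sketch of the arithmetic behind the question. Bandwidth and latency are independent: a bulk download pays the round-trip delay roughly once and is then limited by throughput, while an interactive session like PC Anywhere pays the round trip on every little request/response exchange. Using the post's own numbers, ~200 KB/s and ~250 ms:)

```python
# Illustrative sketch: why the same link can give fine downloads
# but feel awful for remote control. Numbers are from the post:
# ~200 KB/s of throughput and a ~250 ms round trip.

THROUGHPUT = 200 * 1024   # bytes per second
RTT = 0.250               # seconds per round trip

def bulk_download_time(size_bytes, round_trips=1):
    """One big transfer: latency is paid only a handful of times,
    so transfer time is dominated by size / bandwidth."""
    return round_trips * RTT + size_bytes / THROUGHPUT

def interactive_session_time(n_round_trips, bytes_per_trip=200):
    """Many tiny request/response exchanges (keystrokes, screen
    updates): latency is paid on EVERY trip, bandwidth barely matters."""
    return n_round_trips * (RTT + bytes_per_trip / THROUGHPUT)

# A 10 MB file takes about 51 seconds, almost all of it bandwidth.
print(f"10 MB download: {bulk_download_time(10 * 1024 * 1024):.1f} s")

# 200 small round trips also take about 50 seconds, but almost all
# of that time is sitting in latency, moving hardly any data.
print(f"200 round trips: {interactive_session_time(200):.1f} s")
```

(Same ~50 seconds either way, but the download moved 10 MB while the interactive session moved ~40 KB; that is why the link benchmarks fine yet feels sluggish over PC Anywhere, and why a VPN, which rides over the same round-trip delay, would not help by itself.)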