Lately I've been having performance issues with my Comcast cable modem.
I can tell by the response time I get when browsing sites I normally visit.
Several speed-testing sites report my speed as 1.5 Mbps down / 350 Kbps up at best, rather than the advertised 6 Mbps down. I know they're not 100% accurate, but at least they give me an idea of what the speed is like.
So... my question is, how can one accurately measure one's connection speed?
If I ping my default gateway with 1500-byte ICMP packets, and the average round-trip time is 200 ms, is it correct to say the speed would be:
(1500 / 200) * 10 / 2 = 37.5 bytes / sec?
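For what it's worth, here's how the naive arithmetic could be sketched in Python, using the 1500-byte / 200 ms numbers above. The built-in assumption here (that the whole RTT is transmission time rather than mostly latency) is exactly what I'm unsure about:

```python
# Naive throughput estimate from a ping round trip.
# Assumption (dubious): the entire RTT is spent transmitting the packet.
# In reality most of an RTT is latency (queuing, propagation), not
# transmission time, so this likely understates link capacity badly.

packet_bytes = 1500      # ICMP payload size used in the ping
rtt_seconds = 0.200      # average round-trip time observed

# The 1500 bytes cross the link twice (out and back) in one RTT,
# so the naive per-direction figure is:
naive_bytes_per_sec = packet_bytes / (rtt_seconds / 2)
naive_mbps = naive_bytes_per_sec * 8 / 1e6

print(f"{naive_bytes_per_sec:.0f} bytes/sec  ({naive_mbps:.2f} Mbps)")
```

Even under that generous assumption the figure comes out far below the advertised 6 Mbps, which makes me suspect ping RTT just isn't the right tool for measuring throughput.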