It's a question concerning benchmarking, but I don't understand the last part about sampling. How big is a sample? Is each sample one byte? That wouldn't seem to make sense. Here it is:
"A secret government agency simultaneously monitors 100 cellular phone conversations and multiplexes the data onto a network with a bandwidth of 1 MB/sec and an overhead latency of 350 us per 1-KB message. Calculate the transmission time per message and determine whether there is sufficient bandwidth to support this application. Assume that the phone conversation data consists of 2 bytes sampled at a rate of 4 Khz."
huh?
"A secret government agency simultaneously monitors 100 cellular phone conversations and multiplexes the data onto a network with a bandwidth of 1 MB/sec and an overhead latency of 350 us per 1-KB message. Calculate the transmission time per message and determine whether there is sufficient bandwidth to support this application. Assume that the phone conversation data consists of 2 bytes sampled at a rate of 4 Khz."
huh?
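In case it helps, here's how I'd try to combine the numbers. This is just my attempt, and the assumptions are mine, not the problem's: I'm reading "2 bytes sampled at 4 kHz" as 2 bytes per sample, and taking 1 KB = 1024 bytes and 1 MB = 1024 KB. Quick Python sketch:

    # Sanity check. Assumptions (mine, not the problem's):
    # each sample is 2 bytes, 1 KB = 1024 bytes, 1 MB = 1024 KB.

    KB = 1024                    # bytes
    MB = 1024 * KB               # bytes

    conversations = 100
    sample_size = 2              # bytes per sample (my reading)
    sample_rate = 4000           # samples per second (4 kHz)

    bandwidth = 1 * MB           # bytes per second
    overhead = 350e-6            # seconds of latency per message
    message_size = 1 * KB        # bytes per message

    # Total data all 100 conversations generate per second.
    data_rate = conversations * sample_size * sample_rate
    print(f"offered load:        {data_rate / KB:.1f} KB/sec")   # ~781

    # Time on the wire for one 1-KB message, plus the fixed overhead.
    transmit = message_size / bandwidth
    per_message = overhead + transmit
    print(f"transmission time:   {transmit * 1e6:.0f} us")       # ~977
    print(f"total per message:   {per_message * 1e6:.0f} us")    # ~1327

    # Bandwidth the network can actually sustain, one message at a time.
    effective = message_size / per_message
    print(f"effective bandwidth: {effective / KB:.1f} KB/sec")   # ~754
    print("sufficient" if effective >= data_rate else "insufficient")

If that reading is right, the 100 conversations generate about 781 KB/sec but the network only sustains about 754 KB/sec once the 350 µs overhead is counted, so it comes up short. But I'm not sure the 2-bytes-per-sample interpretation is what the book intends, which is really my question.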
