Here's the theoretical scenario:
Given: A 10 Mbps shared Ethernet network currently has an average network utilization of 30% and a latency of about 0.001 s.
Given: Adding a multimedia server will increase traffic by 4 Mbps, with an average collision rate of 10%.
According to this scenario, I am being told that the latency on the network will increase from 0.001 s to 0.01 s as a result of adding the multimedia server.
I understand how the utilization goes from 30% to 70% (3 Mbps of existing traffic plus 4 Mbps of new traffic on a 10 Mbps link), but I have no clue how they figure the latency increase.
Anyone able to explain?
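For reference, here's the arithmetic I've tried so far. This is a minimal sketch assuming an M/M/1-style queueing approximation, D = S / (1 - rho), and treating the 10% collision rate as wasted capacity; the scenario doesn't say which model its 0.01 s figure comes from, so these are my assumptions, not the book's:

```python
# Sketch of one common queueing approximation for shared-media delay.
# Assumptions (not stated in the scenario): M/M/1-style scaling
# D = S / (1 - rho), and collisions modeled as a 10% capacity loss.

CAPACITY_MBPS = 10.0
current_traffic = 3.0   # Mbps -> 30% utilization
added_traffic = 4.0     # Mbps from the multimedia server
base_latency = 0.001    # seconds, quoted at 30% utilization

rho_before = current_traffic / CAPACITY_MBPS                      # 0.3
rho_after = (current_traffic + added_traffic) / CAPACITY_MBPS     # 0.7

# Back out the "service" constant S from the quoted 0.001 s at rho = 0.3,
# then rescale at the new utilization, with collisions shrinking the
# effective capacity to 9 Mbps.
S = base_latency * (1 - rho_before)                               # 0.0007
rho_eff = (current_traffic + added_traffic) / (CAPACITY_MBPS * 0.9)
latency_after = S / (1 - rho_eff)

print(rho_before, rho_after)          # 0.3 0.7
print(round(latency_after, 5))        # 0.00315
```

Under these assumptions I only get roughly 0.003 s, a ~3x increase, not the 10x increase (0.01 s) the scenario claims, which is why I suspect they're using some other model entirely.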