Calculating Network Latency

Coherence

Senior member
Jul 26, 2002
Here's the theoretical scenario:

Given: A 10 Mbps shared Ethernet network currently has an average network utilization of 30% and a latency of about .001 sec.

Given: Adding a multimedia server will increase traffic by 4 Mbps, and there would be an average collision rate of 10%.

According to this scenario, I am being told that the latency on the network will increase from .001 to .01 sec as a result of adding the multimedia server.

I understand how the utilization will go from 30% to 70%, but I have no clue how they figure the latency increase.
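
For reference, the utilization part works out like this (quick sketch; the 10 Mbps capacity and 4 Mbps figures come straight from the scenario):

```python
# Utilization math from the scenario: added traffic divided by link capacity.
link_capacity_mbps = 10.0    # shared Ethernet segment (given)
current_utilization = 0.30   # 30% average utilization (given)
added_traffic_mbps = 4.0     # multimedia server traffic (given)

new_utilization = current_utilization + added_traffic_mbps / link_capacity_mbps
print(f"New utilization: {new_utilization:.0%}")  # -> New utilization: 70%
```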

Anyone able to explain?

spidey07

No Lifer
Aug 4, 2000
There aren't collisions in modern-day Ethernet (full duplex).

Aside from that, there should be no increase in latency. You would have to know the devices in use, buffer sizes, etc., to calculate latency.
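
If you really wanted a number, you'd be adding up terms like these (rough sketch; the frame size, cable length, and zero queueing delay are all placeholder assumptions):

```python
# Rough one-way latency decomposition for a LAN segment.
# Every input here is a placeholder - real values depend on the gear in use.

frame_bits = 1518 * 8     # one max-size Ethernet frame
link_bps = 10_000_000     # 10 Mbps

transmission_delay = frame_bits / link_bps   # ~0.0012 s to put the frame on the wire
propagation_delay = 100 / 200_000_000        # ~0.5 us over 100 m of copper
queueing_delay = 0.0                         # unknown: depends on buffers and load

print(f"{transmission_delay + propagation_delay + queueing_delay:.4f} s")  # ~0.0012 s
```

(Serializing one full-size frame at 10 Mbps already comes to about .0012 s, which happens to be in the same ballpark as the .001 figure in the scenario.)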


Coherence

Senior member
Jul 26, 2002
Originally posted by: spidey07
There aren't collisions in modern-day Ethernet (full duplex).

Aside from that, there should be no increase in latency. You would have to know the devices in use, buffer sizes, etc., to calculate latency.

As mentioned, the scenario specifies shared Ethernet, which means collisions can still occur (and they can even occur on switched Ethernet when a port falls back to half duplex, though rarely).

Basically, the scenario is saying that increased traffic will cause an increase in latency, but I agree with you: not enough information is given to justify such a specific result.

spidey07

No Lifer
Aug 4, 2000
OK - this is a homework problem and doesn't actually occur in real life.
:p

There should be a way to calculate how long the NIC will wait (the backoff algorithm) due to the extra traffic and collisions.
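
Something like the truncated binary exponential backoff from 802.3 (sketch, using the standard 512-bit slot time for 10 Mbps):

```python
import random

# CSMA/CD truncated binary exponential backoff (sketch).
# After the nth collision on a frame, the NIC waits a random number of
# slot times drawn from [0, 2**min(n, 10) - 1]; after 16 attempts the
# frame is dropped.

SLOT_TIME_S = 51.2e-6  # 512 bit times at 10 Mbps

def backoff_delay(collision_count: int) -> float:
    """Random wait in seconds after `collision_count` collisions."""
    if collision_count > 16:
        raise RuntimeError("excessive collisions - frame dropped")
    slots = random.randint(0, 2 ** min(collision_count, 10) - 1)
    return slots * SLOT_TIME_S

# After 3 straight collisions the wait is uniform over 0-7 slots, so it
# averages 3.5 * 51.2 us ~ 179 us on top of the normal transmission time.
print(f"{backoff_delay(3):.6f} s")
```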

The year is 2006. Shared Ethernet doesn't exist anymore.

ScottMac

Moderator, Networking, Elite member
Mar 19, 2001
Collision rates anywhere near one percent (or above) would signal a serious network problem.

Even multicasts (and broadcasts) must follow the CSMA/CD rules.

Back in the day, it was not unusual to have dozens and dozens of hosts sharing a single 10Meg segment with little or no serious contention.

Of course, traffic flows and applications have changed since then, but being able to maintain sub-one percent collision rates should not be that big of an issue.

Just take your collision count and divide it by your total frame count (in comparable units, frames or bytes) and that'd be your collision rate.
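
For example (the counter values here are made up):

```python
# Collision rate = collisions / total frames, in comparable units.
collisions = 1_200       # made-up interface counter
total_frames = 150_000   # made-up interface counter

print(f"Collision rate: {collisions / total_frames:.2%}")  # -> 0.80%, under the 1% alarm line
```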

Good Luck

Scott

Coherence

Senior member
Jul 26, 2002
Yes, this basically was a homework question. It was a lab question in one of my MCSE courses.

It's not so much the collision rate that I'm concerned with, since that's a given (it's used to calculate Actual Data Throughput in one of the later questions in the lab). What I'm concerned about is how the sample answer figures that latency will increase from .001 sec to .01 sec on the network. (I answered the question correctly, but not for the same reason they give, and I'm curious how they came up with their answer.)

So, since I didn't give the complete scenario with the original post, I'll give the whole thing here:

You are a systems engineer for Northwind Traders and have been sent to a branch office located in Reading, U.K. The branch office has a 10-Mbps shared Ethernet network consisting of 66 clients and four servers, and a 512-Kbps private leased line to Northwind Traders' headquarters.

This branch office designs and creates advertising materials, including radio, television, and magazine advertisements and special event promotions. The branch is considering adding a streaming media server to its network. Employees at the branch office want to use this new server to review the company's television and radio advertisements.

You have been asked to evaluate the network to determine if it has sufficient capacity to add a streaming media server.

After arriving at the branch office, you made the following discoveries:

  • The network is currently averaging 30 percent utilization.

  • The streaming media server requires less than .001 seconds of latency on the network to operate acceptably.

  • On a test network, you have determined that under normal usage, the streaming media server adds approximately 4 Mbps to the existing network throughput.

  • There is no money available for purchasing new networking equipment in the current fiscal year. However, money may be available in the next fiscal year.

Exercise 1 Task:
1. Can you place this server on the network and provide acceptable performance? Why or why not?

My Answer:

No, because the current average utilization is already too high. 20% average utilization is recommended for a 10-Mbps shared Ethernet multimedia network, per the Recommended Ethernet Utilization Guidelines table in the text; the existing network is running at 30% and would increase to 70%.

Their Answer:
No. The network is currently running at 30 percent of the utilization rate, giving it a latency of about .001 seconds. However, if the server were added, it would boost the average utilization to 70 percent (30 percent + 40 percent), which would raise the latency to .01 seconds, thus making the server performance unacceptable.
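
For what it's worth, the only standard model I know of that ties latency directly to utilization is a simple queueing-delay scaling (latency goes like 1/(1 - utilization)), and even that doesn't reproduce their tenfold jump:

```python
# Sketch: scale the .001 s baseline with a simple M/M/1-style factor,
# latency ~ service_time / (1 - utilization). The model choice is my
# assumption - the lab text never says how it got from .001 to .01.

base_latency = 0.001                      # seconds at 30% utilization (given)
service_time = base_latency * (1 - 0.30)  # back out the no-load service time

for utilization in (0.30, 0.70):
    print(f"{utilization:.0%} -> {service_time / (1 - utilization):.4f} s")
# 30% -> 0.0010 s
# 70% -> 0.0023 s
```

So wherever their .01 comes from, it doesn't look like a straightforward queueing calculation.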

spidey07

No Lifer
Aug 4, 2000
That explains it.

It is Microsoft. They don't know networking.

40% for shared Ethernet is acceptable, however. It's when collisions start to run 3% or higher that you can really slow things down. Heck, some Ethernet segments can run at 50% or higher and be OK. 70%, however, is too much, and you would probably see collisions ramp up exponentially.