Wait a minute... is network speed measured in decimal (10^x) or binary (2^x) ??

destrekor

Lifer
Nov 18, 2005
28,799
359
126
All my life I was led to believe that network bandwidth was measured in binary. But now I hear it's actually the decimal kilo/mega/giga and not the binary kibi/mebi/gibi? I just figured the binary versions only got their own names fairly recently, and everyone was reluctant to change one of the most prominent measurements a layperson ever sees (internet speed).

edit:

Edited title for clarity. Note that I am not discussing the difference between bits and bytes, or how a bit vs a byte is abbreviated and capitalized. This is about whether 1Kbit/1Kbps (as network speed is measured) is equal to 1000 bits or 1024 bits.
 
Last edited:

Azuma Hazuki

Golden Member
Jun 18, 2012
1,532
866
131
IIRC it's binary, but the real deception is they measure in kilo- and mega- bits, not bytes, so they can flash an impressive-sounding number at you. Divide those numbers by 8 for the real truth...and suddenly that 10Mbps (note the lowercase b...) connection doesn't sound so impressive.
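Quick sanity check of that divide-by-8, as a throwaway Python sketch (assuming decimal megabits and ignoring protocol overhead; the 10Mbps figure is just the example above):

# rough sketch: what an advertised megabit/s rate means in megabytes/s
# assumes decimal megabits (10^6 bits) and ignores protocol overhead
advertised_mbps = 10                          # "10 Mbps" as marketed
bytes_per_second = advertised_mbps * 1_000_000 / 8
print(bytes_per_second / 1_000_000)           # 1.25 (MB/s)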
 

Red Squirrel

No Lifer
May 24, 2003
69,690
13,319
126
www.betteroff.ca
Like megabits? I think it's true metric. Ex: 1Mbit = 1000 Kbit. I think? Now I'm not sure either... lol. For bytes I know it's 1MB = 1024KB and so on. But then it depends on the system, everyone does it their own way. Hard drive manufacturers use 1TB = 1000GB and so on. That's why a 1TB drive shows up as around 931GB, because 1,000,000,000,000/1024/1024/1024/1024 = 0.909TiB. Some systems will use TiB to represent the binary terabyte, but not all. So it gets really misleading.
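Here's that drive math as a quick Python sketch, just to show where the ~931GB figure comes from (assuming the vendor means 10^12 bytes and the OS divides by 1024 repeatedly):

# rough sketch: why a "1TB" drive shows up smaller in the OS
# assumes the vendor means 10^12 bytes and the OS reports binary units
marketed_bytes = 1_000_000_000_000         # 1 TB as sold (decimal)
print(marketed_bytes / 1024**3)            # ~931.32 -> what the OS labels "GB" (really GiB)
print(marketed_bytes / 1024**4)            # ~0.909  -> TiB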
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,080
136
The excuse they use is that it's a holdover from ye olden days of early university internet, when things were measured in bits because that's how fast they originally worked. 10 bits a second, 20, 30. Didn't make much sense to use bytes at the time.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
IIRC it's binary, but the real deception is they measure in kilo- and mega- bits, not bytes, so they can flash an impressive-sounding number at you. Divide those numbers by 8 for the real truth...and suddenly that 10Mbps (note the lowercase b...) connection doesn't sound so impressive.

I guess I should be clear that, personally, I understand both the decimal/metric (10^x) and binary (2^x) abbreviations perfectly well. I just wasn't sure about this use case, because I feel the broadly understood internet speed measurement (whether the majority attach any meaning to it, who knows) predates the official designation of the 2^x units as kibibit, mebibit, gibibit, etc. (and kibibyte, mebibyte, gibibyte...). If it was already broadly understood to mean one thing, I could see the reluctance to move to the proper term.

I'm also suspicious that the industry stealthily moved to using the real speed represented by Mbits versus mebibits. Worse, Mbit and the like are accepted but informal abbreviations for the binary kibi, mebi, gibi... so it could mean whatever the ethernet industry wants it to mean. Perhaps it switched, and all the equipment now falsely soothes us into believing what we want. Damn.
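For what it's worth, the gap between the decimal and binary prefixes widens with each step up, so which one a spec sheet means matters more at giga scale. A quick Python sketch (nothing vendor-specific, just the arithmetic):

# rough sketch: how far apart the decimal (SI) and binary (IEC) prefixes drift
for name, power in [("kilo vs kibi", 1), ("mega vs mebi", 2), ("giga vs gibi", 3)]:
    gap = (1024**power - 1000**power) / 1000**power * 100
    print(f"{name}: {gap:.1f}% difference")
# kilo vs kibi: 2.4%, mega vs mebi: 4.9%, giga vs gibi: 7.4%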
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
The excuse they use is that it's a holdover from ye olden days of early university internet, when things were measured in bits because that's how fast they originally worked. 10 bits a second, 20, 30. Didn't make much sense to use bytes at the time.

No, I understand that. I'm talking about the difference between kilo and kibi, mega and mebi, giga and gibi. Not bit vs byte, but two entirely separate scales (10^x vs 2^x).
 

sdifox

No Lifer
Sep 30, 2005
98,732
17,214
126
Was it always that way, metric that is?
I feel like I could swear I was taught it was binary. Perhaps binary was just involved in the many other components of networking I once knew about, and that knowledge is simply no longer retained.

As far as I remember, the network side was always metric. Blame Gates for the unit problem.

Actually I think the issue predates Gates.
 

Red Squirrel

No Lifer
May 24, 2003
69,690
13,319
126
www.betteroff.ca
Someone should make an OS that uses base 7 or something weird like that, just to screw with people. I could see Apple do that. lol

"We removed a bit, that takes courage!"

Since hard drives and networks are base 8 you could use the extra bit for parity or something lol.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Someone should make an OS that uses base 7 or something weird like that, just to screw with people. I could see Apple do that. lol

"We removed a bit, that takes courage!"

Since hard drives and networks are base 8 you could use the extra bit for parity or something lol.

Base 2, binary, not base 8: 2^x. Decimal is base 10 (10^x), hex is base 16 (16^x). A byte happens to be 8 bits, but that doesn't make anything base 8.
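Just to make the bases concrete, here's the same single-byte value written out in each (a throwaway Python sketch):

# rough sketch: one 8-bit value written in different bases
n = 255                # largest value a single byte can hold
print(bin(n))          # 0b11111111 (base 2)
print(oct(n))          # 0o377      (base 8 -- nothing here actually uses it)
print(n)               # 255        (base 10)
print(hex(n))          # 0xff       (base 16)
# a byte is 8 bits, but the bits themselves are base 2, not base 8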