Decimal vs Binary Prefixes

kevinsbane

Senior member
Jun 16, 2010
Do you use a decimal megabyte (10^6 bytes), or a binary megabyte (2^20 bytes)? Why?

So for those who insist on binary prefixes, if Windows switched to a decimal kilobyte/megabyte/gigabyte/terabyte, would you still care to use traditional JEDEC binary prefixes for your bytes? (I was going to say bits and bytes, but no one uses binary prefixes for bits! A gigabit connection = 10^9 bits/second, SATA II = 3x10^9 bits/sec, etc.)

One more thing: the official IEC binary prefixes (kibibyte, mebibyte, gibibyte) sound terrible to me. If we switch to using a decimal megabyte, then when would we ever need to use a binary mebibyte?
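For anyone who wants the actual numbers behind the argument, here is a minimal sketch (in Python) of the gap between the decimal and binary definitions, and why a "1 TB" drive shows up smaller in an OS that divides by powers of two:

```python
# Decimal (SI) megabyte vs. binary megabyte (IEC mebibyte).
MB_DECIMAL = 10**6   # 1,000,000 bytes
MB_BINARY = 2**20    # 1,048,576 bytes

# Per-"megabyte" discrepancy: 48,576 bytes.
print(MB_BINARY - MB_DECIMAL)

# The gap compounds with each prefix step: a drive sold as 1 TB
# (10**12 bytes) reads as roughly 931 "GB" when divided by 2**30.
print(10**12 / 2**30)  # ~931.32
```

That ~7% discrepancy at the tera scale is exactly what drives most of the complaints about drive capacities.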
 

BrightCandle

Diamond Member
Mar 15, 2007
I am a programmer; my world is powers of 2. There are many, many things that only work properly, or work well, near those boundaries. The computer is fundamentally based on binary math, and that's what I need. To make things easy for users they can throw out decimal megabytes and change the standard to mebibytes and other rubbish all they want, but it's not going to change how an expert uses the machine one tiny bit.

From a practical perspective, I don't think people without knowledge of computers understand it one way or the other anyway. I have family who don't know the difference between RAM and drive space, so the finer points of decimal versus binary megabytes are completely irrelevant to them. But to me it's a number in the format of the machine itself.
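To illustrate the "things that only work at those boundaries" point above, here is a minimal sketch (in Python, with hypothetical helper names) of two classic bit tricks, alignment checking and rounding up to a boundary, that are cheap precisely because the boundary is a power of two:

```python
# A power of two has exactly one bit set, so n & (n - 1) clears that
# bit and yields 0 only for powers of two.
def is_power_of_two(n: int) -> bool:
    return n > 0 and (n & (n - 1)) == 0

# Round size up to the next multiple of boundary, using a mask instead
# of division. Only valid when boundary is 2**k.
def align_up(size: int, boundary: int) -> int:
    assert is_power_of_two(boundary)
    return (size + boundary - 1) & ~(boundary - 1)

print(is_power_of_two(1024))  # True
print(align_up(1000, 256))    # 1024
```

This is the kind of arithmetic (memory alignment, page sizes, buffer allocation) that keeps programmers thinking in 2^n even if the OS displays capacities in decimal.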