Originally posted by: ElFenix
the computer guys got it wrong. the REAL standard of gigabyte is 1 billion bytes. not 1,073,741,824 bytes. in fact all the manufacturers even state what a REAL gigabyte is. the hard drive makers are right, the computer user AND the operating system are wrong. if this thing isn't tossed by the judge i'll be amazed.
Exactly.
It's Microsoft that needs to update its operating system to reflect GB = 1,000,000,000.
All international standards say a kilobyte is 1000 bytes, not 1024. Kilo- is a Greek prefix meaning 1000, not 1024. Likewise a MB would be 1,000,000 bytes, and so on. It seems to me that calling a kilobyte 1024 bytes is a leftover dinosaur from the past: memory was made in chunks of 256, 512, 1024, etc., and 1024 was "about" a thousand, so we used an easier, rough estimation that was not precisely correct. It's a redefinition that I think needs to be dropped. 1000 = 1000 whether you're counting in binary, hex, or whatever. So what if it doesn't make for nice even numbers in binary.
If I say I have 6,400,000,000 bytes, is that 6.4 GB, 5.96 GB, or even 6,103.5 MB? The translation is clumsy if I use the "computer" conventions and wrong if I stick to the technically proper usage of GB. If I write it out as 6,400,000,000 bytes, well, it's 6,400,000,000 bytes. But if I abbreviate it, all of a sudden it's 5.96 GB? The common usage of the term seems to be changing, as evidenced by disk drive manufacturers.
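To see where those three numbers come from, here's a quick sketch (in Python, just as an illustration) dividing the same byte count by the decimal and the binary unit sizes:

```python
# The 6,400,000,000-byte figure from the post above, expressed both ways.
byte_count = 6_400_000_000

decimal_gb = byte_count / 1000**3   # SI / drive-maker convention: 10^9 bytes per GB
binary_gb = byte_count / 1024**3    # OS / "computer" convention: 2^30 bytes per GB
binary_mb = byte_count / 1024**2    # same convention at the MB level: 2^20 bytes

print(f"{decimal_gb:.2f} GB (decimal)")  # 6.40 GB (decimal)
print(f"{binary_gb:.2f} GB (binary)")    # 5.96 GB (binary)
print(f"{binary_mb:.1f} MB (binary)")    # 6103.5 MB (binary)
```

Same drive, three different-looking numbers, purely because of which divisor you pick.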