Originally posted by: Pariah
Originally posted by: Rhin0
Oh, and here is a post I saw on a review about the Hitachi, some guy's little rant:
"chugger,8/26/2004 12:16:55 PM
Funny huh? Computers use binary, people use decimal, used to be computer people used binary too. A kilobyte of memory was and still is 2^10 bytes, a megabyte of memory was and still is 2^20 bytes. And a 20 megabyte hard drive was 20 * 2^20 bytes. But the 250 gigabyte hard drive I have today is reported by windows as being 232 gig. Overhead? Hardly. 232 * 2^30 is about 250 billion bytes decimal which sounds like more, sells better. Somewhere along the way I suppose some marketing department decided they could increase sales by rating there drive in decimal rather than binary. Of course to stay competitive, one does it they all must. Floppy capacity, you used to get what was stated on the label, still do, 700mb cdr you get the true 700 * 2^20 bytes. Jump to DVDR and all of a sudden it's decimal. A 4.7 gig DVDR is a computer device used by computer people to backup computer data, all base two, but it goes to base 10 like the hard drives. So a computer gig and a people gig is two different things, base 2 vs. base 10 and you can only get something like 4.36 computer gigs on a 4.7 gig people DVDR. Crazy. Like try to find a gas station where the price doen't end in point nine cents. Sorry to rave, just a peeve of mine. "
Not true. It's not a marketing stunt, because hard drives have been rated in decimal, if not forever, then at least since long before computers were mainstream, which is also long before the binary/decimal difference made any difference. Claiming this is all a marketing stunt would mean claiming that hard drive manufacturers had an inhuman amount of foresight, with an incredibly long-term plan to trick people out of their money. The binary/decimal mess can be blamed on the geeks of old for not coming up with a different naming scheme for binary counting. When dealing with bytes and kilobytes, the difference was so small that it didn't matter; unfortunately, decades later, dealing with gigabytes and terabytes, we now see the error of their ways.
It should also be noted that data transmission rates are measured in decimal, not binary. The PCI bus is 133.3MB/s, right? Only in decimal; it's about 127MB/s in binary. A 28.8K modem should mean it transfers 29,491 bits/sec, right? Nope, it's decimal, so only 28,800 bits/sec. Dual-channel DDR400 supposedly gets 6.4GB/s of bandwidth according to the tech specs, but that's decimal; in binary, you're getting under 6GB/s. Why don't people ever complain about marketers ripping off customers in those situations?
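The transfer-rate examples above work out the same way; here's a quick sketch of the numbers (variable names are just for illustration):

```python
# Specs are quoted in decimal units, so the binary equivalents come out smaller.
MiB = 2**20
GiB = 2**30

pci = 133.3e6 / MiB    # 133.3 MB/s PCI bus          -> ~127.1 MiB/s
ddr = 6.4e9 / GiB      # 6.4 GB/s dual-channel DDR400 -> ~5.96 GiB/s
modem = 28.8 * 1000    # "28.8K" modem: 28,800 bits/s, not 28.8 * 1024 = 29,491

print(round(pci, 1), round(ddr, 2), int(modem))
```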
Because the truth is, even most tech-savvy people have no idea when binary is used and when decimal is. Computers are not all binary all the way down, so just because a figure isn't in binary doesn't mean the industry is trying to rip you off.