1024 > 1000... and it cost me $180

Sy

Member
Aug 3, 2000
Just built a fileserver. I wanted 3TB of total storage in RAID 5. I knew that hard drive capacities are calculated using 1GB = 1,000,000,000 (1000^3) bytes, while everything else calculates 1GB = 1,073,741,824 (1024^3) bytes. This never used to bother me when I used smaller drives like a 40 gig, and Windows would recognize it as 37.25. But as hard drives get bigger and bigger, the gap seems to be widening. The 11x 300 gig hard drives I bought to run in RAID 5 (one drive's worth of capacity goes to parity) should equal:
10 x 300 = 3000 gig = 3TB

When actually it comes out to 300 gig = 279.4 gig:
10 x 279.4 = 2794 gig = 2.729TB

To actually get Windows to see 3TB I needed to use the 12th drive (which I was going to use as a hot spare):
11 x 279.4 = 3073.4 gig = 3.00TB
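
For anyone who wants to double-check the numbers, here's a quick Python sketch of the math. The 300 GB marketed size, the 1000^3 vs 1024^3 conversion, and the "RAID 5 loses one drive to parity" rule are the only assumptions baked in:

# Marketed (decimal) drive size vs. what Windows reports (binary),
# and how much usable space a RAID 5 array actually gives you.

MARKETED_GB = 300                       # drive sold as "300 GB"
DECIMAL_BYTES = MARKETED_GB * 1000**3   # manufacturers: 1 GB = 1,000,000,000 bytes
BINARY_GIB = DECIMAL_BYTES / 1024**3    # Windows: 1 GB = 1,073,741,824 bytes

def raid5_usable_gib(num_drives, per_drive_gib):
    """RAID 5 keeps one drive's worth of capacity for parity."""
    return (num_drives - 1) * per_drive_gib

print(f"One drive as Windows sees it: {BINARY_GIB:.1f} GiB")                     # ~279.4
print(f"11 drives in RAID 5: {raid5_usable_gib(11, BINARY_GIB) / 1024:.3f} TiB")  # ~2.729
print(f"12 drives in RAID 5: {raid5_usable_gib(12, BINARY_GIB) / 1024:.3f} TiB")  # ~3.001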

Anyone know why drive manufacturers are the only ones that use the 1 gig = 1,000,000,000 bytes rule? With drive sizes increasing, I really wish they would start marketing their drives with their actual size.

~Sy
 

thecoolnessrune

Diamond Member
Jun 8, 2005
It's easier and makes the customer believe he is getting more for his money. Think about what would happen if a company tried to be honest for a change: "This would only give me 2.729TB of space with this company, but I can get 3TB by going over there." So you see, if any company were to actually do the right thing and change, it would go nowhere. So everyone stays the same.