Hard Drive Formatted Size

RonAKA

Member
Feb 18, 2007
165
0
0
Currently formatting a Seagate 320 GB SATA300 drive. Windows XP is showing it will format to 305235 MB. Does that make sense? Or, did they really give me a 300 GB drive?
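Working backwards from the reported figure is a quick sanity check; a minimal sketch, assuming XP's "MB" actually means 1024^2 bytes:

```python
# Work backwards from the capacity Windows XP reports, assuming that
# XP's "MB" means 1024^2 bytes (a mebibyte).

reported_mb = 305_235
raw_bytes = reported_mb * 1024**2

# The drive actually holds slightly MORE than the advertised 320 * 10^9 bytes.
print(f"{raw_bytes:,} bytes = {raw_bytes / 10**9:.2f} decimal GB")
# → 320,062,095,360 bytes = 320.06 decimal GB
```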
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
1,000,000,000 vs. 1024^3 works out to roughly 93% of stated capacity: 320 * 0.9313 ≈ 298 GB. So you're 7 GB ahead, w00t!
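The arithmetic above can be sketched as follows (a minimal example; it assumes the drive holds exactly its advertised number of bytes, though real drives are often a touch larger):

```python
# Convert an advertised (decimal) capacity into the binary units that
# Windows XP reports.

ADVERTISED_GB = 320                     # marketing gigabytes of 10^9 bytes

total_bytes = ADVERTISED_GB * 10**9
mib = total_bytes / 1024**2             # what XP labels "MB"
gib = total_bytes / 1024**3             # what XP labels "GB"

print(f"{mib:,.0f} MB")                 # → 305,176 MB
print(f"{gib:.1f} GB")                  # → 298.0 GB
print(f"ratio: {10**9 / 1024**3:.4f}")  # → ratio: 0.9313
```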
 

RonAKA

Member
Feb 18, 2007
165
0
0
Thanks, that makes me feel better. I knew the 1024 (power of 2) number, but thought 1 GB equaled 1024 MB.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: RonAKA
but thought 1 GB equaled 1024 MB.

It is, but companies like to use their own arbitrary units when they sell hard drives. Think of it like buying a car advertised at 30 mpg that really gets only 25 because "we were using Klingon gallons, not US gallons."

In the fantasy world of hard drive sales, 1GB = 1000MB. In the real world, 1GB = 1024MB.
 

AnitaPeterson

Diamond Member
Apr 24, 2001
6,050
639
126
I just formatted and installed a fresh new 320 GB Seagate in my system...
Windows reports a glorious 298 GB available :D
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
240
106
And don't forget: when you format to NTFS, you lose a small amount of space to system management areas, namely the metadata files, the Master File Table (MFT), and its reserved zone. Nominally this averages about 5 MB.

NTFS
 

Jiggz

Diamond Member
Mar 10, 2001
4,329
0
76
So now we know! It is advertised as 1000 MB = 1 GB, while in reality 1024 MB = 1 GB. The ratio at a single prefix step is 1000/1024 ≈ 97.7 percent, but the discrepancy compounds at every step from bytes on up, so the real capacity works out to (1000/1024)^3 ≈ 93.1 percent of the advertised figure. So after formatting, do not expect your capacity to realistically exceed that percentage. In fact it will be less, as mentioned earlier, due to reserved data files, FAT, etc.
 

ForumMaster

Diamond Member
Feb 24, 2005
7,792
1
0
Originally posted by: Jiggz
So now we know! It is advertised as 1000 MB = 1 GB, while in reality 1024 MB = 1 GB. The ratio at a single prefix step is 1000/1024 ≈ 97.7 percent, but the discrepancy compounds at every step from bytes on up, so the real capacity works out to (1000/1024)^3 ≈ 93.1 percent of the advertised figure. So after formatting, do not expect your capacity to realistically exceed that percentage. In fact it will be less, as mentioned earlier, due to reserved data files, FAT, etc.

Actually, assuming you've ever bought a retail drive, you'll notice that the box says they count gigabytes as 1,000,000,000 bytes. Look on the box of an iPod and it will say "1GB = 1 billion bytes; actual formatted capacity less," which I'm quoting from my iPod mini's box.
 

Madwand1

Diamond Member
Jan 23, 2006
3,309
0
76
Originally posted by: ShawnD1
In the fantasy world of hard drive sales, 1GB = 1000MB. In the real world, 1GB = 1024MB.

No, the HD manufacturers get this right; Windows still gets this wrong. G is an SI prefix, and it means 10^9. If you mean base 2, you should use GiB or MiB, etc. Linux has adopted this convention for some time now, and they're teaching it in schools. It's confusing and incorrect to switch the meaning of G from base 10 to base 2 depending on context -- it's error prone and sloppy.

http://en.wikipedia.org/wiki/Gibibyte
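A small helper makes the distinction concrete; a sketch following the IEC binary-prefix convention described above (the function name is just for illustration):

```python
# Render a byte count in both SI (GB = 10^9 bytes) and IEC
# (GiB = 2^30 bytes) units.

def both_units(nbytes: int) -> str:
    return f"{nbytes / 10**9:.2f} GB = {nbytes / 2**30:.2f} GiB"

print(both_units(320_000_000_000))  # → 320.00 GB = 298.02 GiB
```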
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Originally posted by: RonAKA
Thanks, that makes me feel better. I knew the 1024 (power of 2) number, but thought 1 GB equaled 1024 MB.

By convention, 1 GB = 1024 MB is used only for RAM (because the manufacturing/design process for RAM is most efficient if sizes double).

For other purposes (including tapes, disks, networks, data-capture devices, etc.), 1 GB has traditionally equaled 1000 MB, 1 MB = 1000 kB, etc., because such products weren't traditionally constrained to specific sizes; e.g., tapes could be made to any length.

Things have been confused because different manufacturers have had their own interpretations; e.g., MS uses 1 GB = 1024 MB for file sizes and storage capacity, while Seagate has in the past used 1 GB = 1000 MB = 1,024,000 kB, etc.

 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
The measurement doesn't change just because it's an HD.

1024 MB = 1 GB
1024 KB = 1 MB

And for the kicker... you get more usable space from a striped pair of 160 GB HDs than from a single 320 GB HD (305 GB usable).
 

Markbnj

Elite Member <br>Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
Originally posted by: Madwand1
Originally posted by: ShawnD1
In the fantasy world of hard drive sales, 1GB = 1000MB. In the real world, 1GB = 1024MB.

No, the HD manufacturers get this right; Windows still gets this wrong. G is an SI prefix, and it means 10^9. If you mean base 2, you should use GiB or MiB, etc. Linux has adopted this convention for some time now, and they're teaching it in schools. It's confusing and incorrect to switch the meaning of G from base 10 to base 2 depending on context -- it's error prone and sloppy.

http://en.wikipedia.org/wiki/Gibibyte

They get it right according to the 1998 IEC standard adoption. The historical context is what it is, so it's a little off base to say that "Windows still gets this wrong." That historical context goes back further than Windows, and Microsoft. Yes, it makes sense to normalize the use of these terms across all scientific and technical disciplines. No, it's not a big enough deal that companies are eager to change the way they count megabytes. The "correct" way will probably be adopted slowly as people who grew up with the old way retire, and it becomes less important to maintain backward compatibility with that way of thinking about it.
 

archcommus

Diamond Member
Sep 14, 2003
8,115
0
76
It is completely, 100% accepted that G means 2^30 in the computer world and NOT 10^9, thus nullifying the need for something like GiB. This is what they teach at the two universities where I have learned this stuff, both of which are fairly high-end schools.
 

Madwand1

Diamond Member
Jan 23, 2006
3,309
0
76
Originally posted by: Markbnj
They get it right according to the 1998 IEC standard adoption. The historical context is what it is, so it's a little off base to say that "Windows still gets this wrong." That historical context goes back further than Windows, and Microsoft.

I don't get your point here, because the 1998 IEC standard defined GiB, etc., which therefore removed the binary interpretation from GB. My point is that it's 2007, MS has a brand-new OS, and they still misuse GB in terms of the 1998 standard and SI as a whole, to confusion aplenty and no real benefit.

The "we always did it that way" argument is not a good defense for Microsoft, as they regularly change the way things are done and make lofty claims about being progressive.

http://physics.nist.gov/cuu/Units/binary.html

Originally posted by: Markbnj
Yes, it makes sense to normalize the use of these terms across all scientific and technical disciplines. No, it's not a big enough deal that companies are eager to change the way they count megabytes. The "correct" way will probably be adopted slowly as people who grew up with the old way retire, and it becomes less important to maintain backward compatibility with that way of thinking about it.

We're supposed to be technically informed here. When AT posters still don't get it, in the face of the facts, it's a shame. (I'm not referring to Markbnj here.) Measurements of all things, especially scientific measurements, should not be so weakly standardized as to be off by 5% or more just due to the interpretive whims of the writer or reader.

We all need to learn this at some point, so it's fine to get this "wrong" up to a point, to understand when others use the old terminology, and to make an effort to get it right going forward.

The only parts I have real problems with are assertions such as "1 GB = 1024 MB" after one has heard about GiB. That is simply ignorant, and a perpetuation of sloppy, error-prone terminology.
 

archcommus

Diamond Member
Sep 14, 2003
8,115
0
76
Originally posted by: Madwand1
Originally posted by: Markbnj
They get it right according to the 1998 IEC standard adoption. The historical context is what it is, so it's a little off base to say that "Windows still gets this wrong." That historical context goes back further than Windows, and Microsoft.

I don't get your point here, because the 1998 IEC standard defined GiB, etc., which therefore removed the binary interpretation from GB. My point is that it's 2007, MS has a brand-new OS, and they still misuse GB in terms of the 1998 standard and SI as a whole, to confusion aplenty and no real benefit.

The "we always did it that way" argument is not a good defense for Microsoft, as they regularly change the way things are done and make lofty claims about being progressive.

http://physics.nist.gov/cuu/Units/binary.html

Originally posted by: Markbnj
Yes, it makes sense to normalize the use of these terms across all scientific and technical disciplines. No, it's not a big enough deal that companies are eager to change the way they count megabytes. The "correct" way will probably be adopted slowly as people who grew up with the old way retire, and it becomes less important to maintain backward compatibility with that way of thinking about it.

We're supposed to be technically informed here. When AT posters still don't get it, in the face of the facts, it's a shame. (I'm not referring to Markbnj here.) Measurements of all things, especially scientific measurements, should not be so weakly standardized as to be off by 5% or more just due to the interpretive whims of the writer or reader.

We all need to learn this at some point, so it's fine to get this "wrong" up to a point, to understand when others use the old terminology, and to make an effort to get it right going forward.

The only parts I have real problems with are assertions such as "1 GB = 1024 MB" after one has heard about GiB. That is simply ignorant, and a perpetuation of sloppy, error-prone terminology.
No, this is how they still teach it in universities. I am expected on an exam to put "GB" or "MB," and this is at two different accredited schools. I'm not about to change when it's not necessary, and really I can't change anyway.

I'm never mixing computer talk and scientific measurements in the same document. It can be accepted that G equals 2^30 in the computer world, and HDD manufacturers just decide to use the 10^9 definition.
 

Madwand1

Diamond Member
Jan 23, 2006
3,309
0
76
Originally posted by: archcommus
It can be accepted that G equals 2^30 in the computer world, and HDD manufacturers just decide to use the 10^9 definition.

This is an artificial and fragile distinction. Hard drives and networks are very much a part of the computer world. G often does not mean 2^30 in the computer world.

Your professors are old and out of date. :)

Edit: BTW, are you implying that if you started writing the more clear and unambiguous "GiB" in your answers that your professors would have a problem with it? Perhaps you should ask one of them.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Originally posted by: archcommus

I'm never mixing computer talk and scientific measurements in the same document. It can be accepted that G equals 2^30 in the computer world, and HDD manufacturers just decide to use the 10^9 definition.

Really?

While binary interpretations have been used for specifying system RAM quantities and file sizes, that is about the extent of their use.

Nominal tape and disk capacities have been a mixture of both interpretations right back to the time they were developed (in the context of bulk storage, the decimal interpretation has been the most popular). Some manufacturers have preferred one interpretation over another, or sometimes developed a bizarre hybrid.

What about gigabyte versus gigabit? Except in the context of bare solid-state memory, where the binary interpretation is used, mega- and gigabit virtually always imply the decimal interpretation.

Speeds, such as bandwidths (MB/s, GB/s, Gbit/s) and clock speeds (MHz, GHz), are rarely, if ever, measured with a binary interpretation.

The point is that the *only* time the binary interpretation has been used universally is in describing RAM sizes. In virtually every other case, including bulk storage, the decimal interpretation has been used as well.

Using GB is therefore ambiguous, as in many cases it would be reasonable to interpret it in a decimal context. If you specify GiB, there is no ambiguity at all. Why prefer a system that is prone to ambiguity over one that is explicit?

Of course, change is painful and old habits die hard, but this is a recurrent cause of confusion, and it could lead to real problems as capacities get progressively larger. While the discrepancy is about 7% at giga, at tera it's about 9%, and at peta about 11%. The longer it's left, the worse things will get.
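The widening gap is easy to tabulate; a quick sketch:

```python
# Show how the decimal-vs-binary discrepancy compounds with each
# prefix step (kilo/kibi, mega/mebi, ...).

names = ["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi", "peta/pebi"]

for step, name in enumerate(names, start=1):
    ratio = 1000**step / 1024**step
    print(f"{name}: binary is {ratio:.1%} of decimal "
          f"(gap {1 - ratio:.1%})")
```

At the kilo step the gap is only 2.3%, but it compounds to roughly 6.9% at giga and 11.2% at peta.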
 

Markbnj

Elite Member <br>Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
Originally posted by: Madwand1
Originally posted by: Markbnj
They get it right according to the 1998 IEC standard adoption. The historical context is what it is, so it's a little off base to say that "Windows still gets this wrong." That historical context goes back further than Windows, and Microsoft.

I don't get your point here, because the 1998 IEC standard defined GiB, etc., which therefore removed the binary interpretation from GB. My point is that it's 2007, MS has a brand-new OS, and they still misuse GB in terms of the 1998 standard and SI as a whole, to confusion aplenty and no real benefit.

The "we always did it that way" argument is not a good defense for Microsoft, as they regularly change the way things are done and make lofty claims about being progressive.

http://physics.nist.gov/cuu/Units/binary.html

Originally posted by: Markbnj
Yes, it makes sense to normalize the use of these terms across all scientific and technical disciplines. No, it's not a big enough deal that companies are eager to change the way they count megabytes. The "correct" way will probably be adopted slowly as people who grew up with the old way retire, and it becomes less important to maintain backward compatibility with that way of thinking about it.

We're supposed to be technically informed here. When AT posters still don't get it, in the face of the facts, it's a shame. (I'm not referring to Markbnj here.) Measurements of all things, especially scientific measurements, should not be so weakly standardized as to be off by 5% or more just due to the interpretive whims of the writer or reader.

We all need to learn this at some point, so it's fine to get this "wrong" up to a point, to understand when others use the old terminology, and to make an effort to get it right going forward.

The only parts I have real problems with are assertions such as "1 GB = 1024 MB" after one has heard about GiB. That is simply ignorant, and a perpetuation of sloppy, error-prone terminology.

You can make the same argument about the metric system, and look how well the conversions of the U.S. and U.K. have gone in that regard. Convention is a powerful motivator. Keep up the good fight, but be prepared for a long battle. The only issue I had with your post is the part where you seem to imply it was Microsoft's responsibility to convert to the correct terminology in Windows. It obviously was not; they're free to use whatever terminology their customers and partners understand.
 

archcommus

Diamond Member
Sep 14, 2003
8,115
0
76
Originally posted by: Madwand1
Originally posted by: archcommus
It can be accepted that G equals 2^30 in the computer world, and HDD manufacturers just decide to use the 10^9 definition.

This is an artificial and fragile distinction. Hard drives and networks are very much a part of the computer world. G often does not mean 2^30 in the computer world.

Your professors are old and out of date. :)

Edit: BTW, are you implying that if you started writing the more clear and unambiguous "GiB" in your answers that your professors would have a problem with it? Perhaps you should ask one of them.
Well, the professors could very well be out of date, but my textbook, "Computer Systems Design and Architecture," Second Edition, Heuring and Jordan, says:

In normal commercial and engineering usage, the term kilo equals 10^3... The powers of 2 are so commonly found in the treatment of computers, because of the binary nature of the machines, that the preceding terms have been co-opted to represent the nearest power of 2... You should find it easy to distinguish the two usages. The powers of 2 are most often used in describing memory capacity, whereas the powers of 10 are used to describe clock frequencies, for example.

So there you have it. The terms have been "co-opted," it's acceptable, and you should be able to "distinguish." Professors would not be okay with me using another notation; it just makes it harder for them to grade.

I agree with you: ambiguity in the technical world is not good, and I prefer to streamline and simplify things as well. But in this case it just "works," and even textbook authors are okay with the double meaning of the terms.
 

TJones2

Senior member
Oct 27, 2004
278
0
76
Originally posted by: AnitaPeterson
I just formatted and installed a fresh new 320 GB Seagate in my system...
Windows reports a glorious 298 GB available :D

I hate you! I got 297! Gimme my 1gb back! :laugh: