1 GB = ?

Daishiki

Golden Member
Nov 9, 2001
1,943
36
91
I was in an IRC chat and we were talking about HDDs, and someone asked how many MB are in a gigabyte. A bunch of us immediately said 1024, but one guy contended that, due to some new system, it is 1000 MB. Anyone got any idea?
 

RSMemphis

Golden Member
Oct 6, 2001
1,521
0
0
Ah, the old story.

Well, in principle, giga means billion, mega means million, kilo means thousand.

By that, 1 Gigabyte = 1000 Megabytes.

However, in the industry the commonly used quantities are the GByte and MByte, and since computers work in base 2, that's where the 1024 comes from:
2^10 = 1024

So, a GByte = 1024 MByte, or 1 GByte = 1.073741824 Gigabyte
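
A quick sketch of that conversion in Python (purely illustrative, not from any drive spec):

# binary "giga" vs decimal "giga"
GIB = 2**30        # 1,073,741,824 bytes
GB = 10**9         # 1,000,000,000 bytes
print(GIB / GB)    # 1.073741824, so one binary GByte is about 7.4% larger than one decimal Gigabyte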
 

halkebul

Senior member
Aug 26, 2002
320
0
0
Learned in college that 1GB = 1 x 2^10 x 2^10 x 2^10 bytes. If you go amongst the learned saying 1GB = 1000000000B then they are just going to laugh at you.
 

neuralfx

Golden Member
Feb 19, 2001
1,636
0
0
No. SI is the Système International, the system that defines standard units for scientific purposes such as grams, meters, etc. 1 GigaByte is EXACTLY 1024 Megabytes. I think what RS was trying to say is that
1 GB = 1024 x 1024 x 1024 bytes, i.e. 1 GB = 1024^3 bytes.

Actually, we should say that 1GB is DEFINED AS 1024^3 bytes. It is odd that we use the same Latin prefixes, but do not confuse the SI units with the standard computing units. The reason we use 1024 is not because it's a multiple of 2 (we could use the normal 1000 increments if that were the case); it is because it is a multiple of 8, which is a significant number in computing.

Also, note that bits do not follow this; bits use the standard prefixes with their standard values, so 1 Gigabit is DEFINED AS one billion bits.
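
To make the two conventions concrete, here is a tiny Python sketch (my own illustration, not anything official):

gigabyte_1024 = 1024**3   # 1,073,741,824 bytes, the 1024-based definition above
gigabit = 10**9           # 1,000,000,000 bits, using the standard decimal prefix
print(gigabit // 8)       # 125,000,000 bytes in one gigabit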
-neural
 

PowerMacG5

Diamond Member
Apr 14, 2002
7,701
0
0
As everyone who already posted here said, 1 Gigabyte is actually 1024 Megabytes because computers use the base-2 (0s and 1s) number system. But what nobody else stated is that when HDDs are marketed, they are marketed using base 10. So in the HDD manufacturers' eyes, 1 Gigabyte equates to 1000 Megabytes. For example, my WD1200JB is rated at 120 GB (in base 10); but in base 2, I only have 111 GB.
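
Rough math behind those numbers (a Python sketch; 120 GB is the drive's decimal rating):

advertised = 120 * 10**9     # 120,000,000,000 bytes as marketed
print(advertised / 2**30)    # ~111.76, roughly what the OS shows as 111 GB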
 

halkebul

Senior member
Aug 26, 2002
320
0
0
The 120GB label is just an approximation. Multiply the bytes per sector by the sectors per drive to obtain the amount available to the user. For example, if you have a Western Digital Special Edition 120GB hard drive, 512 x 234,441,648 bytes is available to the user: 512 bytes per sector and 234,441,648 sectors per drive.
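
Worked out in a quick Python sketch (using the sector figures quoted above):

bytes_per_sector = 512
sectors_per_drive = 234441648
capacity = bytes_per_sector * sectors_per_drive
print(capacity)            # 120,034,123,776 bytes
print(capacity / 2**30)    # ~111.79 binary gigabytes, the figure the OS reports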
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
KraziKid is exactly right: the manufacturers say that 1GB = 1000MB because it pumps up their advertisable size. Look at any HD advert and you'll see something in little letters that says "1GB = 1000MB". Actual usable size is usually closer to 1GB = 1024MB.

Kramer
 

CQuinn

Golden Member
May 31, 2000
1,656
0
0
the manufacturers say that 1GB = 1000MB because it pumps up their advertisable size

It is also the only way they have to give an accurate measure of drive capacity prior to an
OS putting a file system on the drive. Not all Operating systems format the drives in the
same way.
 

SocrPlyr

Golden Member
Oct 9, 1999
1,513
0
0
haha stop it
one gigabyte is only 1000 megabytes...
just check out the links I already sent
when computer makers first started, they were lazy and just used the SI terms, and with the smaller numbers it wasn't a problem... but now that we are talking so many gigs, etc., it really makes a difference...
I am ignoring how it is used and going strictly by what the standards say, and the answer is on the pages I provided, but here is a little bit just to show you (with a quick sketch after the numbers)...
1 megabyte = 1000000 (10^6) bytes
1 gigabyte = 1000000000 (10^9) bytes
1 mebibyte = 1048576 (2^20) bytes
1 gibibyte = 1073741824 (2^30) bytes
now that is fact... how companies actually use them I guess is up to them... the only ones right now that use the true megabyte definitions are the drive manufacturers... hard drives and zip drives and such...
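
a little Python sketch that just prints those four definitions (illustration only):

units = {
    "megabyte (10^6)": 10**6,
    "gigabyte (10^9)": 10**9,
    "mebibyte (2^20)": 2**20,
    "gibibyte (2^30)": 2**30,
}
for name, size in units.items():
    print(f"1 {name} = {size:,} bytes")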

Josh
 

Pink0

Senior member
Oct 10, 2002
449
0
0
The above post is correct. Many people confuse mebi and mega, as well as giga and gibi.
 

ScottMac

Moderator (Networking), Elite member
Mar 19, 2001
5,471
2
0
Actually, if you search the board, you'll find a bunch of posts asking (paraphrasing) "WTF, where'd all my drive space go?"

That's from the drive manufacturers saying a drive is 40 gig (meaning 40,000,000,000 bytes) while the OS counts a gig as 1024 megabytes (1,073,741,824 bytes)... so the OS-reported size comes out looking smaller than the drive's advertised capacity.

If you read the small print on the drive literature, you'll see that they use 1,000,000,000 as a gig.

FWIW

Scott
 

ChrisIsBored

Diamond Member
Nov 30, 2000
3,400
1
71
josh, you're an idiot.

1 Gig = 1024MB

You learn stuff like that as a basic premise in programming.

The HDD manufacturers are using a different system but your OS still reads 1024K as a MB as well.

 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Originally posted by: chrisisbored
josh, you're an idiot.

1 Gig = 1024MB

You learn stuff like that as a basic premise in programming.

The HDD manufacturers are using a different system but your OS still reads 1024K as a MB as well.

Charming post. SocrPlyr is still correct though.
 

Pink0

Senior member
Oct 10, 2002
449
0
0
You learn stuff like that as a basic premise in programming.

Wow, you had some bad teachers. What they should have told you is that this is a common misconception that has been perpetuated by Microsoft, and it is very wrong. A gigabyte is 1,000,000,000 bytes, not 1024 megs. 1024 megs is a gibibyte.

Data is not tied to an operating system. 1,000,000,000 bytes sitting on a hard drive is a gigabyte. That hard drive can be mounted on any number of operating systems. That data is not part of Windows or Microsoft's file system.

Your post was rude, ignorant, and wrong. Both ScottMac and SocrPlyr are completely right, and both provided supporting evidence, even links.

While it is true that Microsoft defines a gigabyte as 1024 megs, they are very, very, very, very wrong. Microsoft file systems are not the only file systems around, you know.
1 gigabyte = 1000000000 (10^9) bytes
1 gibibyte = 1073741824 (2^30) bytes
That's raw amounts of data. Microsoft can call it whatever they want, but 1 gigabyte = 1000000000 (10^9) bytes no matter what.
 

Pink0

Senior member
Oct 10, 2002
449
0
0
Learned in college that 1GB = 1 x 2^10 x 2^10 x 2^10 bytes. If you go amongst the learned saying 1GB = 1000000000B then they are just going to laugh at you.

Well, these "learned" people obviously aren't learned enough. If they were then they would know the basic units of measurement for data which includes the gibibyte. They can laugh all they want but what they're calling a gigabyte is actually a gibibyte and nothing will change this.
 

Pink0

Senior member
Oct 10, 2002
449
0
0
It is also the only way they have to give an accurate measure of drive capacity prior to an
OS putting a file system on the drive. Not all Operating systems format the drives in the
same way.

That's right, CQuinn. Some operating systems use the correct measurement for data.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"While it is true that microsoft defines a Gigabyte as 1024 megs they are very, very, very, very wrong. Microsoft file systems are not the only file systems around you know."

This "problem" is not MS's creation nor should they be blamed for continuing someone else's "mistake". I know some people like to blame all the world's problems on MS, but this is not their fault. It was a bad coincidence 2^10 and 10^3 were similar numbers as well as 2^20 and 10^6 and so fourth. Had that not been the case different prefixes would have been created for the binary representations. Way back when computers were first being used storage was very small and the percentage of difference between binary kilo and decimal kilo was nothing to worry about. Now with 200GB hard drives the percentage difference is starting to get very large so now people are starting to notice more than they ever have. This is the fault of computer scientists, but from long before MS was around.

"I think it's funny that in a day of 100+ Gig HDD's this argument is over 24 Meg."

I'm not really sure what you mean by that. A 100GB decimal drive is equivalent to 93.13GB binary, which is a loss of almost 7GB binary per 100GB of capacity, not 24MB.
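
For anyone who wants to check that figure, a quick Python sketch:

advertised = 100 * 10**9       # 100 GB as marketed (decimal)
binary_gb = advertised / 2**30
print(binary_gb)               # ~93.13 "gigabytes" as the OS counts them
print(100 - binary_gb)         # ~6.87, the "missing" almost-7GB per 100GB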
 

Trevelyan

Diamond Member
Dec 10, 2000
4,077
0
71
So... uh... let me see if I get this right...

1 gigabyte = 1,000 megabytes = 1,000,000 kilobytes = 1,000,000,000 bytes

And the whole "1 gigabyte = 1024 megabytes" is just something that Microsoft OS's report the space to be? So where did that number 1024 come from, and why is Microsoft using it? Is it actually the case that 1 gibibyte = 1024 megabytes?

 

Pink0

Senior member
Oct 10, 2002
449
0
0
it's because there are 8 bits in a byte. 8x128=1024. 8bits in a byte. in reality 1000 bytes in a kilobyte and 1000 kilobytes in a gigabyte but Microsoft uses 1024 of each because it's a multiple of 8, since you're measuring bytes, which are 8 bits. Get it? Some file systems don't use this convention though, so if you take a file that's 1 gigabyte on a Windows box to, say, a *nix terminal, it will actually report as being larger.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"And the whole "1 gigabyte = 1024 megabytes" is just something that Microsoft OS's report the space to be?"

No, all x86-compatible hardware uses 1024 Megabytes = 1 Gigabyte to determine hard drive capacity. For some reason Pink seems to think MS is the one that created this problem, but they are just using what the hardware tells them. ATA drives determine capacity using binary numbers, which is why every ATA hard drive capacity barrier is in some way linked to a power of 2. When I boot my system, the Adaptec SCSI controller detects 18GB drives as 17GB, and the add-in Promise ATA controller also reports GiB, both of them initializing before MS has any say in what the system is doing. The problem begins at the hardware level, so if you want to blame someone, blame the people that came up with the ATA and SCSI standards.

"it's because there are 8 bits in a byte. 8x128=1024. 8bits in a byte."

Huh? 1024 comes from 2^10. It has nothing to do with bits vs bytes.

2^10 bytes = 1024 bytes = 1 kilobyte
2^20 bytes = 1,048,576 bytes = 1024 kilobytes = 1 megabyte
2^30 bytes = 1,073,741,824 bytes = 1,048,576 kilobytes = 1024 megabytes = 1 gigabyte

"in reality 1000 bytes in a kilobyte and 1000 kilobytes in a gigabyte"

1000 kilobytes in a megabyte, not gigabyte