What's 1GB equal to?

Lvis

If you're buying a hard drive, it's 1000 MB, as they measure it.

In reality, it's 1024 MB.
 

sash1

In actuality it's 1024 MB, but they really mean 1000 MB.

`K
 

DeviousTrap

Originally posted by: Lazee
So Google is wrong?

Technically, yes, but since hard drive makers use 1000 MB, and their email boxes are obviously stored on a hard drive, they could be considered correct :p
 

Ionizer86

Technically, a Gigibyte is 2^30 bytes and a Gigabyte is 10^3 bytes. But gigabyte has come to stand for 10^3, at least with Google and with the HDD makers, whereas Windows still reads a gig as 2^30 bytes.
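A quick Python sketch of the two conventions described above; the 500 GB drive here is just a made-up example, not from the thread:

GB_DECIMAL = 10**9   # what the drive label means by "1 GB"
GB_BINARY = 2**30    # what Windows of this era means by "1 GB"

drive_label_gb = 500                          # hypothetical "500 GB" drive
bytes_on_drive = drive_label_gb * GB_DECIMAL  # 500,000,000,000 bytes
print(bytes_on_drive / GB_BINARY)             # ~465.66, why the OS shows fewer "GB" than the box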
 

Howard

Originally posted by: Ionizer86
Technically, a Gigibyte is 2^30 bytes and a Gigabyte is 10^3 bytes. But gigabyte has come to stand for 10^3, at least with Google and with the HDD makers, whereas Windows still reads a gig as 2^30 bytes.
 

Zee

Originally posted by: Howard
Originally posted by: Ionizer86
Technically, a Gigibyte is 2^30 bytes and a Gigabyte is 10^3 bytes. But gigabyte has come to stand for 10^3, at least with Google and with the HDD makers, whereas Windows still reads a gig as 2^30 bytes.

how nice. Perhaps we should have them change the definition of a foot to equal 10 inches... :disgust:
 

mugs

Originally posted by: Ionizer86
Technically, a Gigibyte is 2^30 bytes and a Gigabyte is 10^3 bytes. But gigabyte has come to stand for 10^3, at least with Google and with the HDD makers, whereas Windows still reads a gig as 2^30 bytes.

Gibibyte, not gigibyte
 

mugs

Originally posted by: Lazee
Originally posted by: Howard
Originally posted by: Ionizer86
Technically, a Gigibyte is 2^30 bytes and a Gigabyte is 10^3 bytes. But gigabyte has come to stand for 10^3, at least with Google and with the HDD makers, whereas Windows still reads a gig as 2^30 bytes.

how nice. Perhaps we should have them change the definition of a foot to equal 10 inches... :disgust:

Don't blame Microsoft; almost ALL computer users consider a gigabyte to be 2^30 bytes, and the ones that don't are just being pedantic. The gibi/mebi/kibi/etc. prefixes were created by anal-retentive scientists who were pissed off that computer users were using the SI prefixes improperly. Gibi stands for giga-binary, and so on and so forth.
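For reference, the two prefix families being argued about here, in a small Python sketch (values only, with the naming as mugs describes it):

si = {"kilo": 10**3, "mega": 10**6, "giga": 10**9}    # SI, powers of ten
iec = {"kibi": 2**10, "mebi": 2**20, "gibi": 2**30}   # binary, "giga-binary" and friends

for (s, sv), (b, bv) in zip(si.items(), iec.items()):
    # the gap grows with the prefix: ~2.4% at kilo, ~4.9% at mega, ~7.4% at giga
    print(f"{b} = {bv:>13,}   {s} = {sv:>13,}   ratio {bv / sv:.4f}")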
 

So

Originally posted by: Ionizer86
Technically, a Gigibyte is 2^30 bytes and a Gigabyte is 10^3 bytes. But gigabyte has come to stand for 10^3, at least with Google and with the HDD makers, whereas Windows still reads a gig as 2^30 bytes.

10^3 bytes = a kilobyte.

You mean 10^9 bytes.
 

AyashiKaibutsu

Giga is the prefix for billion, so saying it's 1000 megabytes is right. However, in the computer world everything is done in powers of two, so it's 2^30 bytes, which makes 1024 megabytes. This thread is the first I've heard of gibi... and I'm a third-year computer science major...
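In code form, the two readings in that post (just a sketch):

print(10**9 // 10**6)   # 1000 -> 1 GB = 1000 MB if giga and mega are powers of ten
print(2**30 // 2**20)   # 1024 -> 1 GB = 1024 MB if everything is powers of two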
 

n0cmonkey

In base 2 it's 1024. Computers work on base 2. So in the computer world, a gigabyte is 1024. In base 10 it's 1000. Hard drive manufacturers use base 10 for some unknown reason (probably to inflate the sizes they report). The people that made up gibi are just trying to redefine math.
 

Chaotic42

Originally posted by: n0cmonkey
In base 2 it's 1024. Computers work on base 2. So in the computer world, a gigabyte is 1024. In base 10 it's 1000. Hard drive manufacturers use base 10 for some unknown reason (probably to inflate the sizes they report). The people that made up gibi are just trying to redefine math.

I don't know.

One byte is 8 bits. Be it in base 2, 8, or 5000, a byte is 8 bits. One billion is the same quantity. One kilobyte is 1000 bytes or 8000 bits. If we're going to say that kilo- means one thousand, mega- one million, and giga- one billion, then 1 gigabyte is 1,000,000,000 bytes in any number system.
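A tiny sketch of that point: the same quantity written as a decimal, binary, or hex literal is still the same quantity (Python literals, not from the post):

thousand_dec = 1000
thousand_bin = 0b1111101000   # the same value, written in base 2
thousand_hex = 0x3E8          # the same value, written in base 16

print(thousand_dec == thousand_bin == thousand_hex)   # True
print(10**9, 2**30)   # 1000000000 vs 1073741824, genuinely different quantities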
 

n0cmonkey

Originally posted by: Chaotic42
Originally posted by: n0cmonkey
In base 2 it's 1024. Computers work on base 2. So in the computer world, a gigabyte is 1024. In base 10 it's 1000. Hard drive manufacturers use base 10 for some unknown reason (probably to inflate the sizes they report). The people that made up gibi are just trying to redefine math.

I don't know.

One byte is 8 bits. Be it in base 2, 8, or 5000, a byte is 8 bits. One billion is the same quantity. One kilobyte is 1000 bytes or 8000 bits. If we're going to say that kilo- means one thousand, mega- one million, and giga- one billion, then 1 gigabyte is 1,000,000,000 bytes in any number system.

No. 1024 base 2 = 1000 base 10.
 

AyashiKaibutsu

Base two is binary. 1024 decimal/base 10 = 10000000000 binary/base 2. That's why computers work in powers of two: they make nice numbers in binary.
 

n0cmonkey

Originally posted by: AyashiKaibutsu
Base two is binary. 1024 decimal/base 10 = 10000000000 binary/base 2. That's why computers work in powers of two: they make nice numbers in binary.

1000 in binary/base 2 = 1111101000
If you add up 1111101000 in binary correctly, it comes out to exactly 1000, not 1024.
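Checking the conversions from these last few posts in Python:

print(bin(1024))             # 0b10000000000, i.e. 2**10, a 1 followed by ten 0s
print(bin(1000))             # 0b1111101000
print(int("1111101000", 2))  # 1000, not 1023 and not 1024
print(2**10, 10**3)          # 1024 vs 1000, the whole disagreement in one line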