- Jul 22, 2000
- 4,694
- 0
- 0
8 bits = 1 byte, so 12mbits = 1.5mbytes, but not really ... isn't 1mbyte = 1,024,000 bytes? So then how many mbytes is 12mbits?
Originally posted by: Maverick2002
8 bits = 1 byte, so 12mbits = 1.5mbytes, but not really ... isn't 1mbyte = 1,024,000 bytes? So then how many mbytes is 12mbits?
Originally posted by: dullard
1 megabyte is 1000*1000 bytes = 1,000,000 bytes.
1 mebibyte is 1024*1024 bytes = 1,048,576 bytes.
Originally posted by: Maverick2002
Umm ok, so following that FAQ, I'm still confused. I don't get the math with 12mbits. Ok so it's 1.5mbytes, or 1,500,000 bytes. Then what? How do I know how many REAL megabytes there are?
No, he's correct. See the FAQ.
Originally posted by: VBboy
Originally posted by: dullard
1 megabyte is 1000*1000 bytes = 1,000,000 bytes.
1 mebibyte is 1024*1024 bytes = 1,048,576 bytes.
What are you smoking.
1 MB (Megabyte) = 1024 * 1024 bytes
Please read the FAQ, you are making a very common mistake. 1 MiB (Mebibyte) = 1024*1024 bytes. 1 MB (Megabyte) = 1000*1000 bytes. Since the words are so close, most people have switched them around and use the incorrect definition. Even the Windows operating system uses both versions.
Originally posted by: VBboy
What are you smoking.
1 MB (Megabyte) = 1024 * 1024 bytes
All the necessary and correct answers are in my first post:
Originally posted by: Maverick2002
Umm ok, so following that FAQ, I'm still confused. I don't get the math with 12mbits. Ok so it's 1.5mbytes, or 1,500,000 bytes. Then what? How do I know how many REAL megabytes there are?
but that's just 1.5/1.024 ... doesn't make sense, cause then that would mean 1 megabyte = 1,024,000 bytes.
1,500,000 Bytes = 1.46484375 "real" megabytes.
Then sadly you have been teaching it wrong. Look it up in computer books, or dictionaries, or the Anandtech FAQ. I know that mebibyte and megabyte are easy to confuse, so as a teacher you should make it very clear to your students.
Originally posted by: VBboy
I teach Computer Conversion From Bits to Bytes at MIT. I know wtf I'm talking about!
Originally posted by: dullard
12 mbits = 12 megabits = 1.5 megabytes = 1.5 Mbytes = 1.5 MB = 1.5 * 1000 * 1000 bytes = 1,500,000 bytes.
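The chain of conversions above can be checked with a few lines of Python (a quick sketch of the arithmetic; the constant names are mine, not from the thread):

```python
# Conversions discussed in the thread: 12 megabits expressed in bytes,
# then in decimal megabytes (MB, 10^6 bytes) and binary mebibytes (MiB, 2^20 bytes).

BITS_PER_BYTE = 8
MB = 1000 * 1000        # megabyte (SI, decimal)
MiB = 1024 * 1024       # mebibyte (binary)

bits = 12 * 1000 * 1000               # 12 megabits
total_bytes = bits // BITS_PER_BYTE   # 1,500,000 bytes

print(total_bytes)            # 1500000
print(total_bytes / MB)       # 1.5 decimal megabytes
print(total_bytes / MiB)      # ~1.4305 mebibytes
```

Note the two divisors give different answers, which is exactly the MB-vs-MiB confusion being argued about in this thread.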
Simple answers: all hard drive manufacturers use the 1000*1000 definition of MB. Parts of Windows use it too. So those are a handful of major companies that use it. However, you are right that 99.9% of individuals use the other, incorrect definition, and yes, part of Windows uses the wrong one too.
dude that's the first time i've heard of "mebi" etc. are u sure that's used these days?! i am pretty sure when most people say megabytes, they mean 2^20 bytes. i can't remember seeing any differently.
A lot of people mean 2^20 when they say mega. The correct mathematical definition of mega, used since before computers were around, is 10^6; it's just that most people use it incorrectly nowadays.
dude that's the first time i've heard of "mebi" etc. are u sure that's used these days?! i am pretty sure when most people say megabytes, they mean 2^20 bytes. i can't remember seeing any differently.
Originally posted by: dullard
Simple answers: all hard drive manufacturers use the 1000*1000 definition of MB. Parts of Windows use it too. So right off the bat those are a handful of major companies that use it.
dude that's the first time i've heard of "mebi" etc. are u sure that's used these days?! i am pretty sure when most people say megabytes, they mean 2^20 bytes. i can't remember seeing any differently.
Originally posted by: Maverick2002
but that's just 1.5/1.024 ... doesn't make sense, cause then that would mean 1 megabyte = 1,024,000 bytes.
1,500,000 Bytes = 1.46484375 "real" megabytes.
Which confuses me even more thanks to Dullard's post. Or are the answers just really close because it's a small number?
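The three figures floating around the thread can be reproduced numerically, which shows where each one comes from (my reading: the 1.46484375 figure appears to come from dividing by a mixed 1,024,000-byte "megabyte", exactly the 1.5/1.024 that Maverick noticed; the divisor names below are mine):

```python
# Reproducing each figure in the thread from 12 megabits = 1,500,000 bytes.
total_bytes = 1_500_000                      # 12,000,000 bits / 8 bits per byte

si_megabytes = total_bytes / (1000 * 1000)   # 1.5        (SI: 10^6 bytes)
mixed        = total_bytes / (1024 * 1000)   # 1.46484375 (mixed 1,024,000-byte "MB")
mebibytes    = total_bytes / (1024 * 1024)   # ~1.4305    (binary: 2^20 bytes)

print(si_megabytes, mixed, mebibytes)
```

The answers are close simply because the three divisors differ by only a few percent (1,000,000 vs 1,024,000 vs 1,048,576), not because the number of bytes is small.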