I was in lecture today and we were discussing bandwidth. We went through some examples, and one of them involved converting from megabits to kilobytes. The first thing the professor did was divide by 1000, and then he divided by 8. According to him, if you're moving between prefixes in bits, you divide or multiply by 1000, but if you're moving in bytes, you use 1024. In my experience, you use 1024 no matter what, so this really doesn't make any sense to me.
Time for an example.
We start with 8,000,000 bits. First I scale the prefix, then convert to bytes: 8,000,000 / 1000 / 1000 = 8 Mb, and 8 / 8 = 1 MB.
Now the other order: 8,000,000 / 8 = 1,000,000 bytes, and 1,000,000 / 1024 / 1024 ≈ 0.95367 MB.
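Here's a quick Python sketch of the two orders, just to show where the numbers come from. The discrepancy isn't from the order of operations itself, it's that I used 1000 for the prefix in one path and 1024 in the other:

```python
bits = 8_000_000

# Order 1: scale the prefix first (decimal mega), then convert bits -> bytes
megabits = bits / 1000 / 1000        # 8.0 Mb
megabytes_decimal = megabits / 8     # 1.0 MB (decimal megabytes)

# Order 2: convert bits -> bytes first, then scale with 1024-based prefixes
total_bytes = bits / 8               # 1,000,000 B
mebibytes = total_bytes / 1024 / 1024  # ~0.95367 "MB" (really MiB)

print(megabytes_decimal)  # 1.0
print(mebibytes)          # 0.95367431640625
```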
This really seems like bad math to me. Can anybody explain this?