Kilobits Vs. Kilobytes

BW86

Lifer
Jul 20, 2004
13,114
30
91
A kilobit is one thousand bits. Kilobits per second is used to measure the rate of data transfer, and is shortened to kb/s, Kbps, or kbps (as opposed to KB/s or KBps, which mean kilobytes per second). The lowercase b is commonly used to denote bits, while the uppercase B is used for bytes.

1 kb/s = 1000 bits per second
1 KB/s = 1024 bytes per second
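The units above translate directly into code; a minimal Python sketch (the constant and function names are just illustrative):

```python
# Unit definitions from the post above:
# lowercase b = bits, uppercase B = bytes; 1 byte = 8 bits.
BITS_PER_KILOBIT = 1000      # decimal prefix, as used for transfer rates
BYTES_PER_KILOBYTE = 1024    # binary convention, common for file sizes

def kbps_to_bytes_per_sec(kbps):
    """Convert a rate in kilobits/second to bytes/second."""
    return kbps * BITS_PER_KILOBIT / 8

# A 512 kb/s link moves 64,000 bytes (about 62.5 KB) each second.
print(kbps_to_bytes_per_sec(512))  # -> 64000.0
```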
 

Toki

Senior member
Jan 30, 2004
277
0
0
Originally posted by: BW86
A kilobit is one thousand bits. Kilobits per second is used to measure the rate of data transfer, and is shortened to kb/s, Kbps, or kbps (as opposed to KB/s or KBps, which mean kilobytes per second). The lowercase b is commonly used to denote bits, while the uppercase B is used for bytes.

1 kb/s = 1000 bits per second
1 KB/s = 1024 bytes per second

Even if that were accurate, it doesn't answer the question.
 

Budman

Lifer
Oct 9, 1999
10,980
0
0
Kilobits is when Mr Kilo gets blown up by a bomb, you end up with kilobits.

Kilobytes is when Mr Kilo goes in the forest & is eaten by a big grizzly bear, then you have kilobites.


:)
 

bjc112

Lifer
Dec 23, 2000
11,460
0
76
Originally posted by: Budman
Kilobits is when Mr Kilo gets blown up by a bomb, you end up with kilobits.

Kilobytes is when Mr Kilo goes in the forest & is eaten by a big grizzly bear, then you have kilobites.


:)


8/10

;)
 

sonoma1993

Diamond Member
May 31, 2004
3,412
20
81
Kilobytes can represent data transfer too. Back in the days of 56k dial-up, downloads were usually between 3 and 6 kilobytes a second at the most.
 

Maxil223

Member
Nov 29, 2004
195
0
0
So kilobytes is the size of a file and the kilobit is the speed at which it is transferred?
 

biostud

Lifer
Feb 27, 2003
19,449
6,503
136
kbit/s and kbytes/s represent transfer speed

kbit and kbytes represent sizes
 

Jeff7

Lifer
Jan 4, 2001
41,596
19
81
Originally posted by: Maxil223
So kilobytes is the size of a file and the kilobit is the speed at which it is transferred?

By themselves, they are merely measures of size.
There are 4 cups in a quart.
There are 8 bits in a byte.
A bit is a 1 or a 0; 8 of them together make a byte, and in various combinations, bytes are used to make up characters. A character is something like a letter, number, or space.
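That relationship is easy to see in code; a small sketch using the character 'A':

```python
# The character 'A' is stored as one byte: the 8-bit pattern 01000001.
char = 'A'
code = ord(char)              # numeric value of that byte: 65
bits = format(code, '08b')    # the 8 individual bits, as a string
print(code, bits)             # -> 65 01000001
assert len(bits) == 8         # 8 bits in a byte
```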


Kilo- is merely a prefix for thousand; a kilobyte has long been held to be 1024 bytes, though that might change to 1000 now, with new standards that say (rightfully so) that kilo- should mean "1000" and not "1024".

At any rate, in terms of usage - bits are usually used when talking about throughput speed on a network of some variety. 1Mbit = 1,000,000 bits. 200Kbps = 200,000 bits per second.
Bytes are usually used in terms of disk storage space - a file is referred to as being 15KB, not 120Kb. But, if you want to, you can say "I was downloading at 300KB (that's bytes) per second."
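Those conversions can be sketched as a quick download-speed calculation (the 15 KB file and 200 Kbps link reuse the figures above):

```python
def download_seconds(file_bytes, link_kbps):
    """Estimate transfer time: file size in bytes over a link in kilobits/s."""
    bytes_per_sec = link_kbps * 1000 / 8   # 200 Kbps -> 25,000 B/s
    return file_bytes / bytes_per_sec

# A 15 KB file (15 * 1024 bytes) over a 200 Kbps link:
t = download_seconds(15 * 1024, 200)
print(round(t, 3))  # -> 0.614 (seconds)
```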


Hopefully that was more than you wanted to know. :D
 

SNM

Member
Mar 20, 2005
180
0
0
Originally posted by: Maxil223
So kilobytes is the size of a file and the kilobit is the speed at which it is transferred?

They're both just measures of size, like megabytes. The basic unit of data storage in a computer is the byte, which is made up of 8 bits.
1 kilobit is 1024 bits (in the binary convention; networking gear usually counts it as 1000). 1 kilobyte is 1024 bytes, which comes out to 8192 bits.
As others have mentioned, kilobits (and then megabits) are usually used for data transfer rates, but they don't have to be.
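The 1000-versus-1024 ambiguity running through this thread can be made explicit; a sketch showing both conventions (the 1024-based figures are the ones used in this post):

```python
# Two competing meanings of the "kilo" prefix in computing:
KILOBIT_DECIMAL = 1000   # SI convention, used for link speeds
KILOBIT_BINARY = 1024    # binary convention (strictly, a "kibibit")

# A kilobyte of 1024 bytes contains 8192 bits, as the post says:
kilobyte_bits = 1024 * 8
print(kilobyte_bits)     # -> 8192

# The same disagreement, expressed in bits per "kilobyte":
print(KILOBIT_DECIMAL * 8, KILOBIT_BINARY * 8)  # -> 8000 8192
```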
 

Fike

Senior member
Oct 2, 2001
388
0
0
The confusion for non-programmer types comes from the fact that all commercial computer stuff would be better off measured in bytes. A bit, as people have described, refers to the famous ones and zeros of digital devices, but that is not interesting or useful to most computer users, because file sizes are always given in bytes. You could measure hard drives in bits, but the numbers would be huge.

It never made sense to me that modems and networks were measured in bits. That made it so that you couldn't easily do mental calculations to figure out how long a download would take.

The other place where bits are used is in the bitrate of an MP3 or audio file, where they make much more sense. In that case, it is referring to a very small number like 128 kbit per second. This number is tied to how much audio data is stored per second and is one component of a description of the quality of the MP3. A 128 Kbps bit rate of an MP3 would be 16 KBps (kilobytes per second)....furthermore, a 96 Kbps MP3 would be a 12 KBps file.....

....
actually....
.....

You know, it is the same. It would probably make sense for all consumer applications to use only bytes. But, well, everyone is accustomed to bits and bytes, and that makes it easier for marketing companies to obscure the facts, so we will have to stick with it.
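The arithmetic above checks out; a sketch extending it to a whole track (the 4-minute length is a made-up example):

```python
def mp3_kBps(bitrate_kbps):
    """Convert an MP3 bitrate in kilobits/s to kilobytes/s."""
    return bitrate_kbps / 8

print(mp3_kBps(128))   # -> 16.0 KB/s, matching the figure above
print(mp3_kBps(96))    # -> 12.0 KB/s

# Rough size of a 4-minute (240 s) track at 128 kbps:
size_bytes = 128 * 1000 / 8 * 240
print(size_bytes / 1_000_000)   # -> 3.84 (MB)
```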
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Fike
It never made sense to me that modems and networks were measured in bits. That made it so that you couldn't easily do mental calculations to figure out how long a download would take.

It makes a *lot* more sense to measure a serial communication device in bits/second than bytes/second, because it naturally transfers one bit at a time. Almost all networking devices are serial in nature, including modems, Ethernet, Fibre Channel, and 802.11 hardware. You'll note that parallel protocols like SCSI, ATA, and PCI are generally measured in MB/sec., since they transfer a number of bytes simultaneously per clock (SATA follows the old ATA convention, despite being serial).

Just divide your bitrate by 8 (or 10 if you're lazy, and then add a little bit to compensate) to get the speed in bytes/sec.
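The divide-by-10 shortcut is not purely laziness: on async serial links each 8-bit byte was often framed with a start and a stop bit, giving 10 line bits per byte. A sketch comparing the two rules:

```python
def exact_bytes_per_sec(bps):
    """Strict conversion: 8 data bits per byte."""
    return bps / 8

def lazy_bytes_per_sec(bps):
    """Mental-math shortcut: divide by 10, slightly underestimating."""
    return bps / 10

# A 56,000 bps modem:
print(exact_bytes_per_sec(56_000))  # -> 7000.0
print(lazy_bytes_per_sec(56_000))   # -> 5600.0
# The /10 figure is pessimistic, which conveniently absorbs some
# protocol overhead (e.g. start/stop framing bits on serial links).
```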
 

Fike

Senior member
Oct 2, 2001
388
0
0
Originally posted by: Matthias99
It never made sense to me that modems and networks were measured in bits. That made it so that you couldn't easily do mental calculations to figure out how long a download would take.

It makes a *lot* more sense to measure a serial communication device in bits/second than bytes/second, because it naturally transfers one bit at a time. Almost all networking devices are serial in nature, including modems, Ethernet, Fibre Channel, and 802.11 hardware. You'll note that parallel protocols like SCSI, ATA, and PCI are generally measured in MB/sec., since they transfer a number of bytes simultaneously per clock (SATA follows the old ATA convention, despite being serial).

Just divide your bitrate by 8 (or 10 if you're lazy, and then add a little bit to compensate) to get the speed in bytes/sec.


To a programmer, it makes perfect sense.

To a consumer, it makes no sense.

From a user's point of view, it is an irrelevant distinction.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Fike
Originally posted by: Matthias99
It never made sense to me that modems and networks were measured in bits. That made it so that you couldn't easily do mental calculations to figure out how long a download would take.

It makes a *lot* more sense to measure a serial communication device in bits/second than bytes/second, because it naturally transfers one bit at a time. Almost all networking devices are serial in nature, including modems, Ethernet, Fibre Channel, and 802.11 hardware. You'll note that parallel protocols like SCSI, ATA, and PCI are generally measured in MB/sec., since they transfer a number of bytes simultaneously per clock (SATA follows the old ATA convention, despite being serial).

Just divide your bitrate by 8 (or 10 if you're lazy, and then add a little bit to compensate) to get the speed in bytes/sec.


To a programmer, it makes perfect sense.

To a consumer, it makes no sense.

From a user's point of view, it is an irrelevant distinction.

All true. But you said you didn't know why modems and networking protocols were measured in b/sec. instead of B/sec., and I told you. :p