Originally posted by: BW86
A kilobit is one thousand bits. Kilobits per second, shortened to kb/s, Kbps or kbps (as opposed to KBps, which is kilobytes per second), is commonly used to measure data transfer rates. The lowercase b denotes bits, while the uppercase B denotes bytes.
1 kb/s = 1000 bits per second
1 KB/s = 1024 bytes per second
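A minimal sketch in Python of the conversion above, ignoring the 1000-vs-1024 wrinkle and just using 8 bits per byte (the function names are illustrative, not from any library):

```python
# Convert a line speed in kilobits/s to kilobytes/s (8 bits per byte),
# then estimate how long a download takes at that speed.

def kbps_to_kBps(kilobits_per_sec: float) -> float:
    """Convert kilobits/second to kilobytes/second."""
    return kilobits_per_sec / 8

def download_time_secs(file_size_kB: float, line_speed_kbps: float) -> float:
    """Seconds to move file_size_kB kilobytes over a line_speed_kbps link."""
    return file_size_kB / kbps_to_kBps(line_speed_kbps)

print(kbps_to_kBps(56))             # 7.0 -> a 56 kb/s modem moves at most 7 KB/s
print(download_time_secs(1000, 56)) # ~142.9 s for a 1000 KB file, ignoring overhead
```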
Originally posted by: Budman
Kilobits is when Mr Kilo gets blown up by a bomb, you end up with kilobits.
Kilobytes is when Mr Kilo goes in the forest and is eaten by a big grizzly bear, then you have kilobites.
Originally posted by: Maxil223
So kilobytes measure the size of a file, and kilobits (per second) measure the speed at which it is transferred.
Originally posted by: Maxil223
So kilobytes measure the size of a file, and kilobits (per second) measure the speed at which it is transferred.
It never made sense to me that modems and networks were measured in bits. That made it so that you couldn't easily do mental calculations to figure out how long a download would take.
Originally posted by: Matthias99
It never made sense to me that modems and networks were measured in bits. That made it so that you couldn't easily do mental calculations to figure out how long a download would take.
It makes a *lot* more sense to measure a serial communication device in bits/second than bytes/second, because it naturally transfers one bit at a time. Almost all networking devices are serial in nature, including modems, Ethernet, Fibre Channel, and 802.11 hardware. You'll note that parallel protocols like SCSI, ATA, and PCI are generally measured in MB/sec., since they transfer a number of bytes simultaneously per clock (SATA follows the old ATA convention, despite being serial).
Just divide your bitrate by 8 (or by 10 if you're lazy, then add a little to compensate) to get the speed in bytes/sec.
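A quick sketch of that rule of thumb in Python (the 20% fudge factor is just one guess at "a little to compensate"):

```python
# Exact conversion divides the bit rate by 8; the lazy mental version
# divides by 10 and then pads the result back up a little.

def exact_bytes_per_sec(bits_per_sec: float) -> float:
    """Exact: 8 bits per byte."""
    return bits_per_sec / 8

def lazy_bytes_per_sec(bits_per_sec: float, fudge: float = 1.2) -> float:
    """Divide by 10, then add a little to compensate (fudge is a guess)."""
    return (bits_per_sec / 10) * fudge

line = 1_500_000  # a 1.5 Mb/s link
print(exact_bytes_per_sec(line))  # 187500.0 bytes/sec
print(lazy_bytes_per_sec(line))   # 180000.0 bytes/sec, close enough for mental math
```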
Originally posted by: Fike
Originally posted by: Matthias99
It never made sense to me that modems and networks were measured in bits. That made it so that you couldn't easily do mental calculations to figure out how long a download would take.
It makes a *lot* more sense to measure a serial communication device in bits/second than bytes/second, because it naturally transfers one bit at a time. Almost all networking devices are serial in nature, including modems, Ethernet, Fibre Channel, and 802.11 hardware. You'll note that parallel protocols like SCSI, ATA, and PCI are generally measured in MB/sec., since they transfer a number of bytes simultaneously per clock (SATA follows the old ATA convention, despite being serial).
Just divide your bitrate by 8 (or by 10 if you're lazy, then add a little to compensate) to get the speed in bytes/sec.
To a programmer, it makes perfect sense.
To a consumer, it makes no sense.
From a user's point of view, it is an irrelevant distinction.