Protocols that transmit hex values in ASCII


ilyal

Junior Member
Feb 19, 2009
Hi,
I was wondering: why is it that many protocols transmit hex values encoded in ASCII?
It seems very wasteful: you use 8 bits of data to transmit 4 bits, and I doubt it's useful for reliability; I'm pretty sure that just sending the data twice would be more robust.

The only advantage I can see is that you can connect to a device using such a protocol from a terminal. Is that the only reason?

Also, is there a way to estimate how much energy is wasted worldwide because of this stupid practice?
 

TecHNooB

Diamond Member
Sep 10, 2005

What do you mean, transmit in ASCII? Send the letter A over my line? It's 1 or 0 :)

Maybe an example would help.
 

ikachu

Senior member
Jan 19, 2011
I'm pretty sure he means sending the hex digit 'A' as an 8-bit ASCII character (01000001) vs. just sending it as a 4-bit nibble (1010).

I can't think of any protocols off the top of my head that do that though.
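
A minimal C sketch of that difference (the hex_encode helper name is just for illustration): each raw byte is expanded into two ASCII hex characters, which is exactly the 2x overhead the OP is describing.

Code:
#include <stdio.h>
#include <stdint.h>

/* Expand a raw byte buffer into ASCII hex characters.
 * Every input byte becomes two output characters, so the
 * encoded form is exactly twice as long as the raw data. */
static void hex_encode(const uint8_t *in, size_t len, char *out)
{
    static const char digits[] = "0123456789ABCDEF";
    for (size_t i = 0; i < len; i++) {
        out[2 * i]     = digits[in[i] >> 4];   /* high nibble */
        out[2 * i + 1] = digits[in[i] & 0x0F]; /* low nibble  */
    }
    out[2 * len] = '\0';
}

int main(void)
{
    uint8_t raw[] = { 0xA3, 0x7F, 0x00 };  /* 3 bytes if sent as raw binary */
    char ascii[2 * sizeof raw + 1];

    hex_encode(raw, sizeof raw, ascii);
    printf("raw: %zu bytes, ASCII hex: \"%s\" = %zu bytes\n",
           sizeof raw, ascii, 2 * sizeof raw);
    return 0;
}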
 

greenhawk

Platinum Member
Feb 23, 2011
I was wondering: why is it that many protocols transmit hex values encoded in ASCII?

Depends on the method of communication. Serial will allow it, but several of the byte values are reserved for software signalling (i.e. stop sending, continue sending, start of message, end of message), so raw data bytes that happen to match those values can be misinterpreted; restricting the payload to ASCII hex digits avoids that.

For Ethernet, I think it is more of an assurance that the data is in a known format (i.e. the meaning is not implied, it is stated explicitly).

i.e. if I send "10", you assume it is base 10, but I could have sent it meaning base 2, base 8, base 16, or anything like that.

If you want to go on about "wasted" bits, then the next issue is why use 8 bits when the standard ASCII table is only 128 characters long (2^7). It is generally always 8 bits for simpler communication between different machines and to allow for the extended ASCII tables that include non-English characters.
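
A sketch of that framing idea, loosely in the style of ASCII-hex protocols such as Modbus ASCII or Intel HEX (the ':' start marker and CR/LF terminator here are illustrative, not any particular standard): because the payload is limited to the characters 0-9 and A-F, it can never be mistaken for the frame markers or for flow-control bytes, which is one practical reason to accept the overhead.

Code:
#include <stdio.h>
#include <stdint.h>

/* Frame a binary payload as ':' + ASCII hex + CR LF.
 * Payload bytes are expanded to hex digits, so none of them can
 * ever look like the ':' start marker, the CR/LF terminator, or
 * control bytes such as XON/XOFF (0x11/0x13). */
static void send_frame(FILE *port, const uint8_t *payload, size_t len)
{
    static const char digits[] = "0123456789ABCDEF";

    fputc(':', port);                         /* start of frame */
    for (size_t i = 0; i < len; i++) {
        fputc(digits[payload[i] >> 4], port);
        fputc(digits[payload[i] & 0x0F], port);
    }
    fputs("\r\n", port);                      /* end of frame */
}

int main(void)
{
    /* 0x13 is XOFF: sent raw it could pause a flow-controlled link,
     * but encoded as the characters "13" it is harmless. */
    uint8_t payload[] = { 0x01, 0x13, 0xFF };
    send_frame(stdout, payload, sizeof payload);  /* prints ":0113FF" + CRLF */
    return 0;
}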
 

Modelworks

Lifer
Feb 22, 2007
It depends on the application. I have written plenty of firmware for devices that only use 3 bits to send text. There are some really odd CPU platforms in the embedded world. When faced with things like 6-bit and 14-bit CPUs, you learn to cut corners everywhere you can.
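
Purely as a hypothetical sketch of that kind of corner-cutting (not Modelworks' actual firmware): if a link only ever needs a small fixed alphabet, say 8 symbols, each symbol fits in 3 bits and can be packed back-to-back instead of spending a whole ASCII byte on it.

Code:
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical 8-symbol alphabet: each symbol is a 3-bit code. */
static const char alphabet[8] = { ' ', 'E', 'T', 'A', 'O', 'N', '0', '1' };

/* Pack 3-bit symbol codes into bytes, MSB first.
 * Returns the number of bytes actually used. */
static size_t pack3(const uint8_t *codes, size_t n, uint8_t *out)
{
    size_t bitpos = 0;
    memset(out, 0, (n * 3 + 7) / 8);
    for (size_t i = 0; i < n; i++) {
        for (int b = 2; b >= 0; b--) {        /* write the 3 bits of each code */
            if (codes[i] & (1u << b))
                out[bitpos / 8] |= 0x80u >> (bitpos % 8);
            bitpos++;
        }
    }
    return (bitpos + 7) / 8;
}

int main(void)
{
    /* "TEA" -> codes 2, 1, 3: nine bits total instead of three 8-bit ASCII bytes */
    uint8_t codes[] = { 2, 1, 3 };
    uint8_t packed[4];
    size_t used = pack3(codes, sizeof codes, packed);

    printf("%zu symbols packed into %zu byte(s), first byte 0x%02X\n",
           sizeof codes, used, (unsigned)packed[0]);
    (void)alphabet;  /* a receiver would use this table to decode the codes */
    return 0;
}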

The reason it uses 8 bits now is that it's a legacy issue. It's the same reason binary doesn't account for tri-state logic: it didn't exist when the binary conventions were created. There was only on or off. If it had existed at the time, you might have computers that operate completely differently from how they do now.
 