C++ function?

UnderScore

Senior member
Oct 9, 1999
216
0
0
Is there a C++ function to convert an ASCII character into its binary equivalent?

For example, the @ sign is ASCII code 64, and its binary equivalent is 1000000.

I need to be able to read in a string from the command prompt, separate the string into chars, and then convert each char into the binary representation of its ASCII code.

Does anyone know if a function exists that will help me out?

Thanks ahead of time,

James
 

bigjon

Senior member
Mar 24, 2000
945
0
0
I don't know of a single C++ function that will do it, but hopefully this helps:

void OutputIntBinary(ostream &Out, int i)
{
    const int BitCnt = 8 * sizeof(int);
    char Str[BitCnt + 1];

    for (int j = 0; j < BitCnt; ++j)
    {
        // fill from the right: lowest bit goes in the last slot
        Str[BitCnt - 1 - j] = (i & 1) ? '1' : '0';
        i >>= 1;
    }
    Str[BitCnt] = 0;
    Out << Str;
}


I dug this up at Experts Exchange. For programming they have a ton of good stuff - I recommend joining them ;)
 

Pretender

Banned
Mar 14, 2000
7,192
0
0
Yes, you can.
Use atoi to convert the string to an int.
Then use itoa to convert it back to a string (you can choose to have it formatted in base 2, i.e. binary).

=====================
Info:


char *_itoa( int value, char *string, int radix );

Routine: _itoa
Required header: <stdlib.h>
Compatibility: Win 95, Win NT


Libraries

LIBC.LIB Single thread static library, retail version
LIBCMT.LIB Multithread static library, retail version
MSVCRT.LIB Import library for MSVCRT.DLL, retail version

Return Value

Each of these functions returns a pointer to string. There is no error return.

Parameters

value: Number to be converted

string: String result

radix: Base of value; must be in the range 2 to 36

=====================
So use it with radix = 2, and you'll be set.
BTW, this is from MSVC++, so if you need the header file or the libraries, just PM me.