cliftonite
Diamond Member
Is there a way to manipulate the subscript of an array so that I can have b[0]['a'] = something? (I want the column subscript to be all the ASCII characters.)
Originally posted by: Tweak155
Do you want to input a character (or ASCII values) into a location of a 2-dimensional array, or do you want to use characters as the location?
Originally posted by: cliftonite
Originally posted by: Tweak155
Do you want to input a character (or ASCII values) into a location of a 2-dimensional array, or do you want to use characters as the location?
The contents of the array will be integers, e.g. A[1]['a'] = 2. So I need to use the characters as the location.
Originally posted by: Tweak155
Originally posted by: cliftonite
Originally posted by: Tweak155
Do you want to input a character (or ASCII values) into a location of a 2-dimensional array, or do you want to use characters as the location?
The contents of the array will be integers, e.g. A[1]['a'] = 2. So I need to use the characters as the location.
Yeah, I don't see why not; try A[0][atoi(character)] = ?
Originally posted by: TheLonelyPhoenix
This post is probably in the wrong forum, but w/e.
As I understand your question, you want to use characters as array subscripts instead of integers. This is not possible in the sense you are describing: arrays take numbers as subscripts, not arbitrary characters. Computers are like that.
Now, you could use some behind-the-scenes cleverness. If you typecast a char (which happens to be a lowercase letter) as an int and subtract 97, it turns that letter into its corresponding number in the alphabet, starting with 0. Thus, (int)'a' - 97 would equal 0, (int)'b' - 97 would equal 1, etc. But you can't throw a char directly at the array, or it will read the ASCII code of your letter instead. Thus, the computer wouldn't see array[0]['a'], but instead array[0][97]. That would be bad for you, most likely.
Close, a character is a byte (8 bits); ints are 32 bits for all the compilers I've used. Also, you'd have to apply an offset to get your array indexing to start at 0 with the letter 'a'.
Originally posted by: manly
In C, a character is an int.
Check all the C library functions for character handling; they all take an int argument. Besides, I should have said "integer" instead.
Originally posted by: RaynorWolfcastle
Close, a character is a byte (8 bits); ints are 32 bits for all the compilers I've used. Also, you'd have to apply an offset to get your array indexing to start at 0 with the letter 'a'.
Originally posted by: manly
In C, a character is an int.
Basically, A[2][0] and A['c' - 97]['a' - 97] address the same element as far as C is concerned.
Originally posted by: manly
Check all the C library functions for character handling; they all take an int argument.
Originally posted by: RaynorWolfcastle
Close, a character is a byte (8 bits); ints are 32 bits for all the compilers I've used. Also, you'd have to apply an offset to get your array indexing to start at 0 with the letter 'a'.
Originally posted by: manly
In C, a character is an int.
Basically, A[2][0] and A['c' - 97]['a' - 97] address the same element as far as C is concerned.
It's simpler to apply the offset than to use atoi.
I'll take your word for it, but I don't know why a character would be a 32-bit int... an ASCII character is only 7+1 bits 😕
Originally posted by: manly
Check all the C library functions for character handling; they all take an int argument.
Originally posted by: RaynorWolfcastle
Close, a character is a byte (8 bits); ints are 32 bits for all the compilers I've used. Also, you'd have to apply an offset to get your array indexing to start at 0 with the letter 'a'.
Originally posted by: manly
In C, a character is an int.
Basically, A[2][0] and A['c' - 97]['a' - 97] address the same element as far as C is concerned.
It's simpler to apply the offset than to use atoi.
A char is a byte, and int is just the native word size of the platform.
Originally posted by: Tweak155
Originally posted by: manly
Check all the C library functions for character handling; they all take an int argument.
Originally posted by: RaynorWolfcastle
Close, a character is a byte (8 bits); ints are 32 bits for all the compilers I've used. Also, you'd have to apply an offset to get your array indexing to start at 0 with the letter 'a'.
Originally posted by: manly
In C, a character is an int.
Basically, A[2][0] and A['c' - 97]['a' - 97] address the same element as far as C is concerned.
It's simpler to apply the offset than to use atoi.
A character can store an integer, but they are different things. If they were the same, why would you need both int and char, and why can't every int value be stored in a char?
Originally posted by: yukichigai
Yeah, atoi would be my bet.
Originally posted by: cliftonite
Is there a way to manipulate the subscript of an array so that I can have b[0]['a'] = something? (I want the column subscript to be all the ASCII characters.)