So basically, we have to design a logic circuit, and I know how to do everything except the very first stage. That stage needs to convert a decimal number (0-15) into an 8-bit binary number. We can only use 74-series ICs for it, so keep that in mind.
I don't really want a straight-up answer to the question, since I want to be able to work it out myself, but I need some guidance on what to look at for converting from decimal to binary. The same goes for how you would multiply numbers in binary. For that, I'm thinking I need to use some version of half/full adders... but I'm not completely sure.
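For context, here's how I currently picture the multiplication working, as a little Python sketch of my own understanding (this is just my mental model of what a chain of adders would compute, not the circuit itself, and the function name is mine):

```python
def shift_add_multiply(a, b, width=4):
    """Multiply two unsigned numbers the way I imagine cascaded full
    adders would: for each set bit of the multiplier b, add a shifted
    copy of the multiplicand a into the running result."""
    product = 0
    for i in range(width):
        if (b >> i) & 1:           # is bit i of the multiplier set?
            product += a << i      # add the shifted partial product
    return product
```

If that's roughly the right idea, then I'd guess the hardware version is just AND gates forming the partial products plus adders summing them, but I'd appreciate confirmation or a pointer.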
Again, I don't want the answer, just direction into what to look at or a website that would help. Thanks in advance!!