TaylorMade
Junior Member
I am trying to find the bridge between actual application-level calls down to the instruction set commands that the CPU processes.
Theoretically, I should be able to map one SQL query (for example), break it down into everything that the query actually does with a database, and map that onto the number of instructions the CPU actually processes.
If I were able to figure that out, I could accurately size CPU requirements with the figures to back it up, rather than "guesstimating".
I know about hardware, CISC vs. RISC, AND/OR gates, clock speed, cycles, and the classic equations for computing execution time, and I've done the research from the application side, but I'm missing the link where the two MATHEMATICALLY meet.
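For reference, the classic equation mentioned above ties the two levels together as CPU time = instruction count × CPI ÷ clock rate. A minimal sketch of that arithmetic, using purely illustrative numbers (the instruction count and CPI for a real query would have to come from profiling, e.g. hardware performance counters):

```python
# Classic execution-time equation:
#   CPU time = instruction count * CPI / clock rate
# All figures below are assumptions for illustration, not measurements.

def cpu_time_seconds(instruction_count, cpi, clock_hz):
    """Estimated execution time in seconds for a given instruction count,
    average cycles-per-instruction (CPI), and clock frequency."""
    return instruction_count * cpi / clock_hz

# Suppose profiling showed a query retires ~50 million instructions
# at an average CPI of 1.2 on a 3 GHz core:
t = cpu_time_seconds(50_000_000, 1.2, 3_000_000_000)
print(f"{t * 1000:.1f} ms")  # ~20 ms of pure CPU time (ignores I/O and cache stalls)
```

The hard part, of course, is obtaining the instruction count and CPI for a given query, which is exactly the missing link being asked about.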
Any help?