So nowadays we have basically nothing but 64-bit (x64) CPUs, which have replaced the previous generation of 32-bit (x86) CPUs.
What determines what instruction set CPUs and software use for a given timeframe? I mean, why not just skip straight to 128-bit for example?
Go ahead and be technical if you like.
The only thing I can think of to explain it is that our ability to pack "parts" onto a CPU (i.e. our current manufacturing technology) literally limits the instruction sets we can support on consumer-level CPUs. Also: once the hardware determines the instruction set capabilities, software settles in and invests in it, and so it sits there for a generation. Then later, we get better tech, can pack more stuff onto the same size CPU, and it reaches a point where the industry as a whole shifts over, software and all.
Am I even remotely close?
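(Just to make the question concrete, here's a minimal sketch in standard C, not tied to any particular compiler, showing the most visible thing the bit width changes in practice: pointer size, and with it how much memory a single process can address. Build it for a 32-bit target and a 64-bit target and compare the output.)

```c
/* Minimal illustration of what "32-bit" vs "64-bit" means in practice:
 * the width of a pointer, and therefore the size of the address space
 * one process can use. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Typical 32-bit (x86) build: pointers are 4 bytes, so at most
     * 2^32 bytes (4 GiB) of address space.
     * Typical 64-bit (x86-64) build: pointers are 8 bytes, so the
     * architectural ceiling jumps to 2^64 bytes. */
    printf("pointer size: %zu bytes\n", sizeof(void *));
    printf("max pointer value: %ju\n", (uintmax_t)UINTPTR_MAX);
    return 0;
}
```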