How difficult would it be to run each stage in a processor at its own voltage? Slower stages could receive higher voltage to keep up, and fast ones wouldn't draw as much power, but I don't even know if this is possible with current processor design techniques/technologies.
Would it be easier to decouple the frequencies of certain blocks like decode (say, if decode kept the rest of the chip from clocking higher or dropping its voltage) and just run a wider decode to make up for it?
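A rough way to see why "wider but slower" can work for decode: peak decode throughput is roughly width times frequency, so halving the clock while doubling the width keeps throughput the same. This is a toy sketch; the widths and frequencies below are made-up illustrative numbers, not any real chip's.

```python
# Toy model: peak decode throughput ~ width * frequency.
# All numbers here are hypothetical, chosen only to illustrate the tradeoff.

def decode_throughput(width_insts, freq_ghz):
    """Peak instructions decoded per nanosecond = width * frequency in GHz."""
    return width_insts * freq_ghz

narrow_fast = decode_throughput(4, 5.0)   # 4-wide decode at 5 GHz
wide_slow = decode_throughput(8, 2.5)     # 8-wide decode at 2.5 GHz

print(narrow_fast, wide_slow)  # same peak throughput either way
```

Real decoders lose some of this to dependencies and fetch limits, but it shows why a slow, wide block doesn't have to bottleneck the rest of the chip.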
Imagine how efficient that could be if every stage could run at exactly the voltage it needed for a given frequency...
The chip's overclock would still be limited by how much voltage the slowest stage could handle, but it would use less power getting there.
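The power argument above can be sketched with the classic CMOS approximation that dynamic power scales roughly as C·V²·f: with one shared rail, every stage pays the voltage the worst stage needs, while per-stage rails let the easy stages drop quadratically. The stage names, capacitances, and minimum voltages below are invented for illustration only.

```python
# Hedged sketch of the per-stage-voltage savings. Dynamic power is
# approximated as P ~ C * V^2 * f; units are arbitrary but consistent.
# Stage list and all numbers are hypothetical, not from any real design.

def dynamic_power(cap, v_volts, f_ghz):
    # Classic CMOS dynamic-power approximation: P ~ C * V^2 * f
    return cap * v_volts**2 * f_ghz

stages = {
    # stage: (switched capacitance, minimum voltage it needs at 4 GHz)
    "fetch":   (1.0, 1.00),
    "decode":  (1.2, 1.10),
    "execute": (1.5, 1.25),  # slowest stage: sets the shared rail
    "retire":  (0.8, 0.95),
}

f = 4.0
shared_v = max(v for _, v in stages.values())  # one rail: worst case wins
p_shared = sum(dynamic_power(c, shared_v, f) for c, _ in stages.values())
p_per_rail = sum(dynamic_power(c, v, f) for c, v in stages.values())

print(f"one rail: {p_shared:.1f}, per-stage rails: {p_per_rail:.1f}")
```

With these made-up numbers the per-stage rails come out around 20% lower, all from the V² term; the real-world catch is that every extra rail needs its own regulator and level shifters at the domain crossings.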
They could even get better temperature readings, because the sensors could be placed right next to the highest-consuming block, which is what sets the limit anyway.
I bet AMD could realize Bulldozer's full potential with a combination of the two (is 6 GHz+ too much to ask on certain parts of the chip? Prescott ran its ALUs at 8 GHz+ IIRC). 4 GHz isn't bad considering the slowest stage is the limiter.
