Warning: newb question ahead.
With two otherwise identical CPUs running at different frequencies, the CPU running at a higher frequency will always have a higher TDP (and obviously higher power consumption), correct?
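(For context, my reasoning here is the usual simplified dynamic-power relation for CMOS chips, P ≈ C × V² × f, where C is the switched capacitance, V is the core voltage, and f is the clock frequency. So, all else being equal, a higher f should mean higher power.)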
Assuming the above to be correct, how do you explain, among other examples, the following?
Core i7 920 (2.66 GHz)
vs.
Core i7 950 (3.06 GHz)
In the above example, we see one CPU running a full 400 MHz faster than its counterpart, but at the same TDP (130 W). Assuming this isn't an error or inaccuracy, please explain why this is the case.
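Back-of-the-envelope, assuming identical voltage: 3.06 GHz / 2.66 GHz ≈ 1.15, so I'd naively expect the 950 to draw roughly 15% more dynamic power, not carry an identical 130 W rating.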