I know. Has the consensus around here changed about TDP not being equal to power consumption, though? Intel's PL2 has a 56-second default duration, which the AMD camp loves to see in reviews but then turns around and trashes when it comes to this sort of discussion.
Personally I fail to understand why you keep trying to trigger people into an AMD discussion in this thread. First the remark about AMD's 142W PPT power limit being somehow equivalent to Intel's PL1 limit, and now a vague remark that TDP is not equal to power consumption, combined with the gratuitous suggestion that the "AMD camp" uses the PL2 limit as the reference for Intel power consumption and PL1 as the reference for performance benchmarks. I guess this has to be done, so let's talk Intel vs. AMD power management.
Intel uses a combination of PL1 & PL2, a timer (Tau) and a temperature ceiling to determine boosting. This works well to extract the maximum amount of performance as early as possible, but has the small disadvantage of pushing temps to the limit with undersized cooling. Normally engineers tune the Tau timer for a specific OEM product to compensate, but DIY consumers do not. Overall Intel's boosting algorithm is predictable and dependable, but it has grown old and cannot meet Intel's own expectations when it comes to extracting all the performance out of their CPUs, since the gap between PL2 and PL1 has grown so large that moving from one limit to the other leads to drastic changes in performance. Their ABT mechanism is a move in the right direction, but should have been introduced years ago across their entire lineup.
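A minimal sketch of that timer-based scheme (heavily simplified: real turbo logic also weighs temperature, current limits and a rolling power average rather than a hard cutoff, and all numbers here are illustrative defaults, not any specific SKU's spec):

```python
# Simplified PL1/PL2/Tau model: under sustained load the package may draw
# up to PL2 until the Tau budget expires, then it is clamped down to PL1.
def power_limit(elapsed_s: float, pl1: float = 125.0, pl2: float = 250.0,
                tau: float = 56.0) -> float:
    """Allowed package power (watts) at a given time into a sustained load."""
    return pl2 if elapsed_s < tau else pl1

print(power_limit(10.0))  # 250.0 -> boosting at PL2
print(power_limit(60.0))  # 125.0 -> clamped to PL1 once Tau expires
```

The one-step drop from PL2 to PL1 is exactly the "drastic change in performance" described above: there is no gradual middle ground in this scheme.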
AMD uses a combination of PPT and target temperature to determine boosting behavior. In a normal usage environment it is actually the temperature readings that dictate average power consumption and NOT the PPT limit. This "target temperature" is arguably a better method to dictate boost since it's directly correlated to cooler performance, and not indirectly via the use of a timer and an "upper temperature ceiling". The CPU has a target temperature and will fluctuate clocks around it while also avoiding the PPT limit. On average this extracts the best performance that the cooling allows for.
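The "fluctuate clocks around a target" behavior can be sketched as a toy control step (the function name, step size and default limits are made up for illustration; this is not AMD's actual algorithm, which adjusts many parameters at millisecond granularity):

```python
# Toy AMD-style boost step: creep clocks upward while under both the
# temperature target and the PPT limit, back off otherwise.
def next_clock(clock_mhz: float, temp_c: float, power_w: float,
               t_target: float = 75.0, ppt: float = 142.0,
               step: float = 25.0) -> float:
    if power_w >= ppt or temp_c >= t_target:
        return clock_mhz - step   # over budget: shed a clock bin
    return clock_mhz + step       # headroom left: boost a little more

print(next_clock(4000.0, 70.0, 120.0))  # 4025.0 -> headroom, boosting
print(next_clock(4000.0, 80.0, 120.0))  # 3975.0 -> held back by temperature
```

Because the feedback variable is temperature rather than a timer, a better cooler directly buys more sustained clock, which is the point being made above.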
As far as the consensus on TDP == Power consumption, this has been explained repeatedly:
- For "proper" stock Intel platforms TDP == Average Power consumption. They are one and the same since TDP is dictated by PL1 limit, which is a power limit that nowadays is easy to reach. Power consumption is the basis for Intel's TDP definition.
- For AMD platforms TDP is NOT a function of power consumption, since this variable is not included in the TDP formula at all. AMD defines their TDP based on target temperature delta over ambient and reference cooler heat dissipating properties (thermal resistance).
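Writing that out: the formula widely reported in reviews is TDP = (tCase_max − tAmbient) / θca, with θca being the reference cooler's thermal resistance in °C/W. Note that no electrical power term appears anywhere in it. A quick sketch (the specific temperatures and θca below are illustrative figures, not official per-SKU specs):

```python
# AMD-style TDP: heat the reference cooler can move at the target delta.
# TDP [W] = (T_case_max - T_ambient) / theta_ca, theta_ca in degC per watt.
def amd_tdp(t_case_max: float, t_ambient: float, theta_ca: float) -> float:
    """TDP in watts from thermal targets alone -- no power draw involved."""
    return (t_case_max - t_ambient) / theta_ca

# Illustrative inputs: 61.8 degC case target, 42 degC ambient, 0.189 degC/W
print(round(amd_tdp(61.8, 42.0, 0.189), 1))  # 104.8 -> a "105W TDP" rating
```

A chip with this rating can still average well above that figure at the wall, since PPT, not TDP, caps actual socket power.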
From an engineering point of view, AMD's TDP approach is arguably better since it directly involves cooler properties that are relevant for system integrators and/or cooler makers. However,
from a consumer point of view, Intel's approach is easier to understand and therefore more relevant on a product sheet, or at least it was until they allowed mobo makers to ignore stock specs.
Both Intel and AMD managed to fail consumers with regard to their TDP ratings. Both of them (ultimately) disregarded the consumer's need to eyeball expected power consumption from a simple indicator (or set of indicators). This is why this forum has gradually migrated away from using TDP as a measuring stick and increasingly relies on actual power & temperature measurements to understand cooling needs for a DIY system.
If Intel wants to increase TDP further for K SKUs then so be it. But at least enforce stock limits, and then allow consumers to optimize performance by enabling a feature similar to ABT. This makes stock behavior predictable and dependable again, while also allowing anyone to extract maximum performance with proper cooling.