
Discussion: Intel current and future Lakes & Rapids thread


A///

Senior member
Feb 24, 2017
816
571
106
@jpiniero I know Samsung had working prototypes of LPDDR5 like 2 years ago. Are there early production units available on the market for purchase?
 

A///

Senior member
Feb 24, 2017
816
571
106
That's pretty surprising. I was expecting products shipping soon, not 2 months from now. Especially since it seems like they are blowing most of their 10 nm capacity on this.
It should make for good holiday sales. Realistically, they're better off putting their resources into mobile if the margins are great for them. They'll have a good 4Q, too. It's perfect for the holiday period, when people will be buying new laptops for themselves, a friend, a family member, a significant other, etc.
 

Exist50

Member
Aug 18, 2016
126
165
116

Ian says he was told the shipping Tiger Lake stepping does actually support LPDDR5. So it's not a validation issue, it's an "OEMs don't want it" issue.
I've been hearing the opposite, and the "architectural support" wording seems to suggest it's not currently supported in silicon.

@jpiniero I know Samsung had working prototypes of LPDDR5 like 2 years ago. Are there early production units available on the market for purchase?
Phones already support and ship with it.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,165
1,738
136
What known iGPU performance metrics are you basing this on?

Also, they list 2.8GHz as the base frequency which would also indicate cTDP of 28W, would it not?
The Acer device uses Intel's Dynamic Tuning, which uses "AI" to steadily reduce performance from higher PL levels rather than drop off a cliff as with its predecessors. It was first introduced with Ice Lake.

The 28W PL1 configuration reduces to 17W over a ~10 min period. Some vendors treat PL1 values similarly to PL2 values.
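The described behavior can be sketched roughly as follows: a sustained limit that ramps from the 28W configured value down to 17W over ~10 minutes, versus the older behavior of falling off a cliff once the turbo window expires. The function names, the linear ramp, and the 28-second turbo window are illustrative assumptions, not Intel's actual Dynamic Tuning algorithm.

```python
def sustained_limit_smooth(t_seconds, start_w=28.0, end_w=17.0, ramp_s=600.0):
    """Power limit at time t with a gradual (linear) ramp-down."""
    if t_seconds >= ramp_s:
        return end_w
    frac = t_seconds / ramp_s
    return start_w - (start_w - end_w) * frac

def sustained_limit_cliff(t_seconds, pl2_w=28.0, pl1_w=17.0, tau_s=28.0):
    """Older behavior: hold PL2 until the turbo window expires, then drop."""
    return pl2_w if t_seconds < tau_s else pl1_w

for t in (0, 60, 300, 600):
    print(t, round(sustained_limit_smooth(t), 1), sustained_limit_cliff(t))
```

Either way the long-run limit is the same 17W; only the shape of the transition differs.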
 
  • Like
Reactions: lightmanek

Hitman928

Platinum Member
Apr 15, 2012
2,914
2,453
136
The Acer device uses Intel's Dynamic Tuning, which uses "AI" to steadily reduce performance from higher PL levels rather than drop off a cliff as with its predecessors. It was first introduced with Ice Lake.

The 28W PL1 configuration reduces to 17W over a ~10 min period. Some vendors treat PL1 values similarly to PL2 values.
So it is configured to 28W but throttles to 17W over long sustained loads, it just tries to do so in a softer manner than previous models.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,165
1,738
136
So it is configured to 28W but throttles to 17W over long sustained loads, it just tries to do so in a softer manner than previous models.
Yes, that explains the weird behavior: the reviewer does not understand how it works, so he would keep running the tests one after another and subsequent tests would show lower scores. The other site with the Acer is getting 2100 points, while the site we are discussing is getting 2500.

17W is PL1, Acer just changes the name a bit.
 

Attachments

  • Like
Reactions: lightmanek

ondma

Golden Member
Mar 18, 2018
1,572
447
106
Yes, that explains the weird behavior: the reviewer does not understand how it works, so he would keep running the tests one after another and subsequent tests would show lower scores. The other site with the Acer is getting 2100 points, while the site we are discussing is getting 2500.

17W is PL1, Acer just changes the name a bit.
That gradual power decrease (machine learning) actually seems counterproductive to me, depending on the workload. It seems like you would get the most work done by using up all of the thermal budget before throttling down. Certainly, if the task could be completed within the 18 seconds, you would be better off with the slide 1 scenario.
 

Hitman928

Platinum Member
Apr 15, 2012
2,914
2,453
136
That gradual power decrease (machine learning) actually seems counterproductive to me, depending on the workload. It seems like you would get the most work done by using up all of the thermal budget before throttling down. Certainly, if the task could be completed within the 18 seconds, you would be better off with the slide 1 scenario.
I assume that's why it's using the workload prediction algorithm. If the algorithm determines that the load will finish in 15-20 seconds, then it will still look like slide 1. If the algorithm determines the workload will last 10 minutes, it will look like slide 2. How well this works in real life, I guess we'll see. My gut reaction is that it won't really make a difference or may even have a negative effect on many workloads but that also probably just depends on how much more performance you are getting for that increase in power.
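The prediction idea described above reduces to a simple decision: if the predicted run length fits inside the turbo window, burst at full power (slide 1); otherwise taper early toward the sustainable limit (slide 2). This toy sketch is purely illustrative; the threshold, names, and policy labels are made up, not Intel's algorithm.

```python
def pick_power_policy(predicted_duration_s, turbo_window_s=20.0):
    """Choose a power policy from a (hypothetical) workload-length prediction."""
    if predicted_duration_s <= turbo_window_s:
        return "burst"    # spend the whole thermal budget up front
    return "gradual"      # taper toward the sustained limit instead

print(pick_power_policy(15))   # short task -> burst
print(pick_power_policy(600))  # 10-minute render -> gradual
```

The hard part, of course, is the prediction itself: a mispredicted short task would get throttled for nothing, which is why the real-world benefit is uncertain.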
 
  • Like
Reactions: coercitiv

uzzi38

Golden Member
Oct 16, 2019
1,151
1,919
96
Well they're right. TDP isn't useful.

Especially now that Intel's base clocks appear to be rated at 28W and 9W (guesstimate) respectively. The one reason why TDP was useful was to give a baseline for base clocks. Now it can't even do that.
 
  • Like
Reactions: spursindonesia

Gideon

Golden Member
Nov 27, 2007
1,028
1,704
136
While AMD is somewhat better in this regard, it's really long overdue for them to create some new standard metric that is actually useful.

For instance replace TDP with 3 different metrics:
Watts for all-core-load @ base clocks
Watts for single-core peak turbo (only really needed in mobile)
Watts for worst case Vector workload
 
  • Like
Reactions: Tlh97 and uzzi38

uzzi38

Golden Member
Oct 16, 2019
1,151
1,919
96
While AMD is somewhat better in this regard, it's really long overdue for them to create some new standard metric that is actually useful.

For instance replace TDP with 3 different metrics:
Watts for all-core-load @ base clocks
Watts for single-core peak turbo (only really needed in mobile)
Watts for worst case Vector workload
Honestly, I'd even just take the first two. The third isn't all that useful for mobile, at least (as both companies would fly way past sane-looking power values).
 
  • Like
Reactions: Tlh97 and Gideon

IntelUser2000

Elite Member
Oct 14, 2003
7,165
1,738
136
While AMD is somewhat better in this regard, it's really long overdue for them to create some new standard metric that is actually useful.
To be perfectly frank, no we don't.

People here know what the numbers are.

You are buying a laptop, yes? Your decisions will be many, and one of them will be the CPU. The difference between 25W and even 45W CPUs isn't huge nowadays, with designs overlapping each other.

If you are not buying a laptop and are making this argument, then you are just talking about it like we do a lot here. Whether it's Intel vs. AMD or pure curiosity, the result is the same.

So if the price is right for you, and the portability is ok too, then it doesn't matter what the TDP is as long as the system can cool it enough that long term performance is sustained. The final CPU might end up being a 15W one or a 45W one.

And the TDP metric will never go away. Enough data will always exist for us enthusiasts/fanboys/speculators/fanfiction people. If Intel/AMD/Nvidia/etc. hide that data, then they are obfuscating it and we should complain.

So what are we arguing about in the first place?

Honestly, I'd even just take the first two. The third isn't all that useful for mobile, at least (as both companies would fly way past sane-looking power values).
I also agree with this. The third isn't needed. Yes, some will try it, but know that if you are buying an ultrabook, you won't get everything.

Scratch that. I'm not sure why we need the single-core watts either. Possibly the only time it matters is when you are running a sustained load on battery. Otherwise it will very likely be a bursty workload that drops back to sleep using milliwatts. And a single-core load will be less demanding on thermals than a multi-threaded workload anyway.
 
Last edited:
  • Like
Reactions: Tlh97

Cardyak

Member
Sep 12, 2018
33
39
61
While AMD is somewhat better in this regard, it's really long overdue for them to create some new standard metric that is actually useful.

For instance replace TDP with 3 different metrics:
Watts for all-core-load @ base clocks
Watts for single-core peak turbo (only really needed in mobile)
Watts for worst case Vector workload
I agree, and as @uzzi38 already stated, the first two are the most important.

Honestly just be transparent with users and inform them of the maximum power draw when

A) Boosting on one core
B) Boosting on all cores

No need for PL1, PL2, and all the marketing conflation.

Just tell me the maximum power that will be drawn in boost scenarios and set that as your TDP.

For years now consumers have been deceived by ranges such as Coffee Lake and Comet Lake, where a CPU is identified as "35W" but that actually only applies to the base clock; in reality, when the CPU is boosting across all cores, you end up with something closer to 90W. Just look at the 9900T: it's ostensibly rated as a 35W low-power CPU, but when it's boosting across all 8 cores it draws a level of power comparable to a top-end desktop chip. It's ludicrous marketing that borders on outright duplicity.
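The reason a "35W" part can legally draw ~90W is Intel's running-average power limit: instantaneous draw may reach PL2 as long as an exponentially weighted moving average of power stays at or below PL1. This sketch estimates how long such a boost can last; the PL2 value (90W), tau (28s), and idle seed (5W) are illustrative guesses, not the 9900T's real configuration.

```python
import math

def turbo_seconds(pl1=35.0, pl2=90.0, tau=28.0, idle=5.0):
    """Approximate boost duration: time for the EWMA of power (seeded at
    an idle level) to rise to PL1 while the chip draws a constant PL2."""
    # EWMA while drawing constant PL2: avg(t) = pl2 + (idle - pl2) * exp(-t / tau)
    # Solve avg(t) = pl1 for t.
    return -tau * math.log((pl1 - pl2) / (idle - pl2))

print(round(turbo_seconds(), 1))  # roughly a dozen seconds of ~90W boost
```

So the "35W" figure only describes the long-run average; the cooler and VRMs still have to survive the ~90W bursts, which is exactly the complaint above.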
 

teejee

Senior member
Jul 4, 2013
336
176
116
While AMD is somewhat better in this regard, it's really long overdue for them to create some new standard metric that is actually useful.

For instance replace TDP with 3 different metrics:
Watts for all-core-load @ base clocks
Watts for single-core peak turbo (only really needed in mobile)
Watts for worst case Vector workload
But TDP (Thermal Design Power) is intended to be used to design proper cooling for a system. What you propose is a definition of how to specify actual power draw in three different usage scenarios.
It would be great if this were specified for all CPUs, but it shouldn't be called TDP.
It should rather be called something including "power draw..."
 
  • Like
Reactions: geegee83 and Tlh97

IntelUser2000

Elite Member
Oct 14, 2003
7,165
1,738
136
@Cardyak For desktops you make total sense, and I think this is where the same thought comes from.

But on laptops it doesn't really matter. There are good designs and bad ones, even within a similar price range, so designs will very much overlap each other. Ultimately they are limited by thermals, so they are not allowed to really exceed the PL1 TDP like they do on desktops. If they lie, or the CPU is set to have an infinite PL1 duration like on desktop chips, the laptop will thermal/power throttle and the performance ends up being no better.

It would be great if this were specified for all CPUs, but it shouldn't be called TDP.
It should rather be called something including "power draw..."
I don't know why Intel said that nonsense, because they are well aware themselves that TDP (specifically PL1, the long-duration limit) is what determines how the cooling should be designed. It's just marketing being crap again, like earlier today with the presentation.

What does "power draw" mean for laptops? Battery life? Clearly, except under heavy load (which doesn't last more than 2 hours on battery anyway), the CPU contributes very little to battery life, because it goes from peak to sleep so often. That's how you get 10+ hours of battery life on modern devices.
 
Last edited:
  • Like
Reactions: Tlh97 and uzzi38

uzzi38

Golden Member
Oct 16, 2019
1,151
1,919
96
What does "power draw" mean for laptops? Battery life? Clearly, except under heavy load (which doesn't last more than 2 hours on battery anyway), the CPU contributes very little to battery life, because it goes from peak to sleep so often. That's how you get 10+ hours of battery life on modern devices.
The CPU can contribute in how quickly it can ramp up and down alongside how quickly it can complete the short tasks waking the chip up from sleep, but otherwise agreed.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,165
1,738
136
The CPU can contribute in how quickly it can ramp up and down alongside how quickly it can complete the short tasks waking the chip up from sleep, but otherwise agreed.
That concept is known as HUGI, or Hurry Up and Get Idle. A more technical term is Race to Halt.

It's also not as simple as that. For example, on a low-TDP chip like Atom, the fixed system power dominates and there is less incentive to HUGI, because the CPU takes up a smaller share of total power; in that case it's better to operate at an optimal frequency. But on high-power chips, the CPU can finish its task and lower total power use by a much greater amount.

This level will vary tremendously between loads. And if you are doing something that takes an hour, or playing games, this HUGI concept doesn't matter. In that case the only way to increase battery life is reducing the TDP significantly, which will also have a great impact on performance.

The average power used when you account for the sleep state in between bursty workloads is called, well, average power. :)

But people, especially enthusiasts, don't like that term, because the average power varies so much based on workload. TDP is nice because it's a fixed number we can rely on.
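The race-to-halt trade-off above can be put in back-of-the-envelope terms: compare finishing a fixed amount of work quickly at high power and then sleeping, versus running slowly at an efficient frequency for the whole window. All the numbers here (powers, speeds, the unit of work) are invented for illustration.

```python
def energy_joules(active_w, idle_w, work_units, speed_units_per_s, window_s):
    """Total energy over a window: active draw while busy, idle draw after."""
    busy_s = work_units / speed_units_per_s
    assert busy_s <= window_s, "work must fit inside the window"
    return active_w * busy_s + idle_w * (window_s - busy_s)

window = 10.0  # seconds until the next burst of work arrives

# Race to halt: burst at 15W, finish in 1s, idle at 0.1W for the rest.
race = energy_joules(active_w=15.0, idle_w=0.1, work_units=1.0,
                     speed_units_per_s=1.0, window_s=window)

# Efficient crawl: 4W at half speed, finish in 2s, then idle.
steady = energy_joules(active_w=4.0, idle_w=0.1, work_units=1.0,
                       speed_units_per_s=0.5, window_s=window)

print(race, steady)  # whichever is lower wins for this particular workload
```

With these made-up numbers the slower run wins, which is the point being made: whether racing to halt saves energy depends entirely on how the active and idle power levels compare for that chip and workload.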
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
16,494
5,474
136
That's pretty surprising. I was expecting products shipping soon, not 2 months from now. Especially since it seems like they are blowing most of their 10 nm capacity on this.
I thought products were shipping this month? And that was essentially a two month delay already.
 
