[Rumor, Tweaktown] AMD to launch next-gen Navi graphics cards at E3


maddie

Diamond Member
Jul 18, 2010
5,204
5,614
136
Do please enlighten us how a 500W PSU can only output 500W yet can take in as much as 575W from the wall (based on the minimum spec for the 80 Plus rating, which starts at 85%). Where, oh where, have these electrons magically disappeared to?
In the case of a power supply, it converts wall voltage (high-voltage AC) input to a lower-voltage DC output. The switching and rectifying circuits use part of the input energy to do this work, and that part ends up as wasted heat. This is why some PSUs have more efficient circuits that reduce this loss, but in the macro world nothing is 100% efficient when you change states, so we always pay a price to do useful work, and this energy ends up as heat. The electrons don't disappear; they have to overcome electrical resistance, etc., in the switching circuits, and this is the energy lost by the power supply, released as heat: the circuits get hotter than ambient and radiate this to the surroundings.
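A back-of-envelope sketch of that energy balance (illustrative numbers only; the 500W output and 85% efficiency are examples, not a specific PSU):

```python
# Back-of-envelope energy balance for a switching PSU.
# Illustrative numbers only: 500 W DC output, 85% conversion efficiency.
dc_output_w = 500.0        # power delivered to the components
efficiency = 0.85          # fraction of wall power that survives conversion

wall_draw_w = dc_output_w / efficiency       # ~588 W pulled from the outlet
psu_heat_w = wall_draw_w - dc_output_w       # ~88 W lost inside the PSU as heat

print(f"Wall draw: {wall_draw_w:.0f} W")
print(f"Heat dissipated inside the PSU: {psu_heat_w:.0f} W")
print(f"Power (eventually also heat) delivered to the PC: {dc_output_w:.0f} W")
```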

The portion of the output sent to the CPU gets almost fully (99%+) converted to heat as the CPU does its computations. There is no other form of energy being released.

That's why I asked, where do you think the energy ends up? It has to go somewhere as it sure isn't being stockpiled somewhere in the CPU.

The CPU, for example, does not radiate EM waves (light emission, radio waves, etc.), does not produce sound energy, does not store any energy over time (no potential energy being stockpiled anywhere), is not creating mass, etc.

TDP is an estimate of what a user should plan for to properly cool a CPU. It should not be taken as what a CPU is using at any point in time, so saying that a CPU can use more or less than the TDP has nothing to do with what happens to the energy.
 

ubern00b

Member
Jun 11, 2019
171
75
61
So CPUs don't do this? They take 95W of electricity and then should convert it into 95W of heat expelled by the CPU? Now you're just repeating what I have been saying. Good backtrack, though.
 

Hitman928

Diamond Member
Apr 15, 2012
6,754
12,500
136
The CPU, for example, does not radiate EM waves (light emission, radio waves, etc.), does not produce sound energy, does not store any energy over time (no potential energy being stockpiled anywhere), is not creating mass, etc.

This part is not 100% true; EM waves are emitted by CPUs, but as you mentioned with your 99%+ comment, it's an insignificant portion of the power compared to that which is wasted as heat.

TDP is an estimate of what a user should plan for to properly cool a CPU. It should not be taken as what a CPU is using at any point in time, so saying that a CPU can use more or less than the TDP has nothing to do with what happens to the energy.

The confusion comes in from modern boost mechanics as well as some tricks Intel and AMD have been playing with TDP for a few years. Without getting too far into the physics of it, TDP should be taken as the average power use under a sustained load, over a thermally significant amount of time, at a given baseline frequency. If you provide more cooling than the listed TDP, both AMD and Intel chips will boost beyond that baseline frequency, thereby increasing the power use until the limits of the chip or the cooling solution are reached, whichever comes first.

So unless you turn off all the boosting functions, a 65 W CPU will consume more than 65 W given a cooling solution that can handle more than 65 W. If the cooling solution can only handle 65 W, that's the most the CPU will consume over a significant amount of time with a sustained load.
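A toy model of that behavior, with hypothetical limits (not any vendor's actual boost algorithm):

```python
# Toy model of the boost behavior described above (hypothetical numbers,
# not any vendor's actual algorithm).
def sustained_power_w(tdp_w, chip_limit_w, cooler_capacity_w, boost_enabled=True):
    """Steady-state package power under a sustained load (toy model)."""
    if not boost_enabled:
        return tdp_w                              # roughly the baseline-frequency power
    return min(chip_limit_w, cooler_capacity_w)   # boost until chip or cooler limit hits

# Hypothetical 65 W-rated CPU with a 90 W silicon/power limit:
print(sustained_power_w(65, 90, cooler_capacity_w=150))  # big cooler  -> 90
print(sustained_power_w(65, 90, cooler_capacity_w=65))   # 65 W cooler -> 65
```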
 

maddie

Diamond Member
Jul 18, 2010
5,204
5,614
136
This part is not 100% true; EM waves are emitted by CPUs, but as you mentioned with your 99%+ comment, it's an insignificant portion of the power compared to that which is wasted as heat.



The confusion comes in from modern boost mechanics as well as some tricks Intel and AMD have been playing with TDP for a few years. Without getting too far into the physics of it, TDP should be taken as the average power use under a sustained load, over a thermally significant amount of time, at a given baseline frequency. If you provide more cooling than the listed TDP, both AMD and Intel chips will boost beyond that baseline frequency, thereby increasing the power use until the limits of the chip or the cooling solution are reached, whichever comes first.

So unless you turn off all the boosting functions, a 65 W CPU will consume more than 65 W given a cooling solution that can handle more than 65 W. If the cooling solution can only handle 65 W, that's the most the CPU will consume over a significant amount of time with a sustained load.
I knew someone was going to split hairs. Joking :)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,227
126
You know a PSU efficiency rating means it takes more energy in than it outputs in heat? Hence 80%/85%/87%/92%, etc.?

This is incorrect. ALL of the power that enters a power supply is, EVENTUALLY, turned into heat energy. (Except for the minuscule amount given off as light, and as mechanical energy to turn fans.)

Some of the AC power (hence the efficiency rating) is turned into heat, and the REST powers the PC (as DC current). Of that DC current, pretty much all of it, with the aforementioned exceptions, GETS TURNED INTO HEAT.
 

maddie

Diamond Member
Jul 18, 2010
5,204
5,614
136
So CPUs don't do this? They take 95W of electricity and then should convert it into 95W of heat expelled by the CPU? Now you're just repeating what I have been saying. Good backtrack, though.
What?

This is what you wrote. You said that a substantial part does not get turned into heat. In your example, 28%.

"Let me break it down in simple terms. Ryzen 2600 needs a cooler that can dissipate 65w of thermal (heat) power away from the processor. In reality the processor can actually use more than 65w of energy probably closer to 90w of energy, though it only needs a cooler that can dissipate 65w of thermal as not all of what goes into the processor is turned into heat."
https://forums.anandtech.com/thread...vi-graphics-cards-at-e3.2564009/post-39853347

By the way, GPUs, also being computational circuits, have exactly the same behavior as CPUs.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,227
126
Let me break it down in simple terms. Ryzen 2600 needs a cooler that can dissipate 65w of thermal (heat) power away from the processor. In reality the processor can actually use more than 65w of energy probably closer to 90w of energy, though it only needs a cooler that can dissipate 65w of thermal as not all of what goes into the processor is turned into heat.
This is completely false. (The premise that a CPU can consume 90W and only dissipate 65W of heat.) (Edit: Assuming that the CPU doesn't have an internal power-storage device like a rechargeable battery, of course.)

Edit: I'm talking overall average, steady-state.

If 90W (steady-state) is going into the CPU, then it is going to need 90W of cooling (steady-state).

Where power draw differs from TDP is not some magical issue where the power is consumed and goes nowhere; it's basically a sort of "law of averages" thing. If a CPU temporarily spikes to 90W power draw, it will only barely budge TDP upwards. The problem is when the CPU is under a steady, heavy load and consumes 90W continuously; then it will need 90W of cooling.
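A quick illustration of that averaging point, with made-up numbers (not measured data):

```python
# Made-up power traces to illustrate the "law of averages" point:
# a one-second 90 W spike barely moves the minute-long average,
# while a continuous 90 W load needs 90 W of cooling.
spiky_trace_w = [45.0] * 59 + [90.0]     # mostly 45 W, one 90 W spike
sustained_trace_w = [90.0] * 60          # heavy load for the whole minute

average = lambda trace: sum(trace) / len(trace)
print(f"Spiky load average:     {average(spiky_trace_w):.1f} W")      # ~45.8 W
print(f"Sustained load average: {average(sustained_trace_w):.1f} W")  # 90.0 W
```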
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
Conservation is true 100% of the time. The CPU must dissipate most of the energy as heat, yes, but some goes out to other I/O devices; remember that data signals are energy. So some of the energy going into the CPU ends up being turned into heat somewhere else. The RAM, for example, is basically tons of capacitors that receive energy coming from the CPU, which is constantly being dissipated. There must be a way to calculate this, but I have no idea how, so I don't know how significant it is.
 

insertcarehere

Senior member
Jan 17, 2013
712
701
136
You cannot draw ANYTHING from what AMD has provided us, unless you are extremely biased towards Nvidia, or you are extremely biased against AMD.

In other words, AMD stated that the 40 CU Navi GPU is using 23% less power than a 40 CU Vega GPU. We haven't seen a 40 CU Vega GPU. But we have seen a 56 CU Vega GPU, and it has 225W power draw. Could it be possible that Navi has LOWER power draw than this?

In reality? No. You cannot draw ANY conclusions from what AMD provided about power efficiency of those GPUs.

In reality all of us have those same slides to read the tea leaves through. That same slide where Navi draws 23% less power than 'Vega 64' with 40 CUs also has it perform 14% better than presumably the same 40 CU Vega 64 (nothing in the slide notes clearly refutes this). Add to that the fact that we have no idea what clock speeds both GPUs are being run at, plus the sleight-of-hand trickery of the "performance per area" measure on the same slide (using a Vega 56 as the 'Vega 10', which isn't fully enabled), not to mention the 8+6 pin config on the 2070-rivalling 5700 XT. It's much easier to make a good argument that Turing retains the perf/watt crown than that Navi takes it.
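For what it's worth, here is the arithmetic the two readings imply (illustrative only; which baseline each footnote used is exactly what is in dispute):

```python
# Illustrative arithmetic only: why the slide figures don't pin down perf/watt.
# If BOTH footnotes had used the same baseline card, the implied gain would be:
perf_gain = 1.14           # "14% faster" than the baseline
power_ratio = 1 - 0.23     # "23% less power" than the baseline
print(f"Implied perf/W gain vs. one common baseline: {perf_gain / power_ratio:.2f}x")  # ~1.48x

# But if the power figure is against a different, cut-down 40 CU Vega whose
# board power was never published, Navi's absolute perf/W cannot be derived
# from the slide at all.
```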
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Yes, but you have not read the caveats. The 40 CU chip was 14% faster than the 64 CU Vega, and the 40 CU Navi chip was using 23% less power THAN THE 40 CU VEGA chip.

This might have been a typo. There's no way it could perform 14% faster than a 40CU Vega while at the same time performing like RTX 2070.
 

beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
So you do not have any proof for that; AMD has shown the best-case scenario, and this is only your assumption.

An assumption with lots of empirical proof backing it up and zero proof of what you claim (AMD sandbagging).
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Where power draw differs from TDP is not some magical issue where the power is consumed and goes nowhere; it's basically a sort of "law of averages" thing.

Yup. That's a good summary.

The confusion comes from two sources:
1. Intel/AMD trying to change the meaning to look good.
2. Older processors with less advanced thermal management


#1. When it says it has a TDP of 95W, then on average, as VirtualLarry points out, it has to consume no more than 95W. If it doesn't, it'll screw up thermal design. In desktops, people really oversize the HSF so it's capable of handling way above 95W. The manufacturers exploit this by marking it on the box as "95W" when it can easily go above this for sustained periods. A bad example of that is called Coffee Lake.

If you pull the same shenanigans in laptops, then you end up with throttling or overheating laptops, and/or manufacturers will simply tell you it's wrong.

Intel went as far as coercing the press to put out articles justifying their practice. https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo

A very long-winded way of saying "Our CPUs are really using PL2 power but we'll just put the 95W on the box".

They are blaming the motherboard manufacturers there but the source is very likely Intel. Problems always come from the top.

So now we have people again claiming that "TDP is not power consumption" when it should be.

#2. Back in the old days, you could rate the CPU at 95W, but certain workloads could exceed that if they were demanding enough. Some still have that mindset and haven't moved on.

Modern CPUs with sophisticated power management allow a 95W CPU to use no more than 95W. If the company is 100% honest about it, a 95W CPU may "burst" to 150W, but over time it has to come back down.
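A simplified sketch of the PL1/PL2-style behavior being described (made-up numbers, not Intel's actual implementation; real CPUs average power over a configurable time window rather than this simple check):

```python
# Simplified PL1/PL2-style limiter (illustrative only; real CPUs use an
# exponentially weighted moving average over a configurable time window).
PL1_W = 95.0    # sustained limit, the "95W on the box"
PL2_W = 150.0   # short-term burst limit

def allowed_power_w(running_average_w):
    """Allow bursting to PL2 while the running average stays under PL1."""
    return PL2_W if running_average_w < PL1_W else PL1_W

print(allowed_power_w(60.0))   # light recent load  -> can burst to 150.0
print(allowed_power_w(95.0))   # average at the cap -> clamped back to 95.0
```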

And let's go back to the Navi discussion, shall we?
 

DrMrLordX

Lifer
Apr 27, 2000
23,226
13,304
136
Does the NDA for Navi lift on 7-7-19?

Actually it may lift on July 1st. Most folks are thinking that's when the Matisse NDA lifts since that is the official begin date for pre-orders. Unless AMD isn't taking pre-orders for Navi . . .
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
This might have been a typo. There's no way it could perform 14% faster than a 40CU Vega while at the same time performing like RTX 2070.
Read it Once Again. Slowly.


The 40 CU Navi GPU was 14% faster than Vega 64 with 64 CUs. And at the same time, the 40 CU Navi chip used 23% less power than Vega 64 cut down to 40 CUs.
[Attached image: AMD slide footnotes]

RX - 362, RX - 365. RX 362 talks about Power draw, RX 365 talks about performance.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
An assumption with lots of empirical proof backing it up and zero proof of what you claim (AMD sandbagging).
It's funny, because I have not once written that AMD is sandbagging. Maybe you know more than me?

Also, try to explain AMD's footnotes about performance and power draw against Vega 64.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
In reality all of us have those same slides to read the tea leaves through. That same slide where Navi draws 23% less power than 'Vega 64' with 40 CUs also has it perform 14% better than presumably the same 40 CU Vega 64 (nothing in the slide notes clearly refutes this). Add to that the fact that we have no idea what clock speeds both GPUs are being run at, plus the sleight-of-hand trickery of the "performance per area" measure on the same slide (using a Vega 56 as the 'Vega 10', which isn't fully enabled), not to mention the 8+6 pin config on the 2070-rivalling 5700 XT. It's much easier to make a good argument that Turing retains the perf/watt crown than that Navi takes it.
You claim that you read those slides when you clearly have not read them.

AMD said clearly in the notes: the 40 CU Navi chip was 14% faster than Vega 64 with 64 CUs, and used 23% less power than Vega 64 with 40 CUs enabled.

Again, what can you draw from this about its power consumption?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
RX - 362, RX - 365. RX 362 talks about Power draw, RX 365 talks about performance.

Fair point.

However, why would they compare using a 40 CU part for power consumption and a 64 CU one for performance? And use that to claim 1.5x? It's a weird thing to intentionally disable CUs for comparison.

They messed up something in the footnotes.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
I let the quotes speak for themselves.
It is only you who is reading that into this post. I asked a simple question: what makes you believe they have shown the best-case scenario? It does not have any meaning behind it, unless you want to read something into it.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Fair point.

However, why would they compare using a 40 CU part for power consumption and a 64 CU one for performance? And use that to claim 1.5x? It's a weird thing to intentionally disable CUs for comparison.

They messed up something in the footnotes.

I don't think they are. They are just listing out the performance and power consumption tests separately, as they were most likely handled by different people, which would not be unusual in an R&D environment. It would explain why different PCs were used. Yes, it's not the best way to test things, but nobody ever claimed R&D was efficient, and they most likely were doing this at the last moment.
 

insertcarehere

Senior member
Jan 17, 2013
712
701
136
Read it Once Again. Slowly.


The 40 CU Navi GPU was 14% faster than Vega 64 with 64 CUs. And at the same time, the 40 CU Navi chip used 23% less power than Vega 64 cut down to 40 CUs.
[Attached image: AMD slide footnotes]

RX - 362, RX - 365. RX 362 talks about Power draw, RX 365 talks about performance.

Read the notes and the slides referencing them yourself; the tests were clearly not done in an apples-to-apples manner, since the test rig for power draw (RX - 358) is a different rig from the one testing for performance (RX - 365). Heck, they don't even reference the card in the former as an RX 5700 XT, merely as a 'Navi_10' chip with 40 CUs.
[Attached image: David Wang, Next Horizon Gaming, Radeon architecture slide]
If what you said is actually true (the 40 CU Navi GPU was 14% faster than Vega 64 with 64 CUs, and with the same clocks the same 40 CU Navi chip used 23% less power than Vega 64 cut down to 40 CUs), AMD are not being upfront with the information, to say the least, and it certainly does not explain an 8+6 pin OEM 5700 XT when 3/4 of a 40 CU Vega at sane clocks should be a ~150W card, if even that.
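A hedged back-of-envelope version of that power estimate (the linear CU scaling and the 225W Vega 56 figure quoted earlier in the thread are crude assumptions, not measurements):

```python
# Back-of-envelope check of the ~150 W estimate (crude assumptions, clearly labeled):
# - Vega 56 board power ~225 W (figure quoted earlier in the thread)
# - power assumed to scale roughly linearly with enabled CUs at fixed clocks (rough!)
vega56_power_w = 225.0
vega_40cu_power_w = vega56_power_w * 40 / 56          # ~161 W, crude estimate
navi_power_w = vega_40cu_power_w * (1 - 0.23)         # "23% less" -> ~124 W

pcie_8_plus_6_pin_budget_w = 150 + 75 + 75            # 8-pin + 6-pin + slot, ~300 W
print(f"Estimated Navi board power:          {navi_power_w:.0f} W")
print(f"8+6 pin plus slot can deliver up to: {pcie_8_plus_6_pin_budget_w} W")
```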
 