Question Speculation: RDNA2 + CDNA Architectures thread

Page 67

uzzi38

Platinum Member
Oct 16, 2019
2,565
5,572
146
All die sizes are within 5mm^2. The poster here has been right about some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
The front fins are tied into the vapour chamber, but the stack height is much smaller than at the rear fan. Even if the airflow wasn't obstructed by the PCB, you would need greater airflow to provide the same cooling through the front fin array, which means more RPM and thus more noise. The airflow is obstructed by the PCB and the shroud though, and has to take a hard 90 and travel in that thin stack between the fan and the board, along all the high z-height components, and then up and around the HDMI ports and out the back. If you tried to provide equal cooling, the exhaust air would be a loud, high-velocity turbulent mess and the fan would be spinning like a banshee trying to force it all through, while the back fan spins slow and easy. The best compromise for low acoustics would be to have both fans run in a way that produces the same amount of noise, as long as thermal constraints on all the subsystems are met.

In the case of an axial fan, the air is always obstructed by the PCB and has to take a "90 degree turn"; the same applies to AMD's "new" cooler. The only thing we have to focus on is how much total heat is going out either way, not the amount of air in CFM. If the CFM of the back fan is half that of the front, it may still carry the same amount of heat energy due to absorbing more heat during its time in contact with the fins. They would not have put a fan there if the amount of heat dissipated was minuscule. Of course this would be optimal in a positive pressure case, and of course a negative pressure case could have a very negative impact on this type of setup. As I see it, the more heat you could push out as fast as possible, the better. There is nothing positive about recirculating hot air inside the case. Nvidia's half-and-half approach still makes a ton of sense.
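A rough way to sanity-check the "half the CFM can still carry the same heat" point is the standard airstream relation Q = m_dot * c_p * dT. The sketch below is purely illustrative; the flow rates and temperature rises are made-up assumptions, not measurements of any of these coolers.

```python
# Back-of-envelope sketch: heat carried away by an airstream is
#   Q = m_dot * c_p * dT, with m_dot = rho * volumetric_flow.
# All numbers are illustrative assumptions, not cooler measurements.

RHO_AIR = 1.2             # kg/m^3, air density at roughly room temperature
CP_AIR = 1005.0           # J/(kg*K), specific heat of air
CFM_TO_M3S = 0.000471947  # 1 CFM expressed in m^3/s

def heat_removed_w(cfm: float, delta_t: float) -> float:
    """Heat (watts) carried by an airstream of `cfm` that warms up by `delta_t` kelvin."""
    m_dot = RHO_AIR * cfm * CFM_TO_M3S
    return m_dot * CP_AIR * delta_t

# Front fan: high flow, modest temperature rise.
front = heat_removed_w(cfm=30, delta_t=15)
# Back fan: half the flow, but the exhaust leaves twice as hot.
back = heat_removed_w(cfm=15, delta_t=30)

print(f"front: {front:.0f} W, back: {back:.0f} W")  # both ~256 W
```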
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,683
136
Of course this would be optimal in a positive pressure case, and of course a negative pressure case could have a very negative impact on this type of setup. As I see it, the more heat you could push out as fast as possible, the better.
Again with this. Air pressure inside the case doesn't matter in this scenario, whatever air is pushed outside through the back vents does so due to pressure from the video card fan.

Nvidia's half-and-half approach still makes a ton of sense.
Nvidia's approach does make a lot of sense, but not for the reasons you invoke. Air going out through the back of the case is only a fraction of the air that cools the card. It's a good idea to let air go out that route, but this does not significantly change the amount of heat that the video card dumps inside the case.

The biggest strength of this cooler design still lies in the unobstructed second fan, which prevents the air temperature from rising in the lower part of the case, since most of the card's heat is immediately pushed up.
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
Again with this. Air pressure inside the case doesn't matter in this scenario, whatever air is pushed outside through the back vents does so due to pressure from the video card fan.


Nvidia's approach does make a lot of sense, but not for the reasons you invoke. Air going out through the back of the case is only a fraction of the air that cools the card. It's a good idea to let air go out that route, but this does not significantly change the amount of heat that the video card dumps inside the case.

But of course the pressure level inside the case is going to affect how well the GPU can push air out the back. If you have a couple of large exhaust fans in the case but skipped the intake fans completely, which I have had in the past due to a low-noise build, the computer is going to try to suck air in through whatever hole there is (dust in USB ports, etc.). The GPU is not going to perform as optimally as if you had 4x 14cm fans on the intake and one or two 12cm exhaust fans ("positive pressure"). In that case, we would help the GPU push air outside, not risking sucking any of it back into the case.

The biggest strength of this cooler design still lies in the unobstructed second fan, which prevents the air temperature from rising in the lower part of the case, since most of the card's heat is immediately pushed up.

I agree, the air going out the back is going to be relatively low in volume, but hotter than the air from the front fan. The front fan is going to be able to push a higher volume at a given time, and therefore not pick up as much heat per volume. So the inside fan is going to produce air which is a couple of degrees hotter, but in larger amounts, and the back fan is going to push out a smaller volume than the front, but that air has probably absorbed more heat during its time in contact with the fins and is going to be several degrees hotter.

If no reviewer is going to FLIR the shit out of this cooler, I'm willing to sacrifice myself! :D
 
  • Like
Reactions: lightmanek

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,683
136
The front fan is going to be able to push a higher volume at a given time, and therefore not pick up as much heat per volume. So the inside fan is going to produce air which is a couple of degrees hotter, but in larger amounts, and the back fan is going to push out a smaller volume than the front, but that air has probably absorbed more heat during its time in contact with the fins and is going to be several degrees hotter.
That is just poor understanding of physics. Amount of heat absorbed by air molecules per unit of time scales linearly with temperature delta between air and heatsink surface. You do not remove the same amount of heat by running air slower through a heatsink just because air has time to get hotter. Running air slower through a heatsink is subject to massive diminishing returns, that is unless the heatsink is doing a very poor job at transferring heat.
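To put a number on that diminishing-returns point, here is a minimal sketch using the single-stream heat-exchanger relation T_out = T_s - (T_s - T_in) * exp(-hA / (m_dot * c_p)); the fin temperature, hA value, and flow rates are all assumed for illustration, not taken from any real card.

```python
import math

# Illustrative sketch: slowing the airflow makes the exhaust hotter, but the
# total heat removed still drops, because the outlet temperature can only
# approach the fin temperature. All parameter values are assumptions.

CP_AIR = 1005.0   # J/(kg*K), specific heat of air
T_IN = 30.0       # deg C, air entering the fins (assumed)
T_S = 70.0        # deg C, effective fin surface temperature (assumed)
HA = 15.0         # W/K, overall h*A of the fin stack (assumed)

def heat_removed_w(m_dot: float) -> float:
    """Heat pulled out of the fins (W) for a given air mass flow (kg/s)."""
    t_out = T_S - (T_S - T_IN) * math.exp(-HA / (m_dot * CP_AIR))
    return m_dot * CP_AIR * (t_out - T_IN)

for m_dot in (0.020, 0.010, 0.005):   # full flow, half, quarter
    print(f"m_dot={m_dot:.3f} kg/s -> Q={heat_removed_w(m_dot):.0f} W")
# ~423 W, ~312 W, ~191 W: the hotter exhaust at low flow does not
# make up for the lost mass flow.
```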

If you have a couple of large exhaust fans in the case but skipped the intake fans completely, which I have had in the past due to a low-noise build, the computer is going to try to suck air in through whatever hole there is (dust in USB ports, etc.). The GPU is not going to perform as optimally as if you had 4x 14cm fans on the intake and one or two 12cm exhaust fans ("positive pressure").
The video card fan locally creates far more pressure, which negates the dynamics of the case for the purpose of pushing air out the back. Moreover, a negative pressure case with fans pushing air out the top and air being allowed to come in through the bottom would actually be quite beneficial for the Nvidia cooler.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Just wanted to say I can't believe J2C was such a child that he decided to leak sensitive images because he was annoyed by a red line.

Bloody hell man. Bloody hell.
He did make it obvious that AMD have basically taken an Nvidia 2xxx founders edition cooler, stuck Radeon over the Nvidia logo and given it the "gamer" look (i.e. all those stealth fighter angles and etched lines also found in "gamer" laptops and monitors).

(nicked from DannyD @ guru3d)
 

Karnak

Senior member
Jan 5, 2017
399
767
136
He did make it obvious that AMD have basically taken an Nvidia 2xxx founders edition cooler, stuck Radeon over the Nvidia logo and given it the "gamer" look (i.e. all those stealth fighter angles and etched lines also found in "gamer" laptops and monitors).
He didn't care about Turing's FE in that regard so...

 

kurosaki

Senior member
Feb 7, 2019
258
250
86
Amount of heat absorbed by air molecules per unit of time scales linearly with temperature delta between air and heatsink surface. You do not remove the same amount of heat by running air slower through a heatsink just because air has time to get hotter.



Running air slower through a heatsink is subject to massive diminishing returns, that is unless the heatsink is doing a very poor job at transferring heat.

The "front" fan is acting almost in the same way as a fan on a radiator or CPU tower would: pushing air through 1-3 cm fins at a rather high pace with quite little obstruction besides the fins. The "back" fan is acting as a regular GPU fan that has to push the air out at an angle, which in itself causes a pressure loss. The air is, as you said, going to be warmer. Does that mean diminishing returns? Of course not. You put it that way yourself: the dissipation of heat is a linear function of the delta. As long as the delta is kept reasonably high, the cooling is going to be good. Does that mean it's going to be as effective as the fan pushing air straight through the fins? No. But we can't have a PCB-less card now, can we?

Again, Nvidia wouldn't have put a fan on the back just for diminishing returns.

🤦‍♂️ I'll just have to buy a 30xx to prove a point now, don't I?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,716
7,005
136
He didn't care about Turing's FE in that regard so...


- I had the old XFX 7950 Ghost that followed this design and it was by far the nicest looking card I've ever owned.

A shame companies like XFX and Evga have moved away from the clean understated look to the cluttered gamer jank look.

As far as RDNA2 goes, I don't mind the shroud design, although it does look a bit like a poor man's Founders Edition. There are some renders of the 2-fan cooler (which was the original teaser shot of the shroud way back when), and it looks perfectly serviceable as well.
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
- I had the old XFX 7950 Ghost that followed this design and it was by far the nicest looking card I've ever owned.

A shame companies like XFX and Evga have moved away from the clean understated look to the cluttered gamer jank look.

As far as RDNA2 goes, I don't mind the shroud design, although it does look a bit like a poor man's Founders Edition. There are some renders of the 2-fan cooler (which was the original teaser shot of the shroud way back when), and it looks perfectly serviceable as well.

Honestly, I really like the clean look of my "stock" blower 5700xt with the slight bend to the shroud and the backlit RADEON text. It looks great in my case. I have all the text on my mobo set to match in Radeon Red and got my ARGB fans set to "please the 12 year old version of myself" mode because pandemic.
 

eek2121

Platinum Member
Aug 2, 2005
2,904
3,906
136
AMD’s 3080 equivalent should edge out the 3080 by between 5 and 20% (depending on the game) if all of the various tidbits of information I have are correct.

Don’t expect a big price difference between either card.

AMD also apparently has something for the 3090, but it isn’t dropping yet.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
AMD’s 3080 equivalent should edge out the 3080 by between 5 and 20% (depending on the game) if all of the various tidbits of information I have are correct.

Don’t expect a big price difference between either card.

AMD also apparently has something for the 3090, but it isn’t dropping yet.
Lol. Please. AMD can't even beat their own previous-gen cards, so how will they beat Nvidia? The RX 580 is still much better than the overpriced 5500 XT. Maybe AMD will beat the 3090 with RDNA 3 on 5nm, but I still doubt it.
 
  • Haha
Reactions: spursindonesia

uzzi38

Platinum Member
Oct 16, 2019
2,565
5,572
146
Lol. Please. AMD can't even beat their own previous-gen cards, so how will they beat Nvidia? The RX 580 is still much better than the overpriced 5500 XT. Maybe AMD will beat the 3090 with RDNA 3 on 5nm, but I still doubt it.
Damn, you got RDNA2 figured out, AMD were never intending on beating the 5700XT, they can't beat their last gen card after all.
 

eek2121

Platinum Member
Aug 2, 2005
2,904
3,906
136
Lol. Please. AMD can't even beat their own previous-gen cards, so how will they beat Nvidia? The RX 580 is still much better than the overpriced 5500 XT. Maybe AMD will beat the 3090 with RDNA 3 on 5nm, but I still doubt it.
I don’t normally feed trolls, but which cards are you referring to?

You don’t have to believe me, but be prepared to be disappointed. Keep in mind I nailed the 3080 positioning many months ago.
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
If you don't mind me asking, are you a small biz retailer?

Not to derail the thread earlier, but I can't seem to PM you - yes. PM if you want/need more details, but as life gets stressful these forums are my go to for escapism. :cool:

Thankfully, there is some action going on. Earlier this spring/summer it was way too early to get excited about these launches, but we are in the thick of it now!

FWIW, the 5500 XT in my mind was always supposed to be the replacement for the RX 460/560, which it of course obliterates.

Given we never had a full-stack Vega refresh, we milked those for so long....

The RX 480/580, when they came out in 2016, were about $200-$240, and so are more the 5600 XT segment. The R9 380s and 390s are so competitive that... wait, you can't buy those anymore?

Polaris and the islands cards were around for so darn long. Man.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
I don’t normally feed trolls, but which cards are you referring to?

You don’t have to believe me, but be prepared to be disappointed. Keep in mind I nailed the 3080 positioning many months ago.
I don't know what you mean, but AMD fans always hype things up to an absurd degree and it always falls flat. All the Vega hype killed it. It would be better if you said the biggest Navi will not be able to beat the 3070, so that when it comes close to the 3080, at least that will be a positive surprise.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,248
136
Lol. Please. AMD can't even beat their own previous-gen cards, so how will they beat Nvidia? The RX 580 is still much better than the overpriced 5500 XT. Maybe AMD will beat the 3090 with RDNA 3 on 5nm, but I still doubt it.


I don't know what you mean, but AMD fans always hype things up to an absurd degree and it always falls flat. All the Vega hype killed it. It would be better if you said the biggest Navi will not be able to beat the 3070, so that when it comes close to the 3080, at least that will be a positive surprise.

And your role in this is what? Twisting nipples? Why not do your part and refrain from speaking unneeded nonsense? Waiting for the official announcement/reviews before passing judgement is what the logical person will do.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
And your role in this is what? Twisting nipples? Why not do your part and refrain from speaking unneeded nonsense? Waiting for the official announcement/reviews before passing judgement is what the logical person will do.
No, sorry, you're right. I just got a bit mad when he mentioned AMD has a 3090 competitor, as history has shown AMD has been unable to compete at the high end after many years of promising and failing. But yes, I agree, let's wait for reviews and benchmarks before passing judgment.
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
And your role in this is what? Twisting nipples? Why not do your part and refrain from speaking unneeded nonsense? Waiting for the official announcement/reviews before passing judgement is what the logical person will do.

Who wants their nipples twisted!?!?

Ah, this called forth the mustache ride scene from Super Troopers. 😂

I don’t want my nipples twisted, fwiw, but thanks for the chuckle.
 

Glo.

Diamond Member
Apr 25, 2015
5,657
4,409
136
I don't know what you mean, but AMD fans always hype things up to an absurd degree and it always falls flat. All the Vega hype killed it. It would be better if you said the biggest Navi will not be able to beat the 3070, so that when it comes close to the 3080, at least that will be a positive surprise.
It doesn't take a 150 IQ to figure out that a GPU with 100% more CUs, 25% higher core clocks, and at least 50% more memory bandwidth than the RDNA1 GPU (RX 5700 XT) will be much faster than a GPU that is only ~45% faster than that RDNA1 card (the RTX 2080 Ti).

How a GPU with over 2x the performance of the RX 5700 XT would lose to the RTX 3070, which, judging by RTX 3080 reviews and SM scaling, won't even be faster than the RTX 2080 Ti, is beyond me.
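As a rough illustration of the arithmetic in this argument, a naive throughput estimate (CUs x clock, with an assumed CU-scaling efficiency, and memory bandwidth ignored) looks like the sketch below. The 45% figure for the 2080 Ti is the one quoted above; the clock value and the 85% scaling factor are assumptions.

```python
# Naive scaling sketch, not leaked data: relative throughput ~ CUs * clock,
# with extra CUs assumed to scale at less than 100% efficiency.

RX_5700XT = {"cus": 40, "clock_ghz": 1.9}          # RDNA1 baseline (approx. clock)
BIG_NAVI  = {"cus": 80, "clock_ghz": 1.9 * 1.25}   # rumoured: 2x CUs, +25% clocks

def relative_perf(card: dict, baseline: dict, cu_scaling: float = 0.85) -> float:
    """Estimate: clocks scale ~linearly, added CUs scale at `cu_scaling` efficiency."""
    cu_gain = (card["cus"] / baseline["cus"] - 1) * cu_scaling + 1
    clock_gain = card["clock_ghz"] / baseline["clock_ghz"]
    return cu_gain * clock_gain

big_navi_vs_5700xt = relative_perf(BIG_NAVI, RX_5700XT)   # ~2.3x
rtx_2080ti_vs_5700xt = 1.45                               # figure quoted in the post

print(f"Big Navi vs 5700 XT: ~{big_navi_vs_5700xt:.2f}x")
print(f"Big Navi vs 2080 Ti: ~{big_navi_vs_5700xt / rtx_2080ti_vs_5700xt:.2f}x")
```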
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
It doesn't take a 150 IQ to figure out that a GPU with 100% more CUs, 25% higher core clocks, and at least 50% more memory bandwidth than the RDNA1 GPU (RX 5700 XT) will be much faster than a GPU that is only ~45% faster than that RDNA1 card (the RTX 2080 Ti).

How a GPU with over 2x the performance of the RX 5700 XT would lose to the RTX 3070, which, judging by RTX 3080 reviews and SM scaling, won't even be faster than the RTX 2080 Ti, is beyond me.

He isn't saying it will perform like 3070, he is just saying it would be better to think that, so when it delivers ~3080 performance it will feel like a pleasant surprise.

If instead you start by thinking it's going to deliver 3090 performance, then ~3080 performance will be a disappointment.
 
  • Like
Reactions: Martimus

Glo.

Diamond Member
Apr 25, 2015
5,657
4,409
136
He isn't saying it will perform like 3070, he is just saying it would be better to think that, so when it delivers ~3080 performance it will feel like a pleasant surprise.

If instead you start by thinking it's going to deliver 3090 performance, then ~3080 performance will be a disappointment.
It won't perform like an RTX 3080, unless you're talking about the 72 CU version that is cut down from N21. Then maybe.
 
  • Like
Reactions: Tlh97

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
It won't perform like an RTX 3080, unless you're talking about the 72 CU version that is cut down from N21. Then maybe.

No question 80 CUs could beat the 3080 if you ignore power usage.

But we have no idea how much AMD will have to throttle 80 CUs to keep power in check.

You can't simply double Navi 10 without considering that doing so also doubles power usage to around 450 W. That's a lot of power to trim with minimal help from the process.
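To put rough numbers on how much trimming that would take: dynamic power scales roughly with f * V^2, so modest clock and voltage reductions recover a lot of power. The 450 W starting point is the naive doubling from the post; the target and the scaling factors below are assumptions.

```python
# Sketch of the power-trimming argument. Dynamic power ~ f * V^2, and voltage
# generally has to drop along with frequency. All targets/scales are assumptions.

DOUBLED_NAVI10_W = 450.0   # naive "2x Navi 10" power from the post above
TARGET_W = 300.0           # a plausible board-power target (assumption)

def power_after_downclock(base_w: float, clock_scale: float, volt_scale: float) -> float:
    """Dynamic power after scaling clock and voltage (P ~ f * V^2)."""
    return base_w * clock_scale * volt_scale ** 2

# Assume a ~10% clock cut also allows a ~10% voltage cut.
print(f"~10% lower f/V: {power_after_downclock(DOUBLED_NAVI10_W, 0.90, 0.90):.0f} W")  # ~328 W

# If frequency and voltage scale down together, hitting the target needs roughly:
needed_scale = (TARGET_W / DOUBLED_NAVI10_W) ** (1 / 3)
print(f"uniform f/V scale for {TARGET_W:.0f} W: ~{needed_scale:.2f}")  # ~0.87
```

Any real perf/W gain from the RDNA2 architecture or the refined process would, of course, reduce how much clock has to be given back.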

Marketing and rumors aren't ironclad truth, so they can't answer that question either.

The final proof comes when the cards are finalized and in third-party hands. This is all going to come down to power usage and how it is kept in check.