Question Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
All die sizes are within 5mm^2. The poster here has been right on some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still though, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

linkgoron

Platinum Member
Mar 9, 2005
2,409
979
136
I think the only reason AMD goes over 300W is if they intend to take the performance crown. Losing by 10% at 250W is better than losing by 5% at 320W.
That's not what they did with Vega. Vega was pushed way past the sweet spot of its perf/watt curve just to make it more competitive with the 1080, which wasn't even the fastest card on the market at the time. Vega's perf/watt actually wasn't that bad with lower clocks (and indeed it's still working well for APUs).

I have a feeling that they'll probably release a liquid cooled Navi to either inch out the 3080 or 3090 anyway (depending on where the 6800XT's performance falls)
 

Graphenewhen

Junior Member
Oct 13, 2020
15
15
51
I'm all for AMD beating the 3090, even if I suspect Nvidia will do anything to retain the halo product benefit of having the fastest card in a generation. I half expect Nvidia to release a $2000 card if that's what it takes (I have no idea if the 3090 is a full die and at full speeds).

I hope the 6800XT is a solid competitor for the 3080; is that what seems likely?
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
Just leave it guys. Very first post on AnandTech forums... and it's just flamebait :rolleyes:

Normally we welcome new members, but it's looking like an exception should be made this time. Probably an alter ego anyway.

I wonder how many gamers pause and look around at reflections while playing? Seems like you could find these anomalies on just about any platform if you look hard enough.
 

Hitman928

Diamond Member
Apr 15, 2012
6,187
10,694
136
What's a "cope"?

cope
noun

Definition of cope (Entry 2 of 4)

1 : a long enveloping ecclesiastical vestment. "The priest wore a cope for the benediction."

 

Veradun

Senior member
Jul 29, 2016
564
780
136
cope
noun

Definition of cope (Entry 2 of 4)

1 : a long enveloping ecclesiastical vestment. "The priest wore a cope for the benediction."

Now I'm even more confused, thanks :>
 

Hitman928

Diamond Member
Apr 15, 2012
6,187
10,694
136
Now I'm even more confused, thanks :>

I was just answering facetiously; pretty sure @DDH was using the verb form of the word as a noun, like the kids like to do these days. Here's a corrected statement to make it more clear (hopefully).

We're 7 days away and the attempts to cope on display here are reaching levels previously thought unattainable.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,760
1,455
136
That's not what they did with Vega. Vega was pushed way past the sweet spot of its perf/watt curve just to make it more competitive with the 1080, which wasn't even the fastest card on the market at the time. Vega's perf/watt actually wasn't that bad with lower clocks (and indeed it's still working well for APUs).

When you have an uncompetitive architecture, you clock it higher than you should.

Vega was always going to lose in perf/w, so sacrificing wattage to look less bad in performance benchmarks only made sense.

Exchanging a massive lead in perf/w and a small loss in absolute perf for a small lead in perf/w and a smaller loss in absolute performance is another matter altogether, and altogether a poor trade.

That said, I think the "Rage mode" thing would make sense in that AMD could trumpet their perf/w lead using one mode, and their performance metrics using the other. Best of both worlds.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
When you have an uncompetitive architecture, you clock it higher than you should.

Ampere was always going to lose in perf/w, so sacrificing wattage to look less bad in performance benchmarks only made sense.

Exchanging a massive lead in perf/w and a small loss in absolute perf for a small lead in perf/w and a smaller loss in absolute performance is another matter altogether, and altogether a poor trade.
Fixed that for ya.

I'm sorry...
 

PhoBoChai

Member
Oct 10, 2017
119
389
106
Normally we welcome new members, but it's looking like an exception should be made this time. Probably an alter ego anyway.

I wonder how many gamers pause and look around at reflections while playing? Seems like you could find these anomalies on just about any platform if you look hard enough.

I don't like attacking the man, but as for his arguments about missing reflections and low-resolution reflections, he should actually look at other games that implement RT reflections and notice... it's the same.

Nobody does 1:1 res reflections in games because even 1:2 and 1:4 res already kills performance. And not everything is going to be in the reflection either; given the performance budgets, devs must pick and choose what gets reflected.

For the Spider-Man example, the thing ppl should be aware of is the extremely distant reflections of the cityscape. This is not normally done in other games because distant reflections are extra demanding on BVH traversal; ray depth hurts performance badly. The fact that it's done in a PS5 console game shows it's very capable at accelerating BVH traversal.
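To put rough numbers on the resolution point, here's a minimal sketch (the 4K render resolution and one-reflection-ray-per-pixel figure are my own assumptions, purely for illustration) of how much BVH traversal work the reduced-res buffers save:

```python
# Rough, hypothetical sketch: reflection ray count (and thus BVH traversal
# work) scales with the reflection buffer's pixel count, assuming one
# reflection ray per pixel and no temporal reuse.
def reflection_rays(width, height, divisor):
    """Rays per frame for a reflection buffer rendered at 1:divisor per axis."""
    return (width // divisor) * (height // divisor)

W, H = 3840, 2160                      # assumed 4K render resolution
print(reflection_rays(W, H, 1))        # 1:1 -> ~8.3M rays/frame
print(reflection_rays(W, H, 2))        # 1:2 -> ~2.1M rays/frame (4x fewer)
print(reflection_rays(W, H, 4))        # 1:4 -> ~0.5M rays/frame (16x fewer)
```

Capping how far (and through how many bounces) the reflection rays are allowed to travel is the other common lever, which is why the long-range cityscape reflections in the Spider-Man demo stand out.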
 

linkgoron

Platinum Member
Mar 9, 2005
2,409
979
136
When you have an uncompetitive architecture, you clock it higher than you should.

Vega was always going to lose in perf/w, so sacrificing wattage to look less bad in performance benchmarks only made sense.

I agree that (obviously) Navi 2x will end up much more competitive than Vega, and the leadership in graphics (no Raja) is different now. Navi 10 was already competitive, and Ampere is one of the worst node shrinks Nvidia has ever had. I'm just saying that AMD were willing to release Vega 64 in balanced mode for a 1% advantage at 4K over the 1080, when power-saver mode was ~4% slower but 31% more efficient than balanced mode. I'm not really sure it was worth it.
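Quick back-of-envelope on those two figures (the absolute wattage below is my own assumption, not something from the post):

```python
# Sanity check of the quoted numbers: power-saver mode is ~4% slower but
# ~31% better in perf/W than balanced mode.
perf_ratio = 0.96                 # power-saver performance vs. balanced
perf_per_watt_ratio = 1.31        # power-saver perf/W vs. balanced
power_ratio = perf_ratio / perf_per_watt_ratio
print(f"power-saver draws ~{power_ratio:.0%} of balanced mode's power")  # ~73%
# Assuming ~295 W typical board power in balanced mode (an assumed figure),
# that's roughly 215 W for only a ~4% performance loss.
```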

That said, I think the "Rage mode" thing would make sense in that AMD could trumpet their perf/w lead using one mode, and their performance metrics using the other. Best of both worlds.
I'm not a huge fan of "modes". I think it makes the PR "message" more complex. People go into reviews and want to see a number... You can use multiple numbers in the Navi review, but it makes the review more confusing IMO. Even if it isn't, reviewers will take the "default" numbers in follow-up reviews for other graphics cards, making the card appear less performant, and I don't really see reviewers testing every AIB card in both Rage mode and default mode.
 

PhoBoChai

Member
Oct 10, 2017
119
389
106
I agree that (obviously) Navi 2x will end up much more competitive than Vega, and the leadership in graphics (no Raja) is different now. Navi 10 was already competitive, and Ampere is one of the worst node shrinks Nvidia has ever had.

Why do ppl disrespect Raja? Read the AnandTech article on his return to AMD. Mark Papermaster asked him to develop a brand new, post-GCN architecture. Very similar goals to Jim Keller being hired back to AMD to develop a brand new, post-FX CPU architecture: Zen.

Raja delivered them Navi/RDNA, despite AMD going through very tough financial problems during the 2014-2017 period. Ppl can go back and look at the forums from then; lots of ppl actually expected AMD to go bankrupt. RTG didn't get good funding until the end of 2017, when Zen's success returned AMD to profitability.

A shoestring budget, while iterating GCN and trying to create a new architecture, is going to cause issues and delays. They even tried to back-port RDNA's Primitive Shaders into the last GCN part, Vega, without success. But clearly you can see attempts were made to improve GCN's graphics performance.

RDNA2 is the result of AMD being in a great position again, with money rolling in from 2018 onwards. Heard the tech press talk about Suzanne Plummer's Zen team heading over to RTG to help them develop RDNA 2? Extra R&D, more funding, and it's going to show in RDNA2.

Raja went to Intel and ppl laughed it off, saying Intel had no chance to be competitive in graphics since they were so far behind. Yet Tiger Lake with its Xe iGPU is trading blows with Zen 2 APUs in performance. In a single generation Intel managed to come back from what, 3 gens behind in graphics performance?
 

zinfamous

No Lifer
Jul 12, 2006
111,165
30,117
146
Why do ppl disrespect Raja? Read the AnandTech article on his return to AMD. Mark Papermaster asked him to develop a brand new, post-GCN architecture. Very similar goals to Jim Keller being hired back to AMD to develop a brand new, post-FX CPU architecture: Zen.

Raja delivered them Navi/RDNA, despite AMD going through very tough financial problems during the 2014-2017 period. Ppl can go back and look at the forums from then; lots of ppl actually expected AMD to go bankrupt. RTG didn't get good funding until the end of 2017, when Zen's success returned AMD to profitability.

A shoestring budget, while iterating GCN and trying to create a new architecture, is going to cause issues and delays. They even tried to back-port RDNA's Primitive Shaders into the last GCN part, Vega, without success. But clearly you can see attempts were made to improve GCN's graphics performance.

RDNA2 is the result of AMD being in a great position again, with money rolling in from 2018 onwards. Heard the tech press talk about Suzanne Plummer's Zen team heading over to RTG to help them develop RDNA 2? Extra R&D, more funding, and it's going to show in RDNA2.

Raja went to Intel and ppl laughed it off, saying Intel had no chance to be competitive in graphics since they were so far behind. Yet Tiger Lake with its Xe iGPU is trading blows with Zen 2 APUs in performance. In a single generation Intel managed to come back from what, 3 gens behind in graphics performance?

I think people disrespect Raja mostly because he has a big stupid mouth that says very stupid things that make the marketing wing cringe, and they pretty much just have to pivot around and use his nonsense because it's already out there. Putting the cart before the horse sort of thing.

Also, Vega wasn't a good look for a while (mainly because Raja was very loud and very stupid about it), and whatever he had to do with RDNA, I think that will quickly filter through into forgotten territory over the next 2 or 3 generations.
 

linkgoron

Platinum Member
Mar 9, 2005
2,409
979
136
Why do ppl disrespect Raja? Read the AnandTech article on his return to AMD. Mark Papermaster asked him to develop a brand new, post-GCN architecture. Very similar goals to Jim Keller being hired back to AMD to develop a brand new, post-FX CPU architecture: Zen.

Raja delivered them Navi/RDNA, despite AMD going through very tough financial problems during the 2014-2017 period. Ppl can go back and look at the forums from then; lots of ppl actually expected AMD to go bankrupt. RTG didn't get good funding until the end of 2017, when Zen's success returned AMD to profitability.

A shoestring budget, while iterating GCN and trying to create a new architecture, is going to cause issues and delays. They even tried to back-port RDNA's Primitive Shaders into the last GCN part, Vega, without success. But clearly you can see attempts were made to improve GCN's graphics performance.

RDNA2 is the result of AMD being in a great position again, with money rolling in from 2018 onwards. Heard the tech press talk about Suzanne Plummer's Zen team heading over to RTG to help them develop RDNA 2? Extra R&D, more funding, and it's going to show in RDNA2.

I'm not sure I agree about his leadership skills. Vega was a terrible gaming card, HBM was a terrible bet in general for gaming cards, and 4GB of HBM1 was a death blow for Fury (not sure how much of this was his fault, as this was ~2 years after he returned, but he still kept talking about HBM all the time) - but he's a well-respected engineer, and maybe with unlimited funds at Intel he'll do better. However, I mostly dislike him because of marketing and his huge mouth, and he couldn't deliver on his marketing. Not saying it's all his fault; it's difficult combating both Intel and Nvidia with lower R&D funding, but marketing under his leadership was terrible.

Raja promised me and our readers that we “would be really really pleased.” We expect to see Polaris-based GPUs across the entire performance stack.

Never Happened.

Raja further expands on it, telling me that in order to make multi-GPU useful and productive for the next generation of APIs, getting multi-GPU hardware solutions in the hands of developers is crucial. He admitted that CrossFire in the past has had performance scaling concerns and compatibility issues, and that getting multi-GPU correct from the ground floor here is crucial.

Never Happened.

The naming scheme of Polaris (10, 11…) has no equation, it’s just “a sequence of numbers” and we should only expect it to increase going forward. The next Polaris chip will be bigger than 11, that’s the secret he gave us.

Never happened (Polaris 20/30 don't count, they were just Polaris 10 rehashes). IIRC there was actually a Polaris 12 that was smaller.


This is Polaris 10 and that’s Polaris 11. In terms of what we’ve done at the high level, it’s our most revolutionary jump in performance so far.

[...]

This is very early silicon, by the way. We have much more performance optimization to do in the coming months. But even in this early silicon, We’re seeing numbers versus the best class on the competition running at a heavy workload, like Star Wars—The competing system consumes 140 watts. This is 86 watts. We believe we’re several months ahead of this transition, especially for the notebook and the mainstream market. The competition is talking about chips for cars and stuff, but not the mainstream market.

Spoiler: RX 480 was less efficient than most of the Maxwell cards (significantly less efficient than the higher-end Maxwell cards). It was also not a revolutionary jump in performance. Note that "The competition" released Pascal, which was a really excellent release, and technically to this day AMD doesn't really have a card that outright beats the 1080 Ti (just one more week though). In addition, he would post stuff about Vega months ahead of its launch. For example, this picture, more than a year before Vega 64 was released.

Raja went to Intel and ppl laughed it off, saying Intel had no chance to be competitive in graphics since they were so far behind. Yet Tiger Lake with its Xe iGPU is trading blows with Zen 2 APUs in performance. In a single generation Intel managed to come back from what, 3 gens behind in graphics performance?

We'll see, but Zen2 APUs are essentially ~2 gens behind AMD's latest RDNA2 cores. Either way, the problem IMO isn't the hardware, but the software. The drivers will probably make or break Xe.

Also, still doing his marketing things:
 

Saylick

Diamond Member
Sep 10, 2012
3,532
7,859
136
Yeah, AMD just isn't the "All show, no go" enterprise that it used to be. Under Lisa, they let the engineering do the talking with little extraneous marketing fluff; nowadays, if AMD claims it, it's likely to be backed by some serious engineering. In other words, they put their money where their mouth is.

Intel seems to be more on the other end of the spectrum now. Raja loves hyping up technologies, even when they aren't ready or polished for the market, which I guess fits well with Intel's market position. If they want to slow down the bleeding, they need to do whatever it takes to convince users that there's something big looming on the horizon.