[Rumor, Tweaktown] AMD to launch next-gen Navi graphics cards at E3


amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Seems like they will far exceed once you see games developed mainly for the new consoles and not the old ones, which is probably about a one-year delay from the new console launches. At least if the 9.2 TFLOPS rumor for the PS5 is true. That's effectively a rumored 5700 XT, plus some hardware ray tracing support, which they have officially promised. Kind of sounds like the closest equivalent to the next-gen consoles is going to be the RTX 2070 Super, since it's close to RX 5700 XT performance and has hardware RT (though likely far superior hardware RT to what the PS5 GPU will have). That's a $500 card, ouch.

I think that's largely because games are designed for consoles with quite weak hardware. Sony was very conservative with their hardware choices after how badly the PS3 bombed early in its generation. But it sounds like they're aiming to put pretty good quality hardware into the next-gen system.

I kind of fear RT's usage in the next-gen consoles, which could make even the RX 5700 obsolete. And an RX 5700, even discounting RT, isn't as powerful as what's rumored to go into the PS5. Just feels like a really bad time to be buying a GPU.
A 2070S cannot run games with RT at any reasonable frame rate at 1440p. Ray tracing, or lack thereof, on the PC side does not make a graphics card obsolete any more than the color palette it can produce. Just because a console has a powerful GPU in it (perhaps 5700 XT equivalent) does not mean that a 5700 is obsolete. You will still be able to game at 1440p on a 5700 at 60+ fps for quite some time. That distinctly makes it not obsolete. You will likely still be able to game at 1080p with 40-60 fps on an RX 570 as well. Further, since AMD is doing the GPUs in the PS5/XSX, we could possibly expect better driver development/optimizations on the consoles to carry over to an extent to the GPU side of things.

I don't understand this idea that if a GPU isn't as powerful as the one going into the consoles, then suddenly it's going to be obsolete. Far from it. It runs the games it runs just fine. It doesn't have RT, but I'd argue that outside of the 2080, neither does any Nvidia card.
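
As a sanity check on the TFLOPS comparison quoted above, a rough back-of-the-envelope sketch (the CU count and clocks are assumed/rumored figures, not confirmed specs):

```python
# Back-of-the-envelope FP32 throughput: CUs * 64 shaders/CU * 2 FLOPs (FMA) * clock.
# The CU count and clocks below are assumptions/rumors, not confirmed specs.
def fp32_tflops(cus, clock_ghz, shaders_per_cu=64):
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

print(round(fp32_tflops(40, 1.80), 2))   # 9.22  -> the rumored PS5 figure
print(round(fp32_tflops(40, 1.905), 2))  # 9.75  -> RX 5700 XT at its boost clock
```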
 

SteveGrabowski

Diamond Member
Oct 20, 2014
9,348
8,029
136
A 2070S cannot run games with RT at any reasonable frame rate at 1440p. Ray tracing, or lack thereof, on the PC side does not make a graphics card obsolete any more than the color palette it can produce. Just because a console has a powerful GPU in it (perhaps 5700 XT equivalent) does not mean that a 5700 is obsolete. You will still be able to game at 1440p on a 5700 at 60+ fps for quite some time. That distinctly makes it not obsolete. You will likely still be able to game at 1080p with 40-60 fps on an RX 570 as well. Further, since AMD is doing the GPUs in the PS5/XSX, we could possibly expect better driver development/optimizations on the consoles to carry over to an extent to the GPU side of things.

I don't understand this idea that if a GPU isn't as powerful as the one going into the consoles, then suddenly it's going to be obsolete. Far from it. It runs the games it runs just fine. It doesn't have RT, but I'd argue that outside of the 2080, neither does any Nvidia card.

I would call 40-60 fps at 1080p struggling heavily. Being around 45 fps is one of the worst no man's lands you can be in, since when you turn vsync on you alternate between one frame being displayed for two refreshes, the next for one, the next for two, the next for one, and so on on a 60 Hz panel. I can't see how you'd buy an RX 570 in 2020 and expect it to keep up for two years. It can't even do it now at 1080p. AC Odyssey runs pretty horribly on an RX 570, so I don't see much potential in it with next-gen titles. And $350 for an RX 5700 is a really hard sell if it's weaker than the GPU the devs are all targeting first and foremost in the PS5. I'll pass.
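
To see why that cadence feels so bad, here's a quick sketch (a toy model that ignores buffering back-pressure, so treat the exact pattern as illustrative):

```python
import math

# Vsync frame pacing sketch: a steady 45 fps render rate on a 60 Hz panel.
# Everything is in 1/180 s ticks so the math stays integer (refresh = 3, frame = 4).
REFRESH, FRAMETIME = 3, 4

finish_times  = [i * FRAMETIME for i in range(12)]                     # frame ready
scanout_times = [math.ceil(t / REFRESH) * REFRESH for t in finish_times]
holds = [(b - a) // REFRESH for a, b in zip(scanout_times, scanout_times[1:])]
print(holds)  # [2, 1, 1, 2, 1, 1, ...] -> an uneven mix of 1- and 2-refresh holds = judder
```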

Also, considering how lousy PC ports often are, I do think a GPU that can brute-force past the consoles is the only thing that makes much sense for PC gaming. Maybe the RX 5700 XT class GPU in the PS5 is BS (as it was when it was claimed the PS4 would have a 7870 XT level GPU, which turned out to be 7850 level), but if the rumor is true it makes PC gaming a really hard sell given how overpriced GPUs are right now.
 
  • Like
Reactions: CHADBOGA

jpiniero

Lifer
Oct 1, 2010
17,153
7,535
136
I'd even go as far as to suggest that developers might delay or not release titles on PC because only cards with HW RT will be able to hit a playable framerate at all. That's one of the questions I have: will developers go all-in on RT to the point where you can't turn it off in a theoretical PC port?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Also, considering how lousy PC ports often are, I do think a GPU that can brute-force past the consoles is the only thing that makes much sense for PC gaming. Maybe the RX 5700 XT class GPU in the PS5 is BS (as it was when it was claimed the PS4 would have a 7870 XT level GPU, which turned out to be 7850 level), but if the rumor is true it makes PC gaming a really hard sell given how overpriced GPUs are right now.

I feel like this is one of the biggest fails of the "AMD is in all consoles, so ports will be better!" (bonus: "especially for AMD users") narrative, in my opinion.

I'm grateful the console-exclusive era is over, but my god how terrible some of the ports have been. The worst offenders were almost broken for AMD users.

I'd even go as far as to suggest that developers might delay or not release titles on PC because only cards with HW RT will be able to hit a playable framerate at all. That's one of the questions I have: will developers go all-in on RT to the point where you can't turn it off in a theoretical PC port?

They'd just have on/off options, since not all consoles will have RT HW and these games will still support the Xbox One/PS4 generation for at least the first half of Gen 9.
 

soresu

Diamond Member
Dec 19, 2014
4,244
3,748
136
I feel like this is one of the biggest fails of the "AMD is in all consoles, so ports will be better!" (bonus: "especially for AMD users") narrative, in my opinion.
Anyone who believed this to be true hadn't considered the depths of nVidia's pockets.

Owing to the financial effects of acquiring ATI, and the Bulldozer mess, those pockets were far deeper than AMD could possibly hope to match when it came to dev support and influence.

This generation could turn out to be quite interesting by comparison - it is unlikely they will ever reach nVidia's financial level now, but getting big enough to compete for dev attention doesn't seem a big stretch to me, so long as they continue an upward trajectory with hardware execution in Zen and RDNA.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Realtime raytracing changed the game. nVidia forced Microsoft and Sony to adopt hardware support instead of producing just a hardware upscaler. At the same time, more and more (indie) developers will use raytracing on the PC to increase quality and reduce production costs/time.

With raytracing comes the need for DL upscaling and denoising. Another area where nVidia is leading the industry with their hardware and software.

Basically we are at the iPhone 2007 timeline.
 

uzzi38

Platinum Member
Oct 16, 2019
2,747
6,657
146
Realtime raytracing changed the game. nVidia forced Microsoft and Sony to adopt hardware support instead of producing just a hardware upscaler. At the same time, more and more (indie) developers will use raytracing on the PC to increase quality and reduce production costs/time.

With raytracing comes the need for DL upscaling and denoising. Another area where nVidia is leading the industry with their hardware and software.

Basically we are at the iPhone 2007 timeline.

No, we are not. Do you honestly believe that without Nvidia bringing RTX, the consoles would not have had support for raytracing?

Sorry, let me rephrase: do you think that without Nvidia - who claim to have spent years preparing raytracing technology to get it into a state they thought was passable for the general market - Sony and Microsoft, in partnership with AMD, a company on the brink of financial ruin just a couple of years ago, would have been able to completely switch over their plans for the upcoming consoles to utilise raytracing?

Can I ask what you're smoking? Please?
If Nvidia were the main driving force behind raytracing in the market, the upcoming consoles would never have been able to provide HW RTRT. They simply wouldn't have had the time and resources funnelled in to get the tech working in time. And that's assuming AMD is a company with the same sort of scale and R&D capabilities as Nvidia. Simply put: had it not already been the plan years ago, it would be impossible for the upcoming consoles to do HW RTRT.

But then again, if you believe that, you probably also believe Jensen when he says a 2080 Max-Q will be more powerful than upcoming consoles, which is an utter joke.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
No, we are not. Do you honestly believe that without Nvidia bringing RTX, the consoles would not have had support for raytracing?

Sorry, let me rephrase: do you think that without Nvidia - who claim to have spent years preparing raytracing technology to get it into a state they thought was passable for the general market - Sony and Microsoft, in partnership with AMD, a company on the brink of financial ruin just a couple of years ago, would have been able to completely switch over their plans for the upcoming consoles to utilise raytracing?

Can I ask what you're smoking? Please?
If Nvidia were the main driving force behind raytracing in the market, the upcoming consoles would never have been able to provide HW RTRT. They simply wouldn't have had the time and resources funnelled in to get the tech working in time. And that's assuming AMD is a company with the same sort of scale and R&D capabilities as Nvidia. Simply put: had it not already been the plan years ago, it would be impossible for the upcoming consoles to do HW RTRT.

But then again, if you believe that, you probably also believe Jensen when he says a 2080 Max-Q will be more powerful than upcoming consoles, which is an utter joke.

I'm looking at it more like CUDA with a hint of G-Sync, from a marketing perspective.

CUDA in the sense that NV put forward their own hardware-backed specification first. As far as I know DXR is hardware agnostic. However, by getting RTX out basically around the same time the standard (DXR) was announced, NV gave themselves a 1-2 year lead over competitors whose versions would use the more standard spec (i.e. variable refresh existed long before G-Sync was a product being marketed, but most people knew it as G-Sync before FreeSync/Adaptive-Sync). By getting developers to work with RTX, I'm betting NV is hoping for a CUDA effect where devs stick to using NV's RTX extensions, which wouldn't affect console games but, like we've seen for years with GameWorks, would throw huge monkey wrenches into AMD's product performance.

Just NV playing their usual games, and it's clearly working.

EDIT: typos galore!
 
  • Like
Reactions: GodisanAtheist

soresu

Diamond Member
Dec 19, 2014
4,244
3,748
136
No, we are not. Do you honestly believe that without Nvidia bringing RTX, the consoles would not have had support for raytracing?

Sorry, let me rephrase: do you think that without Nvidia - who claim to have spent years preparing raytracing technology to get it into a state they thought was passable for the general market - Sony and Microsoft, in partnership with AMD, a company on the brink of financial ruin just a couple of years ago, would have been able to completely switch over their plans for the upcoming consoles to utilise raytracing?

Can I ask what you're smoking? Please?
If Nvidia were the main driving force behind raytracing in the market, the upcoming consoles would never have been able to provide HW RTRT. They simply wouldn't have had the time and resources funnelled in to get the tech working in time. And that's assuming AMD is a company with the same sort of scale and R&D capabilities as Nvidia. Simply put: had it not already been the plan years ago, it would be impossible for the upcoming consoles to do HW RTRT.

But then again, if you believe that, you probably also believe Jensen when he says a 2080 Max-Q will be more powerful than upcoming consoles, which is an utter joke.
This 100%.

The body of academic work on RT acceleration structures and hardware design is pretty extensive, going back years, and nVidia employees are certainly not the only authors, nor are they even credited on all the papers.
 

kapulek

Member
Oct 16, 2010
56
33
91
I like silly season. N21 is rumored to have 80 CUs at ~505 mm², with a Computex announcement/launch. 100 CUs would be N23? Those clocks... Truckload of salt needed.

[attached image]
 

soresu

Diamond Member
Dec 19, 2014
4,244
3,748
136
but like we've seen for years with GameWorks throw huge monkey wrenches into AMD's product performance.
Arguably it doesn't exactly do nVidia performance a favor in many cases either, from what I've seen of friends with nVidia systems.

It simply isn't very well optimised in many cases, and is likely often added to games for sponsorship money from nVidia rather than out of any avid desire of the game devs themselves for those features.

This is why the likes of GPUOpen is the best way to do things - examples given, open code, everyone benefits and can even improve on it with their own modifications.
 

soresu

Diamond Member
Dec 19, 2014
4,244
3,748
136
I like silly season. N21 is rumored to have 80 CUs at ~505 mm², with a Computex announcement/launch. 100 CUs would be N23? Those clocks... Truckload of salt needed.
More likely an 80 CU die would be the max at 505 mm², and the other would be a smaller GPU, perhaps 52-56 CUs?

Assuming I'm right, it seems unlikely the smaller GPU will have the same or fewer CUs than Navi 10.

The 80 CU part will likely have lower clocks than the 5700 XT, unless RDNA2 has dramatically improved MHz/watt efficiency over RDNA1.

Edit: unless AMD have pulled a fast one on us and already made an MCM GPU; it seems extremely unlikely, but not impossible.

Adding another 10 CUs to Navi 10 (to 50 CUs) would put it close to 250 mm² at 7nm+; double that for a two-die, 100 CU MCM and you're at roughly 505 mm².

Then you disable 10 CUs on each die to cover non-ideal yields and get the 80 CU SKU.
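
A quick sketch of that area math (Navi 10's ~251 mm² for 40 CUs is real; the N7+ shrink factor and the all-area-scales-with-CUs assumption are guesses):

```python
# Back-of-the-envelope area math for the speculation above. Navi 10 is ~251 mm2
# for 40 CUs on N7; the ~15% shrink assumed for N7+ is a guess, and treating the
# whole die as CU-scaled area is deliberately crude.
navi10_area_mm2, navi10_cus = 251.0, 40
area_per_cu = navi10_area_mm2 / navi10_cus
shrink = 0.85                                   # assumed N7 -> N7+ density gain

die_50cu = 50 * area_per_cu * shrink            # hypothetical 50 CU chiplet
mcm_100cu = 2 * die_50cu                        # two chiplets, 100 CUs total
print(round(die_50cu), round(mcm_100cu))        # ~267 and ~533 mm2, same ballpark
                                                # as the rumored ~505 mm2
```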
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136

uzzi38

Platinum Member
Oct 16, 2019
2,747
6,657
146
  • Like
Reactions: soresu and Glo.

NostaSeronx

Diamond Member
Sep 18, 2011
3,815
1,294
136
384-bit is the limit of GDDR6. If one wants more bandwidth, they'll want to go 4096-bit HBM2E (each stack: 1024-bit) or 8192-bit HBM3 (each stack: 2048-bit).
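
For context, peak bandwidth is just bus width times per-pin data rate; a rough comparison with assumed (not confirmed) data rates:

```python
# Peak memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps) -> GB/s.
# The data rates below are assumptions for illustration, not confirmed specs.
def peak_bw_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

print(peak_bw_gb_s(384, 14.0))    # 672.0  GB/s -- 384-bit GDDR6 @ 14 Gbps
print(peak_bw_gb_s(4096, 2.8))    # 1433.6 GB/s -- 4-stack HBM2E @ 2.8 Gbps
print(peak_bw_gb_s(8192, 3.2))    # 3276.8 GB/s -- 4-stack HBM3 @ 3.2 Gbps (guess)
```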
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Arguably it doesn't exactly do nVidia performance a favor in many cases either, from what I've seen of friends with nVidia systems.

It simply isn't very well optimised in many cases, and is likely often added to games for sponsorship money from nVidia rather than out of any avid desire of the game devs themselves for those features.

This is why the likes of GPUOpen is the best way to do things - examples given, open code, everyone benefits and can even improve on it with their own modifications.

Without a doubt. I've been in the NV ecosystem now for about 4 years, but truth be told, some of those effects sold me on the G-Sync limitation AND ray tracing.

That is marketing at its finest. Thanks to G-Sync I have no issues with RTX at <60 FPS @ 3440x1440 in some games.

The sad thing is how much goodwill AMD is burning through with all of this. At least with NV we know it's the 'Empire' and we're gonna get reamed. AMD is one good stone away from being David to NV's Goliath, but that stone keeps hitting their foot :/
 

soresu

Diamond Member
Dec 19, 2014
4,244
3,748
136
The sad thing is how much goodwill AMD is burning through with all of this. At least with NV we know it's the 'Empire' and we're gonna get reamed. AMD is one good stone away from being David to NV's Goliath, but that stone keeps hitting their foot :/
I'd say they already hit that mark against Intel, albeit with the help of Intel's process tech foibles.

The R&D spending and engineering focus was only more recently reallocated to GPU efforts after the initial Zen push, so we may just be seeing the train warming up in the station rather than roaring down the track for now.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
I'm looking at it more like CUDA with a hint of G-Sync, from a marketing perspective.

CUDA in the sense that NV put forward their own hardware-backed specification first. As far as I know DXR is hardware agnostic. However, by getting RTX out basically around the same time the standard (DXR) was announced, NV gave themselves a 1-2 year lead over competitors whose versions would use the more standard spec (i.e. variable refresh existed long before G-Sync was a product being marketed, but most people knew it as G-Sync before FreeSync/Adaptive-Sync). By getting developers to work with RTX, I'm betting NV is hoping for a CUDA effect where devs stick to using NV's RTX extensions, which wouldn't affect console games but, like we've seen for years with GameWorks, would throw huge monkey wrenches into AMD's product performance.

Just NV playing their usual games, and it's clearly working.

EDIT: typos galore!

It's way too early to argue that Nvidia's implementation of raytracing will become the standard the way CUDA became the standard for GPGPU programming. CUDA went for years without an effective response, unlike raytracing, which will almost certainly see a competing implementation in less than 2 years after Nvidia's introduction ...

Also, each vendor likely has a very different HW implementation of raytracing under the hood, so don't get the idea that either AMD or Intel will follow the DXR specification as-is without any of their own unique traits. Based on AMD's patent alone, they do hardware-accelerated intersection testing in the TMUs and handle the BVH traversal in the shaders. Intel looks to be seriously pushing the idea of programmable traversal shaders in one of their research papers, so it's very likely that they're customizing their shader/ISA design to handle this as efficiently as possible. As for Nvidia, from extensive analysis their RT cores do fixed-function BVH traversal while the ray intersection tests are done on the tensor/shader cores ...

Intel's possible implementation and NV's current implementation look to be polar opposites of each other, with wildly different performance characteristics depending on the workload design ...

Each vendor's design comes with its own set of strengths/limitations, and by no means is Nvidia's implementation impervious, as you seem to imply. With AMD's RT implementation, doing customized intersection-test shader programs would be a very bad idea since they couldn't use the intersection units built into the TMUs. Intel's traversal shader concept is good for reducing bandwidth consumption, but there is some overhead involved with the additionally generated shader invocations. With Nvidia, ray traversal is totally fixed-function, so traversal shaders would end up being a bad idea on their hardware since they'd have to emulate this without being able to use their RT cores ...

Come the DXR 2.0 specification, if Microsoft chooses to standardize Intel's traversal shaders it could very well end badly for Nvidia, because then they'd be forced to significantly rearchitect their HW design for ray tracing compared to Turing, or face a huge performance cliff if, god forbid, developers decide to write custom ray traversal shader programs ...
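
To make the traversal-vs-intersection split a bit more concrete, here's a toy sketch (a 1D stand-in for a real BVH; the node layout and helper names are made up for illustration, not any vendor's actual hardware or API):

```python
# Toy model of the split being discussed: the traversal loop runs as "shader" code,
# while the intersection test is factored out as the piece a TMU / RT core would
# accelerate. 1D intervals stand in for bounding boxes to keep it short.
from dataclasses import dataclass, field

@dataclass
class Node:
    lo: float
    hi: float                                 # interval "bounding box"
    children: list = field(default_factory=list)
    prim: float = None                        # leaf primitive: a point on the line

def intersect_box(x, node):
    """Stand-in for the fixed-function box intersection test."""
    return node.lo <= x <= node.hi

def traverse(x, root):
    """The traversal loop itself -- the part a 'traversal shader' would make programmable."""
    stack, hits = [root], []
    while stack:
        node = stack.pop()
        if not intersect_box(x, node):        # hardware-accelerated test
            continue
        if node.prim is not None:             # leaf: record the hit
            hits.append(node.prim)
        stack.extend(node.children)           # interior: push children, keep going
    return hits

leaf_a, leaf_b = Node(0, 1, prim=0.5), Node(2, 3, prim=2.5)
print(traverse(0.5, Node(0, 3, [leaf_a, leaf_b])))   # -> [0.5]
```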
 

beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
I really hope Jensen is watching and they will cut the crap with Ampere. I want 10 TFLOPS and 12GB at $400, or they will get the D! Ideally I would like the same from big Navi as well.

So true. NV could make any AMD card invalid with Ampere. I mean, AMD can barely manage to keep up in performance/$ even with a node advantage. Once NV is on 7nm(+), they could force AMD into making extreme price cuts.
 

uzzi38

Platinum Member
Oct 16, 2019
2,747
6,657
146
So true. NV could make any AMD card invalid with Ampere. I mean, AMD can barely manage to keep up in performance/$ even with a node advantage. Once NV is on 7nm(+), they could force AMD into making extreme price cuts.
By the time Nvidia can, AMD will be prepping 5nm RDNA3.

The name of the game for AMD is to stay a node ahead, and thankfully they can do so now.

Thank you roadmap slip-ups.
 
  • Like
Reactions: Glo.

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Realtime raytracing changed the game. nVidia forced Microsoft and Sony to adopt hardware support instead of producing just a hardware upscaler. At the same time, more and more (indie) developers will use raytracing on the PC to increase quality and reduce production costs/time.

With raytracing comes the need for DL upscaling and denoising. Another area where nVidia is leading the industry with their hardware and software.

Basically we are at the iPhone 2007 timeline.
I'm not gonna ruin a wonderful Sunday by replying to ignorance. Instead I'll just choose to ignore you.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
But then again, if you believe that, you probably also believe Jensen when he says a 2080 Max-Q will be more powerful than upcoming consoles, which is an utter joke.

Jensen knows his audience. Hence the 'Pascal 10x Maxwell ZOMG!!!!!!!!!111'.

Short. Simple. Strong. Triumphant.

I'm guessing he and Trump attended the same marketing class before aspiring to the presidency - albeit in different industries :)
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
So true. NV could make any AMD card invalid with Ampere.
At least insert a 'probably' or an 'I think', please.

Otherwise I must ask: how the fudge do you propose to know that? (And by that I don't mean how the fudge do you guess.)
 