[Rumor, Tweaktown] AMD to launch next-gen Navi graphics cards at E3


Glo.

Diamond Member
Apr 25, 2015
Let me give you some statistics about how Overwatch behaves with different configs, settings, and presets, in the context of the RX 5500 and what AMD released today in their announcement. This is a game I can tell you a lot about, because I play it a lot, have tested it a lot, and have watched a lot of videos of experiments to see whether there are any performance patterns.

So here goes.

A GTX 1660 Ti with a Core i5-9400F at 1080p, Epic preset, averages 155 FPS; with an i7-9700F it averages 156 FPS at those settings, sometimes hitting 157 FPS. There is no difference above this with faster Intel CPUs.
With an R5 3600X it averages 140 FPS, with an R7 3800X it averages 142 FPS, and with a 2700X(!) it averages 138 FPS. All of those scores are based on data from different maps, different situations, etc. On AVERAGE, this is what you should see in Overwatch with these GPU and CPU combos.
The GTX 1660 with a 9400F averages 142 FPS in this game at the same settings and preset; with a 9700F it averages 144 FPS, sometimes hitting 145 FPS. Same situation above the 9700F: fully GPU-bound, with no difference between CPUs. With an R5 3600X it averages 130 FPS, with a 3800X 132 FPS, and with an R7 2700X 126 FPS.

Those are tests of Overwatch that my friend and I ran on those CPUs around late August. The data also comes from review sites that test Overwatch on those GPUs, like Hardware Canucks, as a point of validation (Canucks got exactly 144 FPS for the GTX 1660 and a 157 FPS average for the GTX 1660 Ti at 1080p, Epic preset, with a 9900K). All of the YouTube channels that tested those GPUs and CPUs in those configs got similar results.

If we go by this data, the 135 FPS with a 3800X that AMD got in their testing puts it between the GTX 1660 and 1660 Ti, with this GPU actually being faster (in this game), and with an Intel CPU we should expect around 15 FPS more, up to 150 FPS on average.
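To make that placement concrete, here is a minimal sketch of the interpolation, using the 3800X averages quoted above; the 135 FPS input is AMD's slide number and the 15 FPS Intel delta is this post's estimate, so treat the output as illustrative:

```python
# Where AMD's quoted RX 5500 average lands between the GTX 1660 and
# GTX 1660 Ti, using the R7 3800X averages quoted above.
gtx_1660 = 132      # FPS, GTX 1660 + 3800X (this post's testing)
gtx_1660_ti = 142   # FPS, GTX 1660 Ti + 3800X (this post's testing)
rx_5500 = 135       # FPS, RX 5500 + 3800X (AMD's slide)

# Fractional position between the two cards (0 = 1660, 1 = 1660 Ti).
position = (rx_5500 - gtx_1660) / (gtx_1660_ti - gtx_1660)
print(f"RX 5500 sits {position:.0%} of the way from the 1660 to the 1660 Ti")

# The Intel rigs ran roughly 15 FPS faster at the same settings (the
# post's estimate), so a rough guess for the RX 5500 on an Intel CPU:
intel_delta = 15
print(f"Estimated RX 5500 + Intel CPU average: ~{rx_5500 + intel_delta} FPS")
```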

What were the RAM timings? Was the 3600X in the 1660 Ti rig running PBO or stock? Where did you find the data in endnote RX-383? I thought they ran those 5500 numbers on a 3800X.
Some Japanese site claimed that the tests were done with a 3600X and a 3800X. If it was with the 3800X, whatever; the data in this post still holds.
 

Elfear

Diamond Member
May 30, 2004
Ahh, my bad, somehow I missed that.

Thanks

Edit: OK, now I'm confused. If we use average performance, why do we use the "up to" as well? It's not needed, correct?

Not sure if it's needed, but it's confusing for sure.

Agreed. They worded that very weirdly. Probably trying to cover their butts in case anyone found the average increase to be less than 37%. Much like the "up to" boost clocks. :rolleyes:
 

soresu

Diamond Member
Dec 19, 2014
Surprised no one noticed AMD's rather obvious faux pas in their slide notes:

Footnote 6:
"Testing done by AMD performance labs on August 29, 2019. Systems tested were: Radeon RX 5500 4GB with Ryzen 7 3800X. 16GB DDR4-3200MHz Win10 Pro x64 18362.175. AMD Driver Version 19.30-190812n Vs Radeon RX 480 8GB with Core i7-5960X (3.0GHz) 16GB DDR4-2666 MHz Win10 14393 AMD Driver version 16.10.1 The RX 5500 graphics card provides 1.6x performance per watt, and up to 1.7X performance per area compared to Radeon™ RX 480 graphics. PC manufacturers may vary configurations yielding different results. Actual performance may vary. RX-382"

The RX 480 was tested with a driver that came less than 6 months after the card's release (16.10.1).

Who wants to bet on how much better the 480 performs in some of those games with a modern driver?

At the very least, using such old drivers compromises a fair perf/watt comparison with the 5500.
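For what it's worth, a "1.6x performance per watt" figure is just a ratio of FPS-per-watt numbers, which makes it very sensitive to the FPS the old driver produces. A minimal sketch, with placeholder inputs that are NOT AMD's lab data:

```python
# How a "1.6x performance per watt" claim is typically computed.
# All FPS and wattage inputs are illustrative placeholders, NOT
# AMD's actual lab measurements.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

rx_5500 = perf_per_watt(fps=135.0, watts=110.0)  # hypothetical
rx_480 = perf_per_watt(fps=115.0, watts=150.0)   # hypothetical

print(f"perf/watt ratio: {rx_5500 / rx_480:.2f}x")  # ~1.60x with these inputs

# An old, unoptimised driver lowers the RX 480's FPS, which directly
# inflates this ratio - the point being made above.
```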

I want to like AMD, but stuff like this doesn't do them any favors.
 

mohit9206

Golden Member
Jul 2, 2013
Assuming it's 30% faster on average than the 1650, and considering almost everyone agrees the 1650 was very overpriced at $149, the RX 5500 needs to come in at $149 at most. Assuming the 5500 XT matches the 1660 Ti and comes with 8 GB of VRAM, and given the 1660 Ti is about $250, $250 would be an OK price for the 5500 XT, but again not really disruptive.
 

AtenRa

Lifer
Feb 2, 2009
Assuming it's 30% faster on average than the 1650, and considering almost everyone agrees the 1650 was very overpriced at $149, the RX 5500 needs to come in at $149 at most. Assuming the 5500 XT matches the 1660 Ti and comes with 8 GB of VRAM, and given the 1660 Ti is about $250, $250 would be an OK price for the 5500 XT, but again not really disruptive.

Ehm, if the 5500 is 30% faster than the 1650, then $149 would be a steal.
But since rumors suggest NVIDIA is preparing to launch the 1650 Ti, I'm guessing the 5500 and 1650 Ti will be more expensive than $149 and closer to $189-$199.
 

mohit9206

Golden Member
Jul 2, 2013
Ehm, if the 5500 is 30% faster than the 1650, then $149 would be a steal.
But since rumors suggest NVIDIA is preparing to launch the 1650 Ti, I'm guessing the 5500 and 1650 Ti will be more expensive than $149 and closer to $189-$199.

$149 won't be a steal, because $149 is a terrible price for the 1650, and everyone heavily criticised the 1650 for offering very poor value for money. It was said the 1650 should have cost $120 at most. So if the RX 5500 has 30% better performance, then $120 + 30% = $156, so $149 should be the maximum price if AMD doesn't want to receive the same treatment the 1650 got.
I don't see a shred of logic behind your $189-$199 price, as that would make the 5500 worse value for money than the 1650, effectively making it the worst AMD graphics product in recent history. $199 is probably for the 1660 Ti competitor, not the 1650 Ti one.
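For what it's worth, the arithmetic behind that argument is simple dollars-per-frame math. A minimal sketch using the post's own assumptions (the $120 "fair" 1650 price and the 30% uplift):

```python
# The value-for-money argument, using price per unit of performance.
fair_1650_price = 120.0    # the post's assumed fair price for a GTX 1650
actual_1650_price = 149.0  # the criticised launch MSRP
uplift = 1.30              # RX 5500 assumed ~30% faster on average

# Price at which the RX 5500 merely matches the "fair" 1650's value:
print(f"Value-neutral RX 5500 price: ${fair_1650_price * uplift:.0f}")  # $156

# Price at which it merely matches the ACTUAL $149 1650's value:
print(f"Break-even vs a $149 1650: ${actual_1650_price * uplift:.0f}")  # ~$194
# So at $189-$199 the 5500 would offer roughly the same dollars-per-frame
# as the 1650 that everyone called poor value - the core of the objection.
```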
 

AtenRa

Lifer
Feb 2, 2009
$149 won't be a steal, because $149 is a terrible price for the 1650, and everyone heavily criticised the 1650 for offering very poor value for money. It was said the 1650 should have cost $120 at most. So if the RX 5500 has 30% better performance, then $120 + 30% = $156, so $149 should be the maximum price if AMD doesn't want to receive the same treatment the 1650 got.
I don't see a shred of logic behind your $189-$199 price, as that would make the 5500 worse value for money than the 1650, effectively making it the worst AMD graphics product in recent history. $199 is probably for the 1660 Ti competitor, not the 1650 Ti one.

I'm with you that the GTX 1650's $149 MSRP is way too high, but:

AMD will price its cards against the competition, and if the GTX 1650 is currently at $149 and the RX 5500 is 30% faster, then there is no way in hell AMD will launch the RX 5500 at a $149 MSRP.
Perhaps the $189-$199 prices are a bit off, but it all depends on the GTX 1650 Ti's MSRP; if NVIDIA launches it at $189, then AMD will price the RX 5500 closer to $169.

Edit: The only way for the RX 5500 to launch at $149 will be if the GTX 1650 Ti launches at $159 and the GTX 1650 gets a price cut down to $129.
 

mohit9206

Golden Member
Jul 2, 2013
I'm with you that the GTX 1650's $149 MSRP is way too high, but:

AMD will price its cards against the competition, and if the GTX 1650 is currently at $149 and the RX 5500 is 30% faster, then there is no way in hell AMD will launch the RX 5500 at a $149 MSRP.
Perhaps the $189-$199 prices are a bit off, but it all depends on the GTX 1650 Ti's MSRP; if NVIDIA launches it at $189, then AMD will price the RX 5500 closer to $169.

Edit: The only way for the RX 5500 to launch at $149 will be if the GTX 1650 Ti launches at $159 and the GTX 1650 gets a price cut down to $129.
That's the problem right now in the graphics card industry: AMD is not willing to do the same thing they did with Intel, and consumers suffer as a result.
 

Trumpstyle

Member
Jul 18, 2015
Okay, now that Navi 14 is out we can make very accurate guesses about the new cards. This is my list:

Navi APU = 12 CU, 7nm (not 7nm+)

Navi 10 = 40 CU, 7nm
Navi 14 = 24 CU, 7nm
Navi 12 = 80 CU, HBM2e (there will be no GDDR6 cards)

Navi 21 = 56 CU, 7nm+ RDNA2
Navi 23 = 112 CU, 7nm+ RDNA2

Everything points to Navi 12 being 80 CUs and not the 56 CUs I first thought; we know about the rumored GeForce RTX 2080 Ti SUPER, which would counter Navi 12.

BONUS (arithmetic sketched below):
PS5: 44 CU, 1.8 GHz (10.1 TFLOPS), 16 GB VRAM, 512-576 GB/s memory bandwidth (12 GB VRAM for games, 4 GB for the OS)
Xbox next-gen: 44 CU, 1.8 GHz (10.1 TFLOPS), 14 GB VRAM, 320-bit bus, 560 GB/s memory bandwidth (10 GB VRAM for games, 4 GB for the OS)
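As a sanity check, the TFLOPS and bandwidth figures in that list follow from the standard formulas sketched below; the CU counts, clocks, and the implied 14 Gbps GDDR6 data rate are the post's speculation, not confirmed specs:

```python
# Sanity-check the speculated console specs with the standard formulas.
# CU counts, clocks, and the 14 Gbps GDDR6 rate are guesses, not specs.

def fp32_tflops(cus: int, clock_ghz: float, shaders_per_cu: int = 64) -> float:
    # FP32 TFLOPS = shader count * 2 ops per clock (FMA) * clock
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    # GB/s = bus width in bytes * per-pin data rate
    return bus_bits / 8 * data_rate_gbps

print(f"44 CU @ 1.8 GHz: {fp32_tflops(44, 1.8):.1f} TFLOPS")     # ~10.1
print(f"320-bit @ 14 Gbps: {bandwidth_gb_s(320, 14):.0f} GB/s")  # 560
```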
 


Glo.

Diamond Member
Apr 25, 2015
Surprised no one noticed AMD's rather obvious faux pas in their slide notes:

Footnote 6:
"Testing done by AMD performance labs on August 29, 2019. Systems tested were: Radeon RX 5500 4GB with Ryzen 7 3800X. 16GB DDR4-3200MHz Win10 Pro x64 18362.175. AMD Driver Version 19.30-190812n Vs Radeon RX 480 8GB with Core i7-5960X (3.0GHz) 16GB DDR4-2666 MHz Win10 14393 AMD Driver version 16.10.1 The RX 5500 graphics card provides 1.6x performance per watt, and up to 1.7X performance per area compared to Radeon™ RX 480 graphics. PC manufacturers may vary configurations yielding different results. Actual performance may vary. RX-382"

The RX 480 was tested with a driver that came less than 6 months after the card's release (16.10.1).

Who wants to bet on how much better the 480 performs in some of those games with a modern driver?

At the very least, using such old drivers compromises a fair perf/watt comparison with the 5500.

I want to like AMD, but stuff like this doesn't do them any favors.
They just pulled their OLD results, nothing more. Why do you care about the RX 480 when they compared the RX 5500 to the GTX 1650 and showed the actual performance of the Nvidia GPU?
 

Glo.

Diamond Member
Apr 25, 2015
Assuming it's 30% faster on average than the 1650, and considering almost everyone agrees the 1650 was very overpriced at $149, the RX 5500 needs to come in at $149 at most. Assuming the 5500 XT matches the 1660 Ti and comes with 8 GB of VRAM, and given the 1660 Ti is about $250, $250 would be an OK price for the 5500 XT, but again not really disruptive.
So you want a GPU that performs between the GTX 1660 and 1660 Ti to cost $150, a third less than its competition.

Where is the business logic in that?

P.S. I would also love for it to cost $150, but let's be REALISTIC...
 

soresu

Diamond Member
Dec 19, 2014
They just pulled their OLD results, nothing more. Why do you care about the RX 480 when they compared the RX 5500 to the GTX 1650 and showed the actual performance of the Nvidia GPU?
Because it's laughably dishonest to make a direct perf/watt comparison to a 3 1/4-year-old card using such old drivers, especially if the games they compare are more recent, i.e. not optimised for in 16.10.1.

They did not make the blatant perf/watt comparison with the 1650 on the slides; they made it with the 480. That is the issue in play here.

This is coming from someone who has bought only AMD since 2008. I want them to be worthy, and it looks like they are trying, but acting like this does their credibility no favors.

On top of such badly advertised TDPs, I'm giving this gen a miss. All I want is a card that blows the crap out of a 480 but needs no more than 75W (i.e. no power connector) to do so. Is that so much to ask after over 3 frickin' years and an entire process shrink (or more)?

Even a direct shrink of Polaris 10 should have yielded greater efficiency than this, both in area and power consumption.

I simply don't understand why their raw perf/MHz/mm² sucks so badly; it's like they have been going continually backwards in area utilisation since GCN3 rather than shrinking it.

Not everyone is looking for cards that are volted to insanity out of the box. I want something that is efficient off the shelf; my 7870 did that.
 

Dribble

Platinum Member
Aug 9, 2005
They just pulled their OLD results, nothing more. Why do you care about the RX 480 when they compared the RX 5500 to the GTX 1650 and showed the actual performance of the Nvidia GPU?
If you want to make AMD better, you've got to call them out on stupid moves, not defend them. Perhaps then they'll learn and do a better job next time. Surely they have a 480 lying around; it would have taken very little effort to swap the cards in the PC and rerun the tests.
 

Glo.

Diamond Member
Apr 25, 2015
If you want to make AMD better, you've got to call them out on stupid moves, not defend them. Perhaps then they'll learn and do a better job next time. Surely they have a 480 lying around; it would have taken very little effort to swap the cards in the PC and rerun the tests.
You believe they also have a 5th-gen HEDT CPU lying around?

They just pulled old numbers for the RX 480.
 

Bouowmx

Golden Member
Nov 13, 2016
When is the Radeon RX 5500 desktop series coming out? I got lost in all the bad communication.
 

AtenRa

Lifer
Feb 2, 2009
Even a direct shrink of Polaris 10 should have yielded greater efficiency than this, both in area and power consumption.

Because they are trying to squeeze all the juice they can out of each chip. In this case they are trying to get more performance from a cut-down die (Navi 14).
Personally, I believe this approach is a mistake. They would make less money by launching a more efficient product with less overall performance, but they would make huge gains in user mind share.
 

AtenRa

Lifer
Feb 2, 2009
When is the Radeon RX 5500 desktop series coming out? I got lost in all the bad communication.

All we know today is Q4 2019. But if NVIDIA launches the GTX 1650 Ti at the end of the month, I'm guessing AMD will launch the RX 5500 very close to that.
 

Glo.

Diamond Member
Apr 25, 2015
When is the Radeon RX 5500 desktop series coming out? I got lost in all the bad communication.
All we know today is Q4 2019. But if NVIDIA launches the GTX 1650 Ti at the end of the month, I'm guessing AMD will launch the RX 5500 very close to that.
OEM computers with the RX 5500 are coming in November 2019. As for the desktop cards, nothing has been announced yet.
 

Glo.

Diamond Member
Apr 25, 2015
Even a direct shrink of Polaris 10 should have yielded greater efficiency than this, both in area and power consumption.
I would not go that far. Based on what we know about the 7nm process, it should yield 2x the density of the 14nm process.

And yet, AMD was able to achieve ONLY 60% better density. Yes, Apple was able to squeeze very good density out of the low-power version of the process. But anything high-performance/high-power makes this process fall off a cliff in every single metric: density, performance, power consumption.
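To put numbers on that density claim, here is a minimal check; the die sizes and transistor counts below are the commonly cited figures for the two chips, so treat them as approximate:

```python
# Rough logic-density comparison: 14nm Polaris 10 vs 7nm Navi 14.
# Die sizes and transistor counts are commonly cited approximations.
polaris_10 = {"transistors_b": 5.7, "area_mm2": 232}  # RX 480, 14nm
navi_14 = {"transistors_b": 6.4, "area_mm2": 158}     # RX 5500, 7nm

def density_mtr_mm2(chip: dict) -> float:
    # Million transistors per square millimetre.
    return chip["transistors_b"] * 1000 / chip["area_mm2"]

gain = density_mtr_mm2(navi_14) / density_mtr_mm2(polaris_10)
print(f"Polaris 10: {density_mtr_mm2(polaris_10):.1f} MTr/mm^2")  # ~24.6
print(f"Navi 14:    {density_mtr_mm2(navi_14):.1f} MTr/mm^2")     # ~40.5
print(f"Density gain: {gain:.2f}x")  # ~1.65x, i.e. ~60-65%, not the full 2x
```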

EUV is supposed to be better on this front. Much better, but that remains to be confirmed, as always.
 

Stuka87

Diamond Member
Dec 10, 2010
Has anybody actually seen anything mentioned regarding AIB cards? All I have seen are references to OEM systems, none that are standalone cards.
 

soresu

Diamond Member
Dec 19, 2014
with less overall performance
I think that is a relative term, to be honest. I'd rather have max perf/watt, and AMD seems to have all but abandoned giving that to people off the shelf.

It's disappointing that they won't just make an SKU for perf/watt and leave the max-juice gamer cards for everyone else, like a 5500E to match their 2700E CPU.
 

amrnuke

Golden Member
Apr 24, 2019
You believe they also have a 5th-gen HEDT CPU lying around?
No, but they probably have an RX 480 they could have tossed into the 5500 system. Why are you making it sound like it would have been hard to compare an RX 480 to a 5500 on the same system?

That being said....

They just pulled old numbers for the RX 480.
Why are they even bothering to compare the 5500 to the RX 480 instead of an RX 570 or RX 580 for this particular test, performance per watt? What are they thinking? I don't think ANYONE cares how this card performs against a card that is 3 years (and now 3 generations) old.

They could easily have run an honest test against a two-generation-old (instead of three-generation-old) GPU by slapping an RX 570 into the same system they tested the 5500 in, or against a one-generation-old GPU by running the performance-per-watt comparison against a Vega.

Including an RX 480 just doesn't make any sense.
 