[Rumor, Tweaktown] AMD to launch next-gen Navi graphics cards at E3


nurturedhate

Golden Member
Aug 27, 2011
1,745
677
136
Thing is, it was a 280 replacement, not a 290X. They just didn't have a Hawaii replacement as it got canned. And for the 280 it's pretty much double the perf/watt.
He's not even comparing it to a 290x, that's a 270x of all things.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
When ATI was smashing Nvidia in every metric, back when the HD 5xxx series had just come out and Nvidia was still on the GTX/GTS 2xx series, the market share at most stayed 50-50. THAT is the effect of mindshare working at its fullest.
I remember that time - I got a GTX 260 216, it was really cheap, it worked great and it drove my 120hz monitor at 120hz (something the AMD cards didn't do at the time). I do admit I did occasionally take off my tinfoil hat...
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I remember that time - I got a GTX 260 216, it was really cheap, it worked great and it drove my 120hz monitor at 120hz (something the AMD cards didn't do at the time). I do admit I did occasionally take off my tinfoil hat...

Back then it required a dual-link DVI cable, which only certain cards from both makers supported. It was typically the upper-end cards from each. The HD 5800 series could do it, for instance.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
He's not even comparing it to a 290x, that's a 270x of all things.
It didn't replace the 290X (against which its perf/watt advantage is 64%), though. AMD itself compared it to the 270X (or at the very least they compared Polaris 10 to it), and the die sizes are similar (232 and 212 mm^2, respectively). It didn't replace the 280X either. It was always at that "mainstream" ~$200 price point that Pitcairn chips filled. It's the correct comparison.

It is true that it was about twice the perf/watt of the 280X, but compared to some 28nm designs, it was less, and the 270X is the most apt comparison.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136

Perf/watt should always be measured at iso-performance or iso-power. This is why we compare the RX 480's perf/watt to the R9 290X's; they both have the same performance.

https://www.techpowerup.com/reviews/AMD/RX_480/

[Image: TechPowerUp relative performance chart at 1920x1080]
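As a rough sketch of what that iso-performance arithmetic looks like (the function name and all wattage/performance figures below are illustrative placeholders, not measured values from any review):

```python
# Hedged sketch: perf/watt comparison at iso-performance.
# When two cards land on (roughly) the same performance, the perf/watt
# gain collapses to a simple ratio of board power.
# All numbers are illustrative placeholders, not measurements.

def perf_per_watt(performance: float, power_w: float) -> float:
    """Performance per watt for a single card."""
    return performance / power_w

performance_index = 100.0   # same for both cards, by construction (iso-performance)
power_old_w = 250.0         # placeholder board power for the older card
power_new_w = 160.0         # placeholder board power for the newer card

old_ppw = perf_per_watt(performance_index, power_old_w)
new_ppw = perf_per_watt(performance_index, power_new_w)

# At iso-performance the improvement reduces to power_old_w / power_new_w.
print(f"perf/watt improvement: {new_ppw / old_ppw:.2f}x")  # ~1.56x with these placeholders
```

Because the performance term cancels out, comparing cards of equal performance keeps the arithmetic clean; the whole gain is just the power ratio.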
 
  • Like
Reactions: lightmanek

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Back then it required a dual-link DVI cable, which only certain cards from both makers supported. It was typically the upper-end cards from each. The HD 5800 series could do it, for instance.
There were plenty of complaints from AMD users of 5800s at the time. In those days it was 3D Vision, an Nvidia technology, that ushered in those 120Hz displays (they were all 3D Vision compatible). AMD simply didn't design that range of cards with 120Hz in mind, so it took them a while to sort it out. Anyway, my original point was that it wasn't quite as simple as people only going Nvidia because they didn't know better.
 
Last edited:

lifeblood

Senior member
Oct 17, 2001
999
88
91
I read an article on PCGamesN saying that custom cards will follow the release of Navi by 6 weeks. Has anyone read any rumors as to whether the reference cards will be open-fan or blower designs? Although blowers have a place, that place is not inside my computer.

I wonder if the reason AMD's reference cooler is a blower is to avoid stepping on AMD's partners' toes? That way the partners are free to design their coolers the way they want without any real competition from AMD. It makes sense, but it forces early adopters to get blowers.

Having said that, given how long AMD has had to get ready for this I hope we get semi-custom cards (reference PCB with custom coolers) on day one.
 

DrMrLordX

Lifer
Apr 27, 2000
21,648
10,870
136
Radeon VII didn't use a reference blower. That being said, there was some question as to whether there would be custom Radeon VII cards. AMD never said anything more than, "well it's possible". Lo and behold, there are none.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Perf/watt should always be measured at iso-performance or iso-power. This is why we compare the RX 480's perf/watt to the R9 290X's; they both have the same performance.

https://www.techpowerup.com/reviews/AMD/RX_480/


This is an illogical way to measure a card's improvement in perf/watt, in so many ways.
1. New halo cards from Nvidia have no equal in performance from the previous generation, and therefore, based on your logic, have an infinite improvement in performance per watt.
2. The GTX 750 Ti matched the GTX 480 in performance, but it was in no way a replacement for that card to anyone with half a brain.

Similar die sizes or the same place in the family hierarchy is the most logical way to measure perf/watt improvements. The RX 480 slotted in exactly where the 270X did in relation to price, die size, and performance within the rest of its stack, so that is the most natural and logical comparison.
 

Glo.

Diamond Member
Apr 25, 2015
5,722
4,579
136
This is an illogical way to measure a card's improvement in perf/watt, in so many ways.
1. New halo cards from Nvidia have no equal in performance from the previous generation, and therefore, based on your logic, have an infinite improvement in performance per watt.
2. The GTX 750 Ti matched the GTX 480 in performance, but it was in no way a replacement for that card to anyone with half a brain.

Similar die sizes or the same place in the family hierarchy is the most logical way to measure perf/watt improvements. The RX 480 slotted in exactly where the 270X did in relation to price, die size, and performance within the rest of its stack, so that is the most natural and logical comparison.
It only makes no sense if you make assumptions about GPU performance based on performance/watt ratios.

People have said time and time again that TPU's numbers are mostly wrong, apart from the power consumption ones.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
This is an illogical way to measure a card's improvement in perf/watt, in so many ways.
1. New halo cards from Nvidia have no equal in performance from the previous generation, and therefore, based on your logic, have an infinite improvement in performance per watt.
2. The GTX 750 Ti matched the GTX 480 in performance, but it was in no way a replacement for that card to anyone with half a brain.

Similar die sizes or the same place in the family hierarchy is the most logical way to measure perf/watt improvements. The RX 480 slotted in exactly where the 270X did in relation to price, die size, and performance within the rest of its stack, so that is the most natural and logical comparison.

1. We are talking improvements of performance for a given amount of power usage. The card would be measured based on wattage. So it would be compared to a previous gen card that used the same amount of power, and then show the difference in performance. If it uses more power and has more performance, the calculation is not so simple. But by no means would it be considered infinitely better.

2. This doesn't make much sense in this conversation. Who here said we should compare cards that are 3 generations apart?

You note similar die sizes, thing is, the 270X and the RX 480 do not have the same transistor counts at all as they are on different nodes. The RX 480 has double the transistors of the 270X and is much closer to Hawaii. Which makes sense as they have near the same performance, and Hawaii production ended because the RX 480 took its place. You mention they cannot be compared because of pricing. But pricing changes, and to this day, there has never been a replacement for Hawaii. We had Polaris, which matched its performance. And then a year later we had Vega, but its pricing was way above what the 390 was priced at.

Not even sure why this has to be argued. When AMD stated twice the perf/watt, they specifically mentioned the 390, as it had the same performance envelope but Polaris used almost half the power.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
You note similar die sizes, thing is, the 270X and the RX 480 do not have the same transistor counts at all as they are on different nodes. The RX 480 has double the transistors of the 270X and is much closer to Hawaii. Which makes sense as they have near the same performance, and Hawaii production ended because the RX 480 took its place. You mention they cannot be compared because of pricing. But pricing changes, and to this day, there has never been a replacement for Hawaii. We had Polaris, which matched its performance. And then a year later we had Vega, but its pricing was way above what the 390 was priced at.

Not even sure why this has to be argued. When AMD stated twice the perf/watt, they specifically mentioned the 390, as it had the same performance envelope but Polaris used almost half the power.
AMD itself compared Polaris 10 to the 270X, from which they came up with their claimed 2.8x performance/watt increase; it has the same die size, fills the same market segment, and targets similar power consumption, so it's the right comparison. The only problem with AMD's comparison is that they used TDP; they didn't actually measure the power consumption of the card under load. It's hard to excuse that when review outlets are able to do it. Of course, this works out in their favor, because the 270X had a 180W TDP even though the card itself only consumes about 110W at full load. They said the RX 470 had a 110W TDP (later changed to 120W) and it consumes 120W. There's a whole lot wrong with this - the 270X consumes 10W less than the 470, not 70W more, and once again, AMD is the one who made this comparison. I linked the relevant slide in my post above.

As for the 390, well, the RX 480's performance/watt advantage over it is only 70%; it gets a bit better with the reference 470 (an 87% advantage), though aftermarket 470s are less efficient. That's still not double, and it's certainly not the 2.8x that AMD showed us most prominently.

I know it's kind of lame to relitigate this nearly three years later, but the fact is, Polaris was about 70% more efficient on average than the previous gen, at best about 100% (compared to the 280-class cards) and at worst about 35%; AMD's claimed 180% increase was, most charitably, an enormous stretch and, less leniently, an outright lie, and it's not something they've given up on or corrected. Their use of a 6-pin power connector on the RX 480 was also a terrible idea; they wanted a smaller connector than the 970, so they compromised the operation of their product (mildly perhaps, but it absolutely would've been more stable with an 8-pin) just so they could say that, even though it means nothing - the 970 actually consumes slightly less power than the 480.

I just don't like any of what they did - at every turn they tried to hide what Polaris actually was from us. It wasn't honest and it wasn't consumer-friendly. All they had to do was say it had 1.7x better efficiency (or they could even say 2x if they compared it to the 280X explicitly) and put an 8-pin connector on the 480 and everything would be perfectly fine; the RX 470 was actually quite great, but instead they lied to us. I'm not going to forget that in a hurry.

The closest you can get is the reference 470 vs the reference 280X - a 2.15x improvement, but still not 2.8x, not a comparison AMD made, and not the right comparison, neither by performance (that would be the 390) nor by market segment/die size/power target (the 270X).
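To make the arithmetic above concrete, here is a small hedged sketch of how the claimed ratio moves depending on whether you plug in TDP or measured board power. The power figures mirror the ones quoted above (180W TDP vs roughly 110W measured for the 270X, roughly 120W for the RX 470); the performance ratio and the helper function are placeholders of my own, since the exact figure depends on the game suite:

```python
# Hedged sketch: TDP vs measured board power in a perf/watt claim.
# Power figures follow the post above (270X: 180W TDP, ~110W measured;
# RX 470: ~120W measured). The performance ratio is a placeholder.

def efficiency_gain(perf_ratio: float, power_old_w: float, power_new_w: float) -> float:
    """Perf/watt of the newer card divided by perf/watt of the older card."""
    return perf_ratio * (power_old_w / power_new_w)

perf_ratio_470_vs_270x = 1.8  # placeholder: assume the 470 is ~1.8x faster overall

# Plugging in the old card's TDP (what a marketing slide might use):
gain_tdp = efficiency_gain(perf_ratio_470_vs_270x, power_old_w=180.0, power_new_w=120.0)

# Plugging in measured board power for both cards:
gain_measured = efficiency_gain(perf_ratio_470_vs_270x, power_old_w=110.0, power_new_w=120.0)

print(f"with 270X TDP:             {gain_tdp:.2f}x")       # 2.70x with these placeholders
print(f"with measured board power: {gain_measured:.2f}x")  # 1.65x with these placeholders
```

Same card, same performance assumption; only the power figure changes, and the headline multiplier swings from roughly 2.7x down to roughly 1.65x.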
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
1. We are talking improvements of performance for a given amount of power usage. The card would be measured based on wattage. So it would be compared to a previous gen card that used the same amount of power, and then show the difference in performance. If it uses more power and has more performance, the calculation is not so simple. But by no means would it be considered infinitely better.

2. This doesn't make much sense in this conversation. Who here said we should compare cards that are 3 generations apart?

You note similar die sizes, thing is, the 270X and the RX 480 do not have the same transistor counts at all as they are on different nodes. The RX 480 has double the transistors of the 270X and is much closer to Hawaii. Which makes sense as they have near the same performance, and Hawaii production ended because the RX 480 took its place. You mention they cannot be compared because of pricing. But pricing changes, and to this day, there has never been a replacement for Hawaii. We had Polaris, which matched its performance. And then a year later we had Vega, but its pricing was way above what the 390 was priced at.

Not even sure why this has to be argued. When AMD stated twice the perf/watt, they specifically mentioned the 390, as it had the same performance envelope but Polaris used almost half the power.

LOL 3 generations apart and only a 35% perf/w improvement. Thank you for hitting the nail on the head even more than I did.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
I think a lot of you are ignoring that in a lot of countries the 1050 Ti was actually a lot cheaper than the RX 570. We are talking about as much as a $50-70 difference. This was the main reason the 1050 Ti was so popular: it was cheaper, small, and required no additional power connector. Only in the last few months has the 570 closed the price gap, although now the competition is with the 1650, and guess what, the 1650 is also about $30 cheaper.
 

Veradun

Senior member
Jul 29, 2016
564
780
136
Vega 64 isn't faster than the 1080 in almost all games; it isn't even faster on average.
https://www.techpowerup.com/gpu-specs/radeon-rx-vega-64.c2871
That's why it's below the 1080 in this graph, and it was even worse at launch.
BUT it's certainly faster in most newer games; that's why you may get that impression.
AND moreover, custom AIB Vega 64s in newer games pretty much trade punches with a vanilla 2070. But the custom ones have better cooling so they don't throttle as much. You can still see results where the V64 gets lower fps than a 1070 Ti.
You know the TPU performance summary includes ancient games and stupid games nobody sane would test GPUs on, like Civ VI, or games nobody would ever buy, like Dragon Quest? Laughable comparison. Use something with some credibility.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
You know the TPU performance summary includes ancient games and stupid games nobody sane would test GPUs on, like Civ VI, or games nobody would ever buy, like Dragon Quest? Laughable comparison. Use something with some credibility.
They tested 20 games for the Vega 64 review; one of them was Civ 6, none of them were Dragon Quest. Incidentally, TPU tests Strange Brigade these days as well.

Also, in my opinion, Techpowerup is one of the best review outlets and websites. They test lots of games, they're very even handed with the choices, including games that do especially well on each vendor's cards, they give very useful power consumption measurements directly from the pins, temperature and noise tests, average clock frequency tests (extremely useful), and they maintain a database of pretty much every GPU full of good information about them. Their normalized perf/dollar, perf/watt and performance charts are very easy to digest and useful. They also maintain an extensive review database categorized by each card. Saying they have "no credibility" because they tested one game you don't like? Come on.

And here's another source:
[Image: ComputerBase Vega 64 performance chart]
https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/4/

Winning by 2% over the 1080 FE with the max power target (incidentally, the GTX 1080 at its max power target is 8% faster than at its normal one) is about as good as it's going to get. Even AMD knew Vega was slower than the 1080; why else would they market it as "you can't tell the difference between them"?
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,648
10,870
136
You know the TPU performance summary includes ancient games and stupid games nobody sane would test GPUs on, like Civ VI, or games nobody would ever buy, like Dragon Quest? Laughable comparison. Use something with some credibility.

Boo, I bought DQ XI for full price and got well over a hundred hours out of it. Someday I might go back and play it again on my Radeon VII. Great game for anyone who is a DQ fan.
 
  • Like
Reactions: soresu

lifeblood

Senior member
Oct 17, 2001
999
88
91
Figuring out which card replaced which card was so much easier when they had consistent naming. The HD 6750 replaced the HD 5750. Simple. Then they started giving weird names and made it much more difficult.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
Figuring out which card replaced which card was so much easier when they had consistent naming. The HD 6750 replaced the HD 5750. Simple. Then they started giving weird names and made it much more difficult.
Easiest way right now is to use price brackets. If 399/499 prices are real, then NAVI is VEGA replacement.
 
  • Like
Reactions: Ajay

lifeblood

Senior member
Oct 17, 2001
999
88
91
Easiest way right now is to use price brackets. If 399/499 prices are real, then NAVI is VEGA replacement.
I'm expecting Navi 10 to be the Vega replacement, Navi 14 to be the Polaris replacement, and Navi 20 (in 2020) to be the Radeon VII replacement. How it actually works out, we'll see...
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
Easiest way right now is to use price brackets. If 399/499 prices are real, then NAVI is VEGA replacement.
Based on your price measurement (which I agree with), the predecessors for RX 480 were:

RX 480 4GB ($199) < R9 380 ($199) < R9 270X ($199)
RX 480 8GB ($239) < R9 380X ($229) < R9 280 ($249)

These prices are MSRP, courtesy of Wikipedia, so actual street prices will vary, especially outside of the US. The RX 480 8GB prices are not a perfect match, so take it for what it's worth.

What’s absolutely clear is that by this measure Navi 10 at either $399 or $499 is absolutely not a replacement for Polaris.
 
  • Like
Reactions: Head1985

lifeblood

Senior member
Oct 17, 2001
999
88
91
I think a lot of you are ignoring that in a lot of countries the 1050 Ti was actually a lot cheaper than the RX 570. We are talking about as much as a $50-70 difference. This was the main reason the 1050 Ti was so popular: it was cheaper, small, and required no additional power connector. Only in the last few months has the 570 closed the price gap, although now the competition is with the 1650, and guess what, the 1650 is also about $30 cheaper.
Comparing the 1050ti sales to RX 570 is kinda useless. Until recently the MSRP was irrelevant as the prices were sky high due to mining and it was hard to even find an RX 570 (I know because I looked). The 1050ti was expensive but at least it was available. When I went to Japan I looked in a few electronics stores there. All the stores had a few 1050ti’s on the shelf but the spot where the 570 went was empty.
 
  • Like
Reactions: Head1985

Glo.

Diamond Member
Apr 25, 2015
5,722
4,579
136
I think a lot of you are ignoring that in a lot of countries the 1050 Ti was actually a lot cheaper than the RX 570. We are talking about as much as a $50-70 difference. This was the main reason the 1050 Ti was so popular: it was cheaper, small, and required no additional power connector. Only in the last few months has the 570 closed the price gap, although now the competition is with the 1650, and guess what, the 1650 is also about $30 cheaper.
In the majority of countries (the US, Canada, Australia, Japan, China, the whole EU, etc.), the RX 570 has been cheaper than the GTX 1050 Ti for months, at least for the last 8 months. And for those last 8 months the GTX 1050 Ti has still outsold the RX 570.
 
  • Like
Reactions: VirtualLarry

railven

Diamond Member
Mar 25, 2010
6,604
561
126
You know the TPU performance summary includes ancient games and stupid games nobody sane would test GPUs on, like Civ VI, or games nobody would ever buy, like Dragon Quest? Laughable comparison. Use something with some credibility.

I wish more sites would review ports of popular series. The console exclusivity reign is OVER! While I doubt any current PC-specific site would cater to us, I'm sure there is a market for sites to test these games. Some launch broken as hell!

Boo, I bought DQ XI for full price and got well over a hundred hours out of it. Someday I might go back and play it again on my Radeon VII. Great game for anyone who is a DQ fan.

But you're a nobody! So you don't count!
 