Polaris 10 and 11 confirmed to be GDDR5 based

Page 3

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
More efficiency means that when it comes time to stuff a chip full of transistors until it draws 300W, they can stuff more transistors in there. Power and performance are two sides of the same coin.

Very well said. Perf/watt is the determining factor for how much performance you can fit into a high-end flagship GPU, which typically draws 250-300W. Perf/mm² determines the die size required to deliver that performance. The goal of any GPU architecture should be to maximize both perf/watt and perf/mm².
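The arithmetic behind that claim can be sketched with toy numbers (all figures below are invented for illustration, not real GPU specs):

```python
# Illustrative only: toy numbers, not real GPU specifications.

def flagship_perf(perf_per_watt, power_budget_w):
    """Peak performance achievable inside a fixed board power budget."""
    return perf_per_watt * power_budget_w

def die_size_needed(target_perf, perf_per_mm2):
    """Die area required to deliver a given performance level."""
    return target_perf / perf_per_mm2

# Example: 40 GFLOPS/W at a 300 W budget gives a 12 TFLOPS ceiling;
# at 45 GFLOPS/mm^2, that performance needs roughly 267 mm^2 of die.
perf = flagship_perf(40, 300)       # 12000 GFLOPS
area = die_size_needed(perf, 45)    # ~266.7 mm^2
print(perf, round(area, 1))
```

Raising perf/watt lifts the performance ceiling at a fixed 300W; raising perf/mm² shrinks the die (and cost) needed to reach it.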
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
The 290X is 3 years old, or rather it will be by the time Polaris 10 launches. It would be foolish to think they haven't done something to greatly improve performance per transistor in that time. So even though the raw transistor count won't change much, we're still talking about 3 years of progress. Especially when you consider that when the 290X launched, it brought basically no per-transistor improvements. So really we are looking at the potential closing of a 5-year gap in which AMD's performance per transistor simply hasn't gone up. It should handily beat a 390X.

If AMD can deliver the same architectural surge they did with the HD 4000 series, then it will match a 980 Ti. Given that it's been basically five years, this isn't out of the realm of possibility.

Exactly. AMD's GCN architecture has not fundamentally changed in roughly 4.5 years, in spite of incremental enhancements like color compression (Tonga). The architectural bottlenecks were one of the reasons Fiji encountered significant perf-scaling problems relative to smaller GPUs like Tahiti and Hawaii. Hawaii did not improve perf/transistor over Tahiti, and perf/transistor actually regressed with Fiji relative to Hawaii. Polaris is a significant redesign to address many of the bottlenecks and inefficiencies in the current GCN architecture. So it would not be surprising to see Polaris 10 match the 980 Ti/Fury X, as it's effectively coming after 2 GPU generations, given that the average GPU generation has lasted about 24 months.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Sucks to hear that Polaris 10 & 11 will be GDDR5, but that means we won't have to eat the cost of HBM. If I can get 2.5 to 3x the performance of my R9 270 in the same kind of overall package for under $200, I think I'll be happy.

As cool as 4K would be, right now, I think it's just best for me to focus on "perfecting" my 1080p experience (60+ FPS maxed out). However, that might mean going even higher up on the graphics card totem pole......
 

MrTeal

Diamond Member
Dec 7, 2003
3,911
2,677
136
Sucks to hear that Polaris 10 & 11 will be GDDR5, but that means we won't have to eat the cost of HBM. If I can get 2.5 to 3x the performance of my R9 270 in the same kind of overall package for under $200, I think I'll be happy.

As cool as 4K would be, right now, I think it's just best for me to focus on "perfecting" my 1080p experience (60+ FPS maxed out). However, that might mean going even higher up on the graphics card totem pole......

I would say there's absolutely 0 chance of that happening. 2.5 - 3 times the performance of your 270 would be more than a 390X, and we're not getting that for $200 any time soon.
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
I would say there's absolutely 0 chance of that happening. 2.5 - 3 times the performance of your 270 would be more than a 390X, and we're not getting that for $200 any time soon.

It's hard to say where a cut-down Polaris 10 would slide into the pricing scheme. If you believe what Raja and Huddy have been saying, that they are going straight for the VR recommended spec at a lower price point, it's reasonable to think they could hit 390X performance at around $250.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
I think $250 is absolutely doable for a 4 GB product, considering the 14nm die size savings, the smaller memory bus (if it really is 256 or 192 bit), maturity of dense GDDR5 modules and less substantial PCB, power, and cooling hardware than would be needed for an R9 390/R9 390X graphics card.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
It is, but it likely won't happen until Vega hits.

Then this stuff will be the entry-level cards, and there's a limit on how much they can charge for those!
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
That would be amazing.

Hopefully you are being sarcastic? Hot deals on 290s and 970s have already hit this mark. If you couldn't get 390X performance at or around that price next gen, it would be a total failure.
 

MrTeal

Diamond Member
Dec 7, 2003
3,911
2,677
136
I think $250 is absolutely doable for a 4 GB product, considering the 14nm die size savings, the smaller memory bus (if it really is 256 or 192 bit), maturity of dense GDDR5 modules and less substantial PCB, power, and cooling hardware than would be needed for an R9 390/R9 390X graphics card.

A 390X is still a $400 card even after MIRs, consumes a good bit of power, and is quite large. A 14nm version with similar performance at launch will consume much less power, be smaller, include much better codec support and HDMI 2.0, etc. It would also include all the architectural advantages that will create and widen a performance gap over time. Nothing but advantages. You're not going to get all that and a $200 price drop. Even $250 isn't likely to happen.

Case in point, when the 7850 and 7870 were launched they were about 5-10% faster than the outgoing 6950 and 6970. They were also $10 more expensive. When the 6970 first launched, it was a good 20% or so faster than the 5870, but it launched at $370 at a time when you could pick up a 5870 for $250. In recent history, you don't see a big jump in perf/$ from the new generation compared to the outgoing sale prices on the existing generation. We might get lucky and see P10 come in at the $300 mark, but I would be surprised. A cut down P10 might hit that level. Either way, it's still a pretty massive increase in perf/$ relative to the launch price of Hawaii at $550.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Hopefully you are being sarcastic?

A 390x for $250 that uses WAY WAY less power than a current 390x? Sign me up. That would leave me PSU room to get another in the future and crossfire it, something I couldn't do with a 390x.

I don't consider the 970 on the same level as a 390x, it certainly hasn't aged as well.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
I will not be shocked if Polaris 10/Ellesmere is actually the smaller one.

AMD brought a lot of confusion with this naming.
 
Feb 19, 2009
10,457
10
76
So all the talk about bringing VR down in cost and giving gamers a lot of perf/$ isn't just hype. Of course, as soon as Raja said that, it should have been plain obvious that Polaris is meant to be cheap and hence GDDR5-only.

I think I understand what's going on when they demo Hitman at 1440p, outside looking into the boat scene with a ton of NPCs hidden.

That, people, is the Primitive Discard Accelerator in action. On a 390X, that scene drops below 60 fps easily (I know, I have the game). But Polaris doesn't even take that geometry into its pipeline to process and then Z-cull; it discards it before rendering even occurs. The result is huge improvements to minimum FPS when scene complexity is bottlenecking the engine.

It may well only be 390X-class performance normally, but it will have better minimum FPS in GPU-bound scenarios. That's what they played, those tricksters at AMD...
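As a rough software analogy (not AMD's actual hardware logic), primitive discard amounts to rejecting triangles that cannot produce any visible pixels before the rest of the pipeline ever sees them:

```python
# Rough software analogy for primitive discard: triangles that can't
# produce visible pixels are dropped before the pipeline processes them.
# (Hypothetical 2D screen-space triangles; real hardware operates on
# post-transform vertex data.)

def signed_area(tri):
    """Twice-signed area test via the 2D cross product."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    return 0.5 * ((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0))

def discard_primitives(triangles):
    """Keep only front-facing (CCW) triangles with nonzero area."""
    return [t for t in triangles if signed_area(t) > 0]

tris = [
    [(0, 0), (4, 0), (0, 4)],   # front-facing: kept
    [(0, 0), (0, 4), (4, 0)],   # back-facing (CW winding): discarded
    [(1, 1), (2, 2), (3, 3)],   # zero area (degenerate): discarded
]
print(len(discard_primitives(tris)))  # 1
```

In a scene full of occluded or degenerate geometry, most triangles never reach shading, which is why the minimum-FPS gain shows up exactly when geometry throughput is the bottleneck.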

Just like their perf/W comparison at a 60 fps vsync lock. They would have massively improved power gating, so if the GPU isn't running at max load, power usage drops a lot.

However, the curious thing is why there are so few SPs; a 232mm² Polaris 10 at 14nm FinFET density could easily fit many more than that, unless they have really changed the design so that each SP takes many more transistors.
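A back-of-envelope check of that density argument, using rough public figures for Hawaii and an assumed roughly 2x density gain from 28nm to 14nm FinFET (the scaling factor is an assumption, not a confirmed number):

```python
# Back-of-envelope transistor-budget estimate. The Hawaii figures are
# rough public numbers; the 2x density gain for 14nm FinFET over 28nm
# is an assumption for illustration.

hawaii_transistors = 6.2e9                          # Hawaii, 28nm
hawaii_area_mm2 = 438
density_28nm = hawaii_transistors / hawaii_area_mm2  # ~14.2M / mm^2
density_14nm = density_28nm * 2.0                    # assumed scaling

polaris10_area_mm2 = 232
budget = density_14nm * polaris10_area_mm2
print(f"~{budget / 1e9:.1f}B transistors fit in {polaris10_area_mm2} mm^2")
```

Under these assumptions a 232mm² die holds a Hawaii-class transistor budget, so a low SP count would imply each SP (or the surrounding uncore) got substantially bigger.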
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Hopefully you are being sarcastic? Hot deals on 290s and 970s have already hit this mark. If you couldn't get 390X performance at or around that price next gen, it would be a total failure.

Those are deals on cards that are 18 months old (more for the 390)! This stuff will be cheap eventually, but when it first launches these will also be essentially AMD's only graphics cards.
(There's far too large a jump from the die shrink to keep their old cards on sale in any sane sense.)

So there will be premiums on them, especially the larger ones. How much, and where, will of course depend on what NV does and how everything compares.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
However, the curious thing is why there are so few SPs; a 232mm² Polaris 10 at 14nm FinFET density could easily fit many more than that, unless they have really changed the design so that each SP takes many more transistors.

It's the cut-down version. The full chip has 2560 or 2668 SPs.
 

jpiniero

Lifer
Oct 1, 2010
16,392
6,866
136
That must be why they demoed Hitman at 1440p; it's probably got higher clocks and more ROPs to improve the fill rate. That and the architecture improvements will have to be enough to make up the difference. The performance at 1080p is probably relatively lower.

Either way do not expect any favors on pricing. It'll just have to settle in where nVidia goes.

If there is any truth to this, another FAILED.

Need more power, not less wattage.

If AMD is going to get a decent amount for RTG, they are going to need Polaris to be a hit (meaning reclaiming the share they've lost over the past 2 years). For that they need OEM deals, and to get OEM deals they need lower wattage.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
A 390x for $250 that uses WAY WAY less power than a current 390x? Sign me up. That would leave me PSU room to get another in the future and crossfire it, something I couldn't do with a 390x.

I don't consider the 970 on the same level as a 390x, it certainly hasn't aged as well.

Why not just buy a bigger PSU? Is the cost of wattage that prohibitive?
 
Feb 19, 2009
10,457
10
76
Why not just buy a bigger PSU? Is the cost of wattage that prohibitive?

Probably the heat, too. 2x 390X is going to warm your room up, and unless you live somewhere cold, that's not exactly a pro.

2x Polaris 10 would make for an awesome high perf and low power build.
 

stuff_me_good

Senior member
Nov 2, 2013
206
35
91
I don't get it. According to every "GPU specialist" in this forum, everything seems to indicate that with Polaris AMD is bringing only current-gen performance with a massive power reduction. Surely they are not all so stupid at AMD as to think gamers will upgrade only to get the same performance with less power draw?

Come on, surely not everyone here is so stupid as to think all those new cards will get no performance benefit from the improved architecture?
 
Feb 19, 2009
10,457
10
76
I don't get it. According to every "GPU specialist" in this forum, everything seems to indicate that with Polaris AMD is bringing only current-gen performance with a massive power reduction. Surely they are not all so stupid at AMD as to think gamers will upgrade only to get the same performance with less power draw?

Come on, surely not everyone here is so stupid as to think all those new cards will get no performance benefit from the improved architecture?

Folks on the high end aren't the target upgraders for Polaris 10. It's two very small chips, 120mm² and 232mm², and these classes are entry-level and low-midrange.

Typically in the past, mid-range chips were ~300mm2 or even larger.

Folks on high-end 28nm, will have Vega 10 and 11 to look forward to.

Vega 11 should be the upper-midrange, ~400mm2 (my guess). Vega 10 the biggest one.
 

Adored

Senior member
Mar 24, 2016
256
1
16
I don't get it. According to every "GPU specialist" in this forum, everything seems to indicate that with Polaris AMD is bringing only current-gen performance with a massive power reduction. Surely they are not all so stupid at AMD as to think gamers will upgrade only to get the same performance with less power draw?

Come on, surely not everyone here is so stupid as to think all those new cards will get no performance benefit from the improved architecture?

A lot of people don't seem to realise that 28nm was a very different node compared to all others in the past.

Historically you would expect a new node to be up against cards like the 780 Ti or GTX 580. But look at 28nm: what other node in history lasted so long and saw both companies push die sizes to 600mm², with one delivering a completely new architecture (Maxwell) while the other introduced a completely new type of memory (HBM)?

If AMD or Nvidia can beat the current generation with their midrange it'll be because we're seeing their A-game and then some.