AMD Polaris Thread: Radeon RX 480, RX 470 & RX 460 launching June 29th

Status
Not open for further replies.

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Looks good. I expect performance to be in line with the 390X at a significantly reduced price.

I do think that there will be significant competition with GP106 assuming Nvidia includes a significant number of SPs on die.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
I'm surprised more people aren't upset AMD aren't going for more performance. We've had this performance before, even close to the same TDP. It is nothing new. Sure, they are lowering prices, but that is happening across the board. Seems like this is an upgrade for those with 380X and below.

Well, we've been hearing for a while that this was going to be their mainstream level, and VEGA is supposed to be their enthusiast stuff, so I don't think anyone is really surprised or upset. Though the price is a bit of a surprise.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Looks good. I expect performance to be in line with the 390X at a significantly reduced price.

I do think that there will be significant competition with GP106 assuming Nvidia includes a significant number of SPs on die.

Yeah, agreed.

GP106 will have its work cut out for it if it is 1/2 the 1080 (as rumored). The RX 480 likely will be a better option IMHO...
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
As an enthusiast I am not really qualified to criticize AMD's design choice but damn! Why not target a 300mm2 die size for the mid-range Polaris 10? Polaris 11 could still have been kept the same for laptops, but this chip is going into desktops, and GloFo was already getting good yields based on Samsung's results... I just think AMD undershot the sweet spot here. A ~3200sp part with the same 8-pin connector as the 1070 would have been a sweet card, and they could have easily charged $400 for it and still undercut NV.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
As an enthusiast I am not really qualified to criticize AMD's design choice but damn! Why not target a 300mm2 die size for the mid-range Polaris 10? Polaris 11 could still have been kept the same for laptops, but this chip is going into desktops, and GloFo was already getting good yields based on Samsung's results... I just think AMD undershot the sweet spot here. A ~3200sp part with the same 8-pin connector as the 1070 would have been a sweet card, and they could have easily charged $400 for it and still undercut NV.

Well, yes they have undershot vs the optimal strategy (see what the massively well-resourced competition is doing), but AMD aren't in a position to do optimal things just now.

They're heavily resource/finance constrained and needed to do these chips for the console (and other custom stuff) deals.

Also can't be sure quite what that 14nm process at GF can cope with for the moment. Moving the process from Samsung to GF isn't a totally trivial matter :) Enormously complex things.

Just be glad they're there giving competition I think.

GP106 is going to be its own chip - we've even seen it in those drive modules. Going by the gap to the 1070, you'd presume it'll offer similar performance to this stuff.

The relative power draws will be interesting to see once reviews are out. Not critical for specifically these chips but AMD really do need to be plausibly close or it'll be a big problem for other places.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
The video shows concurrent & therefore asynchronous execution of compute and graphics tasks. How the HW decides to schedule tasks, and where, is an implementation detail. The fact that AMD might be better than NVIDIA at this is irrelevant; both architectures provide better utilization of the available computational resources by scheduling independent compute and graphics tasks for execution at the same time.

The video makes absolutely zero indication of whether or not things are running concurrently; to be able to see that we would need a screengrab of GPUView or similar. It may or may not run concurrently, but the video doesn't make it clear.

The fact that AMD might be better is obviously not irrelevant, I don't even know how to respond to that.

And I never said Pascal didn't implement async compute in a way that gives them better utilization (dynamic load balancing is obviously all about achieving better utilization), I said Pascal implements it in a way that doesn't allow concurrent graphics and compute within a single SM.

Last time I checked, a HW vendor doesn't get to dictate how a feature is best implemented on their competitors' architecture. By the same reasoning, AMD doesn't support tessellation or color compression because they are not very good at them?

I have no idea where this weird strawman comes from?

And I never said that Nvidia doesn't support async compute, for the umpteenth time I said that they don't support running graphics and compute concurrently within a single SM (something that AMD does using async compute).

I like the facts very much, thank you. I am certainly not the one in denial who makes up definitions of what it means to concurrently run independent workloads on a massively parallel computer architecture. Give me a break, the case is closed. Stop propagating AMD FUD and move on.

But apparently you don't like reading what people actually write. I never said that Nvidia can't run jobs concurrently (be it graphics or compute), I specifically said that they can't run graphics and compute concurrently within a single SM.

If all you have is strawmen and empty accusations of FUD, then it's pretty obvious who's in denial.
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
I've been wondering if this might happen.

So many speak about the Halo effect and for sure, it exists. I have seen little mention of an "Ascension effect", where a high-performance lower-class product makes us anticipate higher-end follow-ups.

Just imagine what is coming with Vega if we get this with Polaris.
I don't see why it wouldn't happen based on how they are pricing their entry card. I'm not going to wait for Vega as I want to upgrade my card soon, but I'm sure they will offer something priced nicely to compete with the 1080 by June 29th.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I don't see why it wouldn't happen based on how they are pricing their entry card. I'm not going to wait for Vega as I want to upgrade my card soon, but I'm sure they will offer something priced nicely to compete with the 1080 by June 29th.

What!?

Pretty sure AMD would have presented this if this was the case. 2x RX 480 will be the 1080 competitor until Vega...
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
What!?

Pretty sure AMD would have presented this if this was the case. 2x RX 480 will be the 1080 competitor until Vega...
I don't know, so that's it? All they are offering is the 480 at different memory capacities? What if people don't want to run CrossFire? I never intend to do that. But I'm sure others, myself included, want to upgrade now, not wait for 2017.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I don't know, so that's it? All they are offering is the 480 at different memory capacities? What if people don't want to run CrossFire? I never intend to do that. But I'm sure others, myself included, want to upgrade now, not wait for 2017.

I don't disagree that a ~3200sp part may be on the horizon, but I would be blown away if that launched this month on the 29th with the RX 480. If that was the case, I am sure they would have given that product some limelight in the presentation we saw yesterday.
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
I don't disagree that a ~3200sp part may be on the horizon, but I would be blown away if that launched this month on the 29th with the RX 480. If that was the case, I am sure they would have given that product some limelight in the presentation we saw yesterday.
I'm planning on upgrading by the end of June / early July. It just sucks that I'm always waiting every time I want to change parts lol
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Well, yes they have undershot vs the optimal strategy (see what the massively well-resourced competition is doing), but AMD aren't in a position to do optimal things just now.

They're heavily resource/finance constrained and needed to do these chips for the console (and other custom stuff) deals.

Also can't be sure quite what that 14nm process at GF can cope with for the moment. Moving the process from Samsung to GF isn't a totally trivial matter :) Enormously complex things.

Just be glad they're there giving competition I think.

GP106 is going to be its own chip - we've even seen it in those drive modules. Going by the gap to the 1070, you'd presume it'll offer similar performance to this stuff.

The relative power draws will be interesting to see once reviews are out. Not critical for specifically these chips but AMD really do need to be plausibly close or it'll be a big problem for other places.

You may be right, Polaris 10 may be the exact size/specification that Sony/Apple/Microsoft/Nintendo wanted in their respective designs so that's what AMD decided to build and then also leverage in the desktop GPU market.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
Seeing a lot of disappointment in different forums. That has to do with the huge expectations around this card, many expecting Fury X+ all-around performance. Of course we have yet to see numbers, but GeForce GTX 970/980 performance (told to the press at the Macau event, according to Videocardz) is not out of reach for GP106, especially if rumours of a 192-bit 6GB card replacing GM204 are correct.
 

renderstate

Senior member
Apr 23, 2016
237
0
0
The video makes absolutely zero indications of whether or not things are running concurrently, to be able see that we would need a screengrab of GPUview or similar. It may or it may not run concurrently, but the video doesn't make it clear.
So how is the app gaining 15-20% performance if not by concurrently running tasks? Moreover, NVIDIA showed diagrams in the same session with the press to illustrate concurrent execution of graphics and compute. The demo is just a follow-up to that. The whole video is on YouTube.

Regarding perf improvements, imagine AMD's new architecture is much better at filling their cores with gfx tasks (since parts like Fury X demonstrate they are not very good at that on large configurations). This means the opportunities for concurrently scheduling compute and gfx workloads are reduced, and therefore you might see a lower perf improvement from async compute on their newer GPUs! If that happens, are you suddenly going to say their implementation sucks, or will you congratulate them on fixing a major issue of their architecture?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I don't see why it wouldn't happen based on how they are pricing their entry card. I'm not going to wait for Vega as I want to upgrade my card soon, but I'm sure they will offer something priced nicely to compete with the 1080 by June 29th.

If you have a 1080p 60Hz monitor, you have 2 solid options. Get a cheap 480, use it for 1-1.5 years and upgrade in 2017 when real big flagships come out. Another option is to get a 1070 and keep it for 3 years. Most important for 1080p 60Hz gamers is to look at FPS, not just % charts. At this resolution one card could be 50% faster, but it could be 90 FPS vs. 60 FPS. We could also see more leaks on the 1060/1060Ti over the next month. If you are eyeing a monitor upgrade over the next 1-2 years, then FreeSync vs. GSync has to be factored in by now. We are seeing 4K IPS FreeSync monitors down to <$400. RX480 and 1070 won't last 5-8 years - a good monitor could. Based on the # of people in the world still using 1080p or lower, it seems 5-8 years useful life out of a monitor may actually not be too far fetched.

Imo, NV FE cards are overpriced so you'd want a 1070 AIB card. Wait until June 29th to see reviews of 480 vs. 1070, see where FPS land for 1080p and take your pick. At the very least you should have more 1070 AIB options to choose from. Another factor is we may or may not see 480 AIB cards on June 29th. I still would not recommend a $400+ card for 1080p 60Hz gaming right now. Until the 1070 came out, who on this forum praised/recommended 980Ti/Fury X/Titan X for 1080p 60Hz? If you want to upgrade once and keep the card for 4 years like you kept your 7970Ghz, then probably a 1070 makes more sense if it's at least 40% faster for the price premium. If it's only 25-28% faster, that's a heck of a lot of $ to pay. Selling your 7970Ghz now for $80-100 should be easy as miners will buy it. That would make the 1070 a $300 upgrade or so. All depends on your budget.
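To put hypothetical numbers on the FPS-vs-% point, here's a quick Python sketch (the frame rates are made up purely for illustration; nothing here is a benchmark of either card):

```python
# Toy illustration: a "50% faster" card means different things
# depending on the absolute frame rates involved.

def frame_time_ms(fps):
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

def compare(fps_slow, fps_fast):
    """Return (% speedup, milliseconds saved per frame)."""
    speedup = (fps_fast / fps_slow - 1) * 100
    saved = frame_time_ms(fps_slow) - frame_time_ms(fps_fast)
    return speedup, saved

# 60 vs 90 FPS: 50% faster, ~5.6 ms shaved off every frame
speedup, saved = compare(60, 90)
print(f"{speedup:.0f}% faster, {saved:.2f} ms less per frame")

# 120 vs 180 FPS: also "50% faster", but only ~2.8 ms per frame,
# and both rates already exceed a 60 Hz monitor's refresh anyway.
speedup, saved = compare(120, 180)
print(f"{speedup:.0f}% faster, {saved:.2f} ms less per frame")
```

Same percentage, very different real-world impact, which is why absolute FPS matters more than % charts at 1080p 60Hz.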
 
Last edited:

sze5003

Lifer
Aug 18, 2012
14,320
683
126
If you have a 1080p 60Hz monitor, you have 2 solid options. Get a cheap 480, use it for 1-1.5 years and upgrade in 2017 when real big flagships come out. Another option is to get a 1070 and keep it for 3 years. Most important for 1080p 60Hz gamers is to look at FPS, not just % charts. At this resolution one card could be 50% faster, but it could be 90 FPS vs. 60 FPS. We could also see more leaks on the 1060/1060Ti over the next month. If you are eyeing a monitor upgrade over the next 1-2 years, then FreeSync vs. GSync has to be factored in by now. We are seeing 4K IPS FreeSync monitors down to <$400. RX480 and 1070 won't last 5-8 years - a good monitor could. Based on the # of people in the world still using 1080p or lower, it seems 5-8 years useful life out of a monitor may actually not be too far fetched.
Very good advice as usual. I may be going to a 1440p monitor by the end of the summer. Depends on how much gaming I'll have time to do on the PC. For now I guess I'll wait a bit more into July and either get a 1070 or just see what else happens.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
So how is the app gaining 15-20% performance if not by concurrently running tasks? Moreover, NVIDIA showed diagrams in the same session with the press to illustrate concurrent execution of graphics and compute. The demo is just a follow-up to that. The whole video is on YouTube.

Regarding perf improvements, imagine AMD's new architecture is much better at filling their cores with gfx tasks (since parts like Fury X demonstrate they are not very good at that on large configurations). This means the opportunities for concurrently scheduling compute and gfx workloads are reduced, and therefore you might see a lower perf improvement from async compute on their newer GPUs! If that happens, are you suddenly going to say their implementation sucks, or will you congratulate them on fixing a major issue of their architecture?

Concurrently across the GPU sure*, concurrently within individual SMs no. And this is the whole point of AMD's claims (which you were calling lies).

As for your later point, I would congratulate them obviously. AMD "fixing" their utilization doesn't mean that their ability to do concurrent graphics and compute within a CU suddenly sucks, it just means they would gain less from it, but since no modern architecture (including Nvidia's) will ever achieve 100% utilization, every little bit helps.

*Again the video doesn't actually show it, and there are tons of different ways to achieve a 15-20% boost from improved scheduling that doesn't involve concurrency, especially in synthetic tests.
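To illustrate the utilization point with a deliberately crude toy model (this is not how any real GPU schedules work; the unit counts and timings are invented, it's just arithmetic showing why filling idle units with independent compute shrinks total time):

```python
# Toy model: a machine with 8 "units" runs a graphics pass that only
# keeps 5 of them busy. Serially, compute runs afterwards; with
# overlap, compute fills the 3 idle units during the graphics pass.

UNITS = 8

def serial_time(gfx_busy_units, gfx_duration, compute_work):
    # Graphics first, then compute spread across all units.
    return gfx_duration + compute_work / UNITS

def async_time(gfx_busy_units, gfx_duration, compute_work):
    idle = UNITS - gfx_busy_units
    # Compute that fits into the idle units during graphics overlaps "for free".
    overlapped = min(compute_work, idle * gfx_duration)
    leftover = compute_work - overlapped
    return gfx_duration + leftover / UNITS

gfx_busy, gfx_dur, comp = 5, 10.0, 24.0
print(serial_time(gfx_busy, gfx_dur, comp))  # 13.0 time units, back to back
print(async_time(gfx_busy, gfx_dur, comp))   # 10.0 time units, overlapped
```

Note the gain comes entirely from filling otherwise-idle units; if the graphics pass already saturates all 8 units, the two schedules take identical time, which is exactly the point about better baseline utilization reducing async gains.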
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Seeing a lot of disappointment in different forums. That has to do with the huge expectations around this card, many expecting Fury X+ all-around performance. Of course we have yet to see numbers, but GeForce GTX 970/980 performance (told to the press at the Macau event, according to Videocardz) is not out of reach for GP106, especially if rumours of a 192-bit 6GB card replacing GM204 are correct.

Almost no one who took the time to carefully study AMD's statements and Polaris 10 specs expected Fury X, and especially not Fury X+, levels of performance out of a <240mm2 die, 2304 shaders with a 256-bit bus, back then rumoured 1.05Ghz-1.1Ghz clocks, and regular GDDR5. The only people who did were those who never followed the leaks/news carefully, or those who were dreamers. Even then I would say some of them were realistic and expected the higher-end 175W 2560-shader Polaris 10 to be Fury X+. Very few people made such statements about a cut-down P10. The performance is exactly where many of us predicted it would land. The price is the surprising part.

This card was never meant to be an upgrade for 970/290 or above users. Even if the 1060 shows up, the $249 480 version will more likely than not have better DX12 performance. There are rumors the 1060 won't even launch until August. By the time it comes out, we may even have $229 480 8GB parts. Either way, NV loyalists will always either wait for NV's cards in their budget or will pay more for similar or less performance. AMD isn't after those customers. It's unfortunate 1070/1080 aren't getting higher-end competition to help drive prices down and give consumers more choices, but if Vega was designed with HBM2, they have no choice.

Kyle's rant at H now looks stupid as it's crystal clear the RX 480 was specifically designed from day 1 to hit $199-249 price levels.
 
Last edited:

Denly

Golden Member
May 14, 2011
1,436
229
106
They will do wonders for companies hanging on to their older workstations with old Quadros like the 2000/4000 or some normal video card (a lot of companies use regular cards for cost reasons).

The Dell T5500/5600/7500/7600 and HP Z610/620/810/820 offer dual 150W slots I think, and P10 will give them new life.

A half-height P11 will also give corporate SFF machines on those older 1st/2nd-gen i3/i5s new life.
 
Last edited:

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
If you have a 1080p 60Hz monitor, you have 2 solid options. Get a cheap 480, use it for 1-1.5 years and upgrade in 2017 when real big flagships come out. Another option is to get a 1070 and keep it for 3 years. Most important for 1080p 60Hz gamers is to look at FPS, not just % charts. At this resolution one card could be 50% faster, but it could be 90 FPS vs. 60 FPS. We could also see more leaks on the 1060/1060Ti over the next month. If you are eyeing a monitor upgrade over the next 1-2 years, then FreeSync vs. GSync has to be factored in by now. We are seeing 4K IPS FreeSync monitors down to <$400. RX480 and 1070 won't last 5-8 years - a good monitor could. Based on the # of people in the world still using 1080p or lower, it seems 5-8 years useful life out of a monitor may actually not be too far fetched.

Imo, NV FE cards are overpriced so you'd want a 1070 AIB card. Wait until June 29th to see reviews of 480 vs. 1070, see where FPS land for 1080p and take your pick. At the very least you should have more 1070 AIB options to choose from. Another factor is we may or may not see 480 AIB cards on June 29th. I still would not recommend a $400+ card for 1080p 60Hz gaming right now. Until the 1070 came out, who on this forum praised/recommended 980Ti/Fury X/Titan X for 1080p 60Hz? If you want to upgrade once and keep the card for 4 years like you kept your 7970Ghz, then probably a 1070 makes more sense if it's at least 40% faster for the price premium. If it's only 25-28% faster, that's a heck of a lot of $ to pay. Selling your 7970Ghz now for $80-100 should be easy as miners will buy it. That would make the 1070 a $300 upgrade or so. All depends on your budget.

Totally agree.

For anyone using a 1080P/60hz setup, this will be THE option for the next few months. Unless you can snag a used 390/390X for <$150, this will be perfect for those users' needs. :)
 

DamZe

Member
May 18, 2016
188
84
101
Overall I am very pleased by this announcement; 980/390X performance for $200 is unprecedented, and AMD really needs this. One can only hope now that people start buying more RX 480s; it would introduce some well-needed competition in the discrete GPU market, win back some market share for AMD and make NVidia sweat a little, because let's face it, the price gouging on these "high-end" non-big-die parts from NVidia was solely due to AMD not having the performance/watt metrics it needed. Now having improved on this aspect, AMD can potentially compete in the perf/watt segment again, finally forcing NVidia to lower their prices to compete. GCN has always intrigued me, but its power consumption was way off versus what NVidia was offering, so I'm glad AMD can now offer a very efficient card.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
So how is the app gaining 15-20% performance if not by concurrently running tasks? Moreover, NVIDIA showed diagrams in the same session with the press to illustrate concurrent execution of graphics and compute. The demo is just a follow-up to that. The whole video is on YouTube.

Regarding perf improvements, imagine AMD's new architecture is much better at filling their cores with gfx tasks (since parts like Fury X demonstrate they are not very good at that on large configurations). This means the opportunities for concurrently scheduling compute and gfx workloads are reduced, and therefore you might see a lower perf improvement from async compute on their newer GPUs! If that happens, are you suddenly going to say their implementation sucks, or will you congratulate them on fixing a major issue of their architecture?

None of this is on topic for this thread. We do have a thread for this subject already if you want to continue the discussion there. I've replied to your quote there. http://forums.anandtech.com/showthread.php?t=2467957

On topic. AMD is showing good business sense here. Play to the vast majority of the market. I don't know what the VR take rate is going to be, but regardless the price to performance of the RX 480 is great. These should sell extremely well.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
5,204
5,614
136
Almost no one who took the time to carefully study AMD's statements and Polaris 10 specs expected Fury X, and especially not Fury X+, levels of performance out of a <240mm2 die, 2304 shaders with a 256-bit bus, back then rumoured 1.05Ghz-1.1Ghz clocks, and regular GDDR5. The only people who did were those who never followed the leaks/news carefully, or those who were dreamers. Even then I would say some of them were realistic and expected the higher-end 175W 2560-shader Polaris 10 to be Fury X+. Very few people made such statements about a cut-down P10. The performance is exactly where many of us predicted it would land. The price is the surprising part.

This card was never meant to be an upgrade for 970/290 or above users. Even if the 1060 shows up, the $249 480 version will more likely than not have better DX12 performance. There are rumors the 1060 won't even launch until August. By the time it comes out, we may even have $229 480 8GB parts. Either way, NV loyalists will always either wait for NV's cards in their budget or will pay more for similar or less performance. AMD isn't after those customers. It's unfortunate 1070/1080 aren't getting higher-end competition to help drive prices down and give consumers more choices, but if Vega was designed with HBM2, they have no choice.

Kyle's rant at H now looks stupid as it's crystal clear the RX 480 was specifically designed from day 1 to hit $199-249 price levels.
Looking at the 8GB RX480 as a $249 model max, don't you think that there will be a gaping price hole in the product stack if small Vega is HBM2?


  1. Small Vega uses GDDR5X, with HBM2 solely for big Vega. If this is true, then we can expect small Vega very soon, as there would be no memory supply issue.
  2. Small Vega uses HBM2. If this is true, there is probably an additional Polaris 10 model with higher performance. More shaders and higher clocks?
My impression is that AMD is very focused, probably for the first time in years, so no gaping holes in the product lines.


I couldn't help but go see what's up with H. He's actually still defending his piece and insulting posters as normal.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Looking at the 8GB RX480 as a $249 model max, don't you think that there will be a gaping price hole in the product stack if small Vega is HBM2?


  1. Small Vega uses GDDR5X, with HBM2 solely for big Vega. If this is true, then we can expect small Vega very soon, as there would be no memory supply issue.
  2. Small Vega uses HBM2. If this is true, there is probably an additional Polaris 10 model with higher performance. More shaders and higher clocks?
My impression is that AMD is very focused, probably for the first time in years, so no gaping holes in the product lines.


I couldn't help but go see what's up with H. He's actually still defending his piece and insulting posters as normal.

Very likely.

#1 if Vega launches in 2016/VERY early 2017.
#2 if Vega is late 1Q2017 or later. (higher P10 model launched later in 2016)

Just a guess...

A higher-SKU P10 may get GDDR5X, or maybe we'll even see a refresh of P10 next year with GDDR5X (an RX 570 or something)
 

flopper

Senior member
Dec 16, 2005
739
19
76
Kyle's rant at H now looks stupid as it's crystal clear the RX 480 was specifically designed from day 1 to hit $199-249 price levels.

Kyle needs a doctor; he seems to have ADHD, which does strange things to your head and perception.
I worked with a few.

It seems obvious AMD has everything up to and beyond $500 covered with the 480.
 