AMD Vega (FE and RX) Benchmarks [Updated Aug 10 - RX Vega 64 Unboxing]


plopke

Senior member
Jan 26, 2010
Out of curiosity, does the Vega Frontier card actually look like a good card for what it is aimed at? Will people want to buy these instead of other AMD pro cards or Quadros, so AMD can actually make money off of them?

I got off the Vega hype train half a year ago. Before the mining explosion I was quite pleased with the price/performance of the RX 470/570, and I'm still hoping AMD can compete up to a 1070 at least on price/performance; I never was expecting anything close to a 1080 Ti.
 

OatisCampbell

Senior member
Jun 26, 2013
I'm just posting verbatim what the guy said in the stream. I'm not manipulating anything. That is the state of Vega FE for gaming today, 29/06/17. I suppose the results for the NV cards were taken on the same test bench (5960X + 1500 W PSU + open-air environment with a crapload of air for the cards to work their best).

He also said he spoke with AMD over the phone BEFORE showing that on the stream, and they were OK with it... and that he'd have it posted on his site by tomorrow.

AMD was OK with these results.

They know it is this bad.

I don't know why everyone is getting so angry about Vega.

If RX Vega is between a 1070 and 1080 for $450 it will sell. (And be a good deal) Most people don't buy $700 cards.
 

Armsdealer

Member
May 10, 2016
I don't know why everyone is getting so angry about Vega.

If RX Vega is between a 1070 and 1080 for $450 it will sell. (And be a good deal) Most people don't buy $700 cards.

The more interesting entrant is the 56-compute-unit part. It should have over 90% of the actual performance of the 64-CU part. If they sell it at $299 it's really a bargain.
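As a rough sanity check on that 90% figure, here is a minimal sketch assuming the cut-down part keeps similar clocks; actual scaling depends on clocks and bottlenecks:

```python
# Rough check of the cut-down part's expected performance (assumed equal clocks).
full_cus, cut_cus = 64, 56
shader_ratio = cut_cus / full_cus   # 0.875 -> floor if performance scaled 1:1 with CU count
print(f"Shader ratio: {shader_ratio:.1%}")
# Games rarely scale linearly with CU count, so with similar clocks the cut part
# landing above 90% of the full part's performance is plausible, as claimed above.
```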
 

Saylick

Diamond Member
Sep 10, 2012
The more interesting entrant is the 56-compute-unit part. It should have over 90% of the actual performance of the 64-CU part. If they sell it at $299 it's really a bargain.
I am not sure they can sell it at $299 considering that would place a lot of downward pressure on RX580 pricing, which itself is already a relatively low margin product (outside of the mining bubble). Also, not sure how much HBM2 costs, but these dies can't be cheap to produce (material, R&D, marketing, and shipping costs inclusive).

I think RX Vega will end up being somewhere midway between 1080 and 1080 Ti for 1080 price (~$500). The cut down RX Vega will be 1080 performance at 1070 price (~$400).
 

Armsdealer

Member
May 10, 2016
I am not sure they can sell it at $299 considering that would place a lot of downward pressure on RX580 pricing, which itself is already a relatively low margin product (outside of the mining bubble). Also, not sure how much HBM2 costs, but these dies can't be cheap to produce (material, R&D, marketing, and shipping costs inclusive).

Cost of each Vega unit = (sunk cost / n) + cost to manufacture each unit

where n is the number of units sold. The sunk costs on Vega are R&D, marketing, etc., and probably amount to $300-500 million. Pascal, by comparison, cost "billions", enough to "go to Mars" (JHH hyperbole).

The point being that AMD, when you recognize how large the fixed costs are, HAS to move product just to minimize losses. They almost certainly won't make money on Vega, given it's already a year late. No surprise really... AMD hasn't been profitable as an enterprise for a long while.

As for the manufacturing cost per unit, excluding the fixed costs, it's definitely not $300 per unit. We know this because Nvidia has a 60% gross margin on its total income statement, and I would guesstimate the higher-end cards are probably around or slightly above that number.

I think we're going to be pleasantly surprised by pricing; they only have a 6-12 month window to try to minimize losses before Volta. Moreover, the RX 580 is a $220 MSRP part. It would be silly to have a hole from $220 to $350.
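To put numbers on that amortization argument, here is a minimal sketch; the $400M sunk cost and $250 per-unit build cost are assumed figures for illustration, not AMD's actual numbers:

```python
# Hypothetical fixed-cost amortization (all figures are assumptions, not AMD data).
def cost_per_unit(sunk_cost, units_sold, unit_build_cost):
    """Average all-in cost per card: amortized fixed costs plus marginal build cost."""
    return sunk_cost / units_sold + unit_build_cost

SUNK = 400e6      # assumed R&D + marketing spend, USD
BUILD = 250.0     # assumed per-card manufacturing cost, USD

for units in (250_000, 500_000, 1_000_000, 2_000_000):
    print(f"{units:>9,} units sold -> ${cost_per_unit(SUNK, units, BUILD):,.0f} per card")
```

The more product they move, the closer the per-card cost falls toward the marginal build cost, which is the whole argument for pricing aggressively.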
 
Last edited:
  • Like
Reactions: Bacon1

french toast

Senior member
Feb 22, 2017
I am not sure they can sell it at $299 considering that would place a lot of downward pressure on RX580 pricing, which itself is already a relatively low margin product (outside of the mining bubble). Also, not sure how much HBM2 costs, but these dies can't be cheap to produce (material, R&D, marketing, and shipping costs inclusive).

I think RX Vega will end up being somewhere midway between 1080 and 1080 Ti for 1080 price (~$500). The cut down RX Vega will be 1080 performance at 1070 price (~$400).
Unfortunately Vega is close to 500 mm² with HBM2 and a not-cheap PCB/cooler; that is not going to help AMD much here, but yes, it would be great value.
 

Maverick177

Senior member
Mar 11, 2016
I wonder where AMD's PR department is now? Their PR strategy was to be community oriented, but when we need an official statement, they are nowhere to be found, lmfao. "We've upped our PR game", not really.
 

french toast

Senior member
Feb 22, 2017
I wonder where AMD's PR department is now? Their PR strategy was to be community oriented, but when we need an official statement, they are nowhere to be found, lmfao. "We've upped our PR game", not really.
Maybe there is nothing they can say? If this was a performance monster they would have revealed some gaming comparisons to stop people from buying the GTX 1080 Ti when it released; as they did not, my conclusion was not favourable.
IPC comparisons to Fury X are going to be revealing: how much has all that R&D and two years gained? Signs don't look good.
 

Bacon1

Diamond Member
Feb 14, 2016
Standard GCN behavior, fully render one triangle, render next, etc.
[Animated GIF: GCN rendering triangles one at a time]


This looks very similar to how my Fury renders it. Do you have the settings he was using there, so I can verify it?
 

amenx

Diamond Member
Dec 17, 2004
I have a feeling this card was rushed just so it would be out in H2, as if management gave them deadlines to meet. This is one of the worst product launches in history, imo. Not so much the performance of the card, but AMD's inability to get anything out of it. It's as if it was a non-launch, but with the added effect of making the competition seem more appealing.
 

Gideon

Golden Member
Nov 27, 2007
In hindsight, I think it was a mistake to use 2 stacks of HBM2 on the package. I mean, with 4 stacks they could have at least doubled the bandwidth, and lots of tasks scale very well with BW (Ethereum mining at the very least).

With 4 stacks the GPUs would get away with 4-Hi stacks as well, therefore probably clocking a bit higher. They might have reduced clocks a bit to stay in the power budget, but at least then the chip would have something going for it (undisputed BW leader). Sure, it would cost more, but at least for HPC tasks they could also charge more for it.

Obviously AMD didn't know that all their arch improvements would extract ~0% extra performance, but they should have known the ballpark. I mean, they were aware of Maxwell since at least February 18, 2014, and also of Nvidia's pipeline. They should have known by then already that they would not be competitive with a Vega that's roughly as slow as Fury.
 

Qwertilot

Golden Member
Nov 28, 2013
I am not sure they can sell it at $299 considering that would place a lot of downward pressure on RX580 pricing, which itself is already a relatively low margin product (outside of the mining bubble). Also, not sure how much HBM2 costs, but these dies can't be cheap to produce (material, R&D, marketing, and shipping costs inclusive).

I think RX Vega will end up being somewhere midway between 1080 and 1080 Ti for 1080 price (~$500). The cut down RX Vega will be 1080 performance at 1070 price (~$400).

Looks optimistic to peg cut chips at full 1080 performance. Even if they do get there they'll struggle hugely.

NV are near clockwork, so the 1160 will be roughly 1080 performance at roughly 120 W and roughly 1060 pricing. Maybe this year, definitely by next spring.

At that stage, and especially given the pretty ridiculous power draw gap? I really don't know.

Hopefully there are some compute tasks that Vega is really good at so they can get a decent return on R&D that way.
 

Bacon1

Diamond Member
Feb 14, 2016
In hindsight, I think it was a mistake to use 2 stacks of HBM2 on the package. I mean, with 4 stacks they could have at least doubled the bandwidth, and lots of tasks scale very well with BW (Ethereum mining at the very least).

They are already very close to the bandwidth of 1080 Ti / Xp with 2 stacks. 4 stacks would have been more complex and probably increased total build cost.
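The back-of-the-envelope bandwidth numbers support this (peak bandwidth is bus width times per-pin data rate; Vega FE runs its HBM2 at roughly 1.89 Gbps per pin):

```python
# Peak memory bandwidth comparison, GB/s = bus width (bits) * data rate (Gbps) / 8.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

vega_fe    = bandwidth_gbs(2 * 1024, 1.89)   # two HBM2 stacks, 1024-bit each
gtx1080ti  = bandwidth_gbs(352, 11.0)        # 352-bit GDDR5X at 11 Gbps
four_stack = bandwidth_gbs(4 * 1024, 1.89)   # hypothetical 4-stack Vega

print(f"Vega FE (2 stacks):        {vega_fe:.0f} GB/s")    # ~484 GB/s
print(f"GTX 1080 Ti:               {gtx1080ti:.0f} GB/s")  # ~484 GB/s
print(f"Hypothetical 4-stack Vega: {four_stack:.0f} GB/s") # ~968 GB/s
```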

This honestly sounds like it's not using the correct drivers, as none of the new features are working. It's acting like a 30-40% OC'd Fury.

Does anyone know where the "Beyond3d Suite" of tools is for testing memory bandwidth and such?

Techreport used it here:

http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/4
 

QualityTime

Junior Member
Jun 29, 2017
This looks very similar to how my Fury renders it. Do you have the settings he was using there, so I can verify it?
For the most part it was at the defaults, except floats per vertex (the slider a bit over halfway). However, there was some rendering initially with everything at default.
 

Bacon1

Diamond Member
Feb 14, 2016
For the most part it was at the defaults, except floats per vertex (the slider a bit over halfway). However, there was some rendering initially with everything at default.

There are 3 sliders:

Triangle Count, Percent, Floats per vertex

With those I can recreate it on my card.
 

Glo.

Diamond Member
Apr 25, 2015
[Image: Hitman benchmark]

Are Nvidia workstation GPU drivers also good at gaming while AMD's aren't and you need the dedicated gaming card for gaming?

I don't think that's the case personally.
I don't think you need the "dedicated gaming card" to get a very close look at RX Vega's performance.
I will ask you, and others, one question to ponder.

Is the software actually using the hardware features that are supposed to increase performance?

I was watching the PCPer stream before sleep, and the GPU behaved just like an OC'ed Fiji.

Even HBCC alone should add to minimum framerates. Somehow, none of the new features are apparent to the software.

Is it a problem with drivers? Unfortunately, I do not think so. Drivers may really increase performance, just as Ryan Shrout said in the stream, by 8-10% at best.

I was theorizing long ago that the current stack of software has to be rewritten to use the features of Vega. Is that true? I guess we have to wait and see. Theoretically Prey should be Vega-ready and optimized for the Vega architecture, because of the deal AMD signed with Bethesda. Someone could test this, to confirm or disprove it.

It's interesting to see a 35% higher core clock than Fury X in some games (around 1.4-1.425 GHz) in GPU-bound situations, while the performance increase over it is within the 15-20% range. Which is more than odd, considering the high-level architecture layout.
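A quick check of that gap, assuming the Fury X reference clock of 1050 MHz and taking the midpoint of the quoted 15-20% gain:

```python
# Clock scaling vs. observed gain of Vega FE over Fury X (assumed/quoted figures).
fury_x_clock = 1050    # MHz, Fury X reference clock
vega_fe_clock = 1400   # MHz, in-game clock reported in the stream

ideal_gain = vega_fe_clock / fury_x_clock - 1   # ~33% if performance scaled 1:1 with clock
observed_gain = 0.175                           # midpoint of the 15-20% range quoted above

perf_per_clock = (1 + observed_gain) / (1 + ideal_gain)
print(f"Ideal clock-for-clock gain:   {ideal_gain:.0%}")
print(f"Implied perf/clock vs Fury X: {perf_per_clock:.2f}x")  # ~0.88x, a per-clock regression
```

If those numbers hold, these early drivers are actually delivering less performance per clock than Fiji, which is exactly why the result looks so odd.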
 
  • Like
Reactions: ZGR

Peicy

Member
Feb 19, 2017
I don't know why everyone is getting so angry about Vega.

If RX Vega is between a 1070 and 1080 for $450 it will sell. (And be a good deal) Most people don't buy $700 cards.
Remember that this card seems to draw up to 300 W. That's insane for this level of performance in this day and age.
 
  • Like
Reactions: Sweepr

SpaceBeer

Senior member
Apr 2, 2016
Custom GTX 1080 Ti models consume 250-290 W, some of them ~320 W at peak. Some of the better GTX 1080 models draw ~230 W. I really don't think people who spend $1000+ on a gaming PC care about ~60 W, especially when the entire system's consumption remains at 500 W or less. And I really don't know anyone who owns a $500+ GPU with an average 500 W PSU. People who have bought a GTX 980 or GTX 1080 are using 80+ Gold 700 W+ PSUs, even if the entire system's power consumption is ~400 W. Don't make a big deal of this 300 W.

The fact that the guy managed to run all the tests without issues on a 550 W PSU tells a lot. Sure, it's not good for the PSU if it is under 90-100% load, but good 700-750 W PSUs cost $100 and will be under 60-70% load (peak). If you really expected Vega to be as efficient as Pascal, then I don't know what to say :)
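A rough load calculation, with the system draw figure assumed purely to illustrate the headroom argument:

```python
# Rough PSU headroom estimate (system draw is an assumed figure).
def psu_load(system_draw_w, psu_rating_w):
    """Fraction of the PSU's rated capacity used at a given system draw."""
    return system_draw_w / psu_rating_w

system_peak = 450  # W, assumed peak draw of a Vega-class card plus the rest of the system
for psu in (550, 750):
    print(f"{psu} W PSU -> {psu_load(system_peak, psu):.0%} load at {system_peak} W peak")
# 550 W -> ~82% load (workable but tight); 750 W -> 60% load (comfortable)
```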
 

Hi-Fi Man

Senior member
Oct 19, 2013
Custom GTX 1080 Ti models consume 250-290 W, some of them ~320 W at peak. Some of the better GTX 1080 models draw ~230 W. I really don't think people who spend $1000+ on a gaming PC care about ~60 W, especially when the entire system's consumption remains at 500 W or less. And I really don't know anyone who owns a $500+ GPU with an average 500 W PSU. People who have bought a GTX 980 or GTX 1080 are using 80+ Gold 700 W+ PSUs, even if the entire system's power consumption is ~400 W. Don't make a big deal of this 300 W.

The fact that the guy managed to run all the tests without issues on a 550 W PSU tells a lot. Sure, it's not good for the PSU if it is under 90-100% load, but good 700-750 W PSUs cost $100 and will be under 60-70% load (peak). If you really expected Vega to be as efficient as Pascal, then I don't know what to say :)

You may not care about inefficient GPUs but efficiency is a big deal and is the main reason AMD is struggling so much in the GPU space. Efficiency matters for mobile, desktop, and server. How? AMD can't make a faster PCIe card if they wanted to right now because they are right at the power limit. This also becomes problematic when it comes to cooling.

Efficiency matters, period/full stop.
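For context on "right at the power limit", here is a sketch of the nominal connector budget under the PCIe spec (slot 75 W, 6-pin 75 W, 8-pin 150 W); boards can exceed these in practice, but the spec numbers frame the ceiling:

```python
# Nominal PCIe power budget per connector configuration (spec values, watts).
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

configs = {
    "slot + 6-pin":         SLOT + SIX_PIN,              # 150 W class
    "slot + 8-pin":         SLOT + EIGHT_PIN,            # 225 W class
    "slot + 6-pin + 8-pin": SLOT + SIX_PIN + EIGHT_PIN,  # 300 W class
    "slot + 2x 8-pin":      SLOT + 2 * EIGHT_PIN,        # 375 W class
}
for name, watts in configs.items():
    print(f"{name}: {watts} W nominal ceiling")
```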
 
  • Like
Reactions: Phynaz

cytg111

Lifer
Mar 17, 2008
You may not care about inefficient GPUs but efficiency is a big deal and is the main reason AMD is struggling so much in the GPU space. Efficiency matters for mobile, desktop, and server. How? AMD can't make a faster PCIe card if they wanted to right now because they are right at the power limit. This also becomes problematic when it comes to cooling.

Efficiency matters, period/full stop.

One should probably not begin to speculate about what this means for upcoming APUs and their role in the mobile space....
 
  • Like
Reactions: Kuosimodo

SpaceBeer

Senior member
Apr 2, 2016
You may not care about inefficient GPUs but efficiency is a big deal and is the main reason AMD is struggling so much in the GPU space. Efficiency matters for mobile, desktop, and server. How? AMD can't make a faster PCIe card if they wanted to right now because they are right at the power limit. This also becomes problematic when it comes to cooling.

Efficiency matters, period/full stop.
Again, AMD can make a card with the same perf/watt as Nvidia. The Fury Nano had the same perf/watt as the GTX 980. So undervolted/downclocked Vega/Polaris can be used in notebooks, AiOs, servers, consoles, and workstations. It seems average users don't think Apple, MS, and Sony made the right choice when they decided to use AMD chips in their low(er)-power systems.
https://hothardware.com/ContentImages/Article/2556/content/power-u.png
Total system power consumption with a 28-CU P10 card (WX 5100) is less than 150 W. Very power efficient, right?

Why do AMD and the AIBs push cards to the limit? I don't know. But if I cared about power consumption, I would lower clocks/voltage and have 5-10% less performance with 30+% less power consumption. Same as what miners do.
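The reason a small clock/voltage drop buys a big power saving is that dynamic power scales roughly with frequency times voltage squared. A minimal sketch with assumed undervolting figures:

```python
# Rough dynamic-power model: P ~ f * V^2 (illustrative numbers only).
def relative_power(freq_scale, volt_scale):
    """Dynamic power relative to stock for scaled frequency and voltage."""
    return freq_scale * volt_scale ** 2

# Assume dropping clocks ~7% allows running at ~12% lower voltage.
freq_scale, volt_scale = 0.93, 0.88
p = relative_power(freq_scale, volt_scale)
print(f"~{1 - freq_scale:.0%} less performance for ~{1 - p:.0%} less dynamic power")
# -> roughly 7% less performance for ~28% less dynamic power, in line with what miners report
```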
 
  • Like
Reactions: Bacon1

Gideon

Golden Member
Nov 27, 2007
As no one has brought this up yet, I'm just throwing it out here...

Perhaps all the performance optimization features are enabled (culling, rasterizer, etc.), and it needs each and every one of them just to keep up with Fury in performance per clock?
This is my worst nightmare if true, but perhaps they just pulled a Bulldozer? In their quest to get higher clocks for NCUs versus CUs, they failed to get them anywhere near the design target, yet took all the performance hits from this new high-clock design?

[Image: slide 29 from AMD's Vega presentation]


Don't get me wrong, I would absolutely hate it if it turned out to be true, but NCUs actually being slower per clock compared to CUs (in current games, before other optimizations) would at least explain the performance we're seeing.

After all, AMD has had working silicon for at least 7-8 months. If the consumer version had big driver improvements in store, it would make no sense to hide them. What would happen, they would sell two fewer Frontier Editions to gamers?

If anything is in store for the gaming version, it will be that the top model will be watercooled and can reach the 1600 MHz speed more easily. But overall it seems they missed their planned clock targets (be it due to process or something else).
 

Veradun

Senior member
Jul 29, 2016
As no one has brought this up yet, I'm just throwing it out here...

Perhaps all the performance optimization features are enabled (culling, rasterizer, etc.), and it needs each and every one of them just to keep up with Fury in performance per clock?
This is my worst nightmare if true, but perhaps they just pulled a Bulldozer? In their quest to get higher clocks for NCUs versus CUs, they failed to get them anywhere near the design target, yet took all the performance hits from this new high-clock design?

[Image: slide 29 from AMD's Vega presentation]


Don't get me wrong, I would absolutely hate it if it turned out to be true, but NCUs actually being slower per clock compared to CUs (in current games, before other optimizations) would at least explain the performance we're seeing.

After all, AMD has had working silicon for at least 7-8 months. If the consumer version had big driver improvements in store, it would make no sense to hide them. What would happen, they would sell two fewer Frontier Editions to gamers?

If anything is in store for the gaming version, it will be that the top model will be watercooled and can reach the 1600 MHz speed more easily. But overall it seems they missed their planned clock targets (be it due to process or something else).

I'm still on #teamrespin. That would explain a lot: performance, power draw, delay, low volume.

It's obviously only a guess, one many independently made, because it makes sense. Let's wait and see :>
 
  • Like
Reactions: Gideon