AMD Vega (FE and RX) Benchmarks [Updated Aug 10 - RX Vega 64 Unboxing]

Status
Not open for further replies.

Head1985

Golden Member
Jul 8, 2014
1,864
689
136
One of my worries is coming to fruition. From Techreport:

"In an interesting disclosure, AMD says the Vega FE will have peak polygon throughput of 6.4 GTris/s, a figure that may belie the fundamental organization of the Vega GPU as four main shader clusters."

It appears that Vega still only does four triangles per clock, which is hugely disappointing given that this isn't nearly enough for a high-end card in 2017. This is what held the Fury cards back. The competing GTX 1070, 1080 and 1080 Ti can all do over 10 GTris/s, which isn't possible with the existing Radeon front end unless clock speeds go above 2500 MHz. It looks like this is a hint that the geometry shaders aren't working on general gaming workloads.
I think the GTX 1070 can only manage 3 triangles/clock with its 3 GPCs, and the GTX 2070 will probably again be 3 GPCs with 3 triangles/clock.
But yeah, 4 shader engines will probably be a bottleneck again. AMD needs 6 or, better, 8 shader engines.
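The triangle-throughput arithmetic in these posts is easy to sanity-check. A minimal sketch: the 4 tris/clock figure is from AMD's disclosure, and the 2500 MHz number follows directly from it.

```python
def peak_gtris(tris_per_clock, clock_mhz):
    """Peak geometry throughput in GTris/s = triangles per clock * core clock."""
    return tris_per_clock * clock_mhz / 1000.0

# Vega FE: 4 tris/clock at 1600 MHz matches AMD's quoted 6.4 GTris/s.
vega_fe = peak_gtris(4, 1600)

# Clock a 4-tris/clock front end would need to reach 10 GTris/s.
clock_for_10 = 10 * 1000.0 / 4  # 2500 MHz
```

Which is why, at realistic clocks, a four-engine front end cannot reach the 10+ GTris/s the Pascal cards quote.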
 

beginner99

Diamond Member
Jun 2, 2009
5,211
1,581
136
I wonder why you did not bring up what Ryan Smith from this very site wrote on the Beyond3D forum. You post there as well.
This is what he wrote: https://forum.beyond3d.com/posts/1989164/

https://forum.beyond3d.com/posts/1989183

Ryan Smith said:
Oh it's totally valid to benchmark it. I just have to tread a little more carefully; as AnandTech people often take what we say as the final word, and while I'm proud of the trust readers have in us, it means I need to be careful not to write a preview article and have 10 news aggregators posting "AnandTech proves that Vega sucks!" in the morning.

Meaning he did some internal tests and Vega does in fact suck at gaming with current drivers.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
[Attached benchmark screenshots]

Not mine..but we should see some numbers soon.
 

OatisCampbell

Senior member
Jun 26, 2013
302
83
101
Meaning he did some internal tests and Vega does in fact suck at gaming with current drivers.

Not necessarily.

His point that a lot of people will draw that conclusion if Vega isn't beating the 1080 Ti with this prosumer card is probably valid.

If AMD can release a consumer card for $499 that competes with the non-Ti 1080, and beats it in some games, that wouldn't "suck". It would actually be pretty great, as most people don't have 4K monitors and that level of performance would serve everyone else well.
 

Mopetar

Diamond Member
Jan 31, 2011
7,911
6,178
136
HBM2 from Micron and not Samsung/Hynix? That's interesting.

Could be GPU-Z misreporting the information. This is a brand new card, so I don't expect it to have full information. It also reports 1600 MHz as the default clock, when I thought AMD's information listed 1600 MHz as the peak clock, with it potentially running below that if it starts to throttle for whatever reason.
 

Oddzz

Junior Member
Mar 15, 2017
21
16
41
At least you should have quoted what the tester said before people freak out about the results:
Sorry it took time, but I never used Wattman for those cards.

FireStrike test in game mode (pro mode does not make sense), frequency on GPU all over the place, I am trying to hard-set it to 1600 for the next run; as of now the score is not impressive. But could be my fault.
....

guys, let me try to set it correctly as I never used Wattman and this card is fresh in. I am also running just a 550W PSU, which could be a problem as well. I keep trying :)
 

xpea

Senior member
Feb 14, 2014
430
135
116
So it barely beats a GTX 1080 at Firestrike. No wonder they did not sample any to reviewers.
No surprise. And please spare us the "drivers not ready" nonsense. We know the thing has been working since last December...
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
If Vega 10 is barely faster than the GTX 1080, then 2018 will turn out to be very ugly for AMD against Volta, which is likely to bring another 40-50% performance across the stack for Nvidia. If Vega cannot match the GTX 1080 Ti, then a post-GCN architecture is required for AMD to get back into the high-end GPU market. It's looking like Vega might turn out to be AMD's worst dud since Bulldozer.
 

Glo.

Diamond Member
Apr 25, 2015
5,726
4,604
136
Let me remind everyone, what Ryan Smith from Anandtech has written on Beyond3D forum.

Ryan Smith said:
Oh it's totally valid to benchmark it. I just have to tread a little more carefully; as AnandTech people often take what we say as the final word, and while I'm proud of the trust readers have in us, it means I need to be careful not to write a preview article and have 10 news aggregators posting "AnandTech proves that Vega sucks!" in the morning.:eek: Most readers get that a preview is a preview, but not everyone does.

(Also, I'm on vacation next week anyhow :p )
I suggest being patient and not jumping to conclusions about the architecture's performance yet.
 

tg2708

Senior member
May 23, 2013
687
20
81
Doesn't Firestrike factor the CPU score into the total overall score, though? So I think that's a good graphics score. I remain optimistic that the strictly gaming variant will be faster with the same drivers. Let's go AMD, "pull a rabbit out of a hat" so to speak.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
If Vega 10 is barely faster than the GTX 1080, then 2018 will turn out to be very ugly for AMD against Volta, which is likely to bring another 40-50% performance across the stack for Nvidia. If Vega cannot match the GTX 1080 Ti, then a post-GCN architecture is required for AMD to get back into the high-end GPU market. It's looking like Vega might turn out to be AMD's worst dud since Bulldozer.

It blows my mind that people's expectations were as high as 1080 Ti plus to begin with. Polaris didn't come close to AMD's touted perf/W improvements, Polaris 11 is 30% slower than GP107 at the same power draw, and even the updated Polaris 10 draws 85% more power, has a 15% larger die, and needs 25% more bandwidth vs. GP106 for only a 3-5% (average) advantage.
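Taking the Polaris 10 vs. GP106 figures in that post at face value, the efficiency gap works out roughly like this (illustrative arithmetic only, everything normalised to GP106 = 1.0):

```python
# Relative figures from the post, normalised so GP106 = 1.0 in each metric.
polaris_perf = 1.04    # ~3-5% average performance advantage
polaris_power = 1.85   # ~85% more power draw
polaris_die = 1.15     # ~15% larger die
polaris_bw = 1.25      # ~25% more memory bandwidth

perf_per_watt = polaris_perf / polaris_power   # ~0.56x GP106's efficiency
perf_per_mm2 = polaris_perf / polaris_die      # ~0.90x GP106's area efficiency
```

In other words, if those numbers hold, Polaris 10 delivers a bit over half of GP106's performance per watt, which is why expectations of 1080 Ti-class efficiency from Vega looked optimistic.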
 
  • Like
Reactions: Sweepr

tamz_msc

Diamond Member
Jan 5, 2017
3,825
3,654
136
I think this so-called "game mode" is nothing but a toggle to allow the usual 3D benchmarking software and games to just work in a serviceable way. It is basically a modified Pro driver.

Note that my statement doesn't imply that somehow the 'gaming' Vega would offer radically different performance.
 

beginner99

Diamond Member
Jun 2, 2009
5,211
1,581
136
I suggest being patient and not jumping to conclusions about the architecture's performance yet.

So you say they can make it magically 30% faster within one month purely from driver optimizations? Why didn't they do that during the last 3+ months if it were that easy?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
If you are actually following this guy's Disqus thread, he has stated that the core frequencies are "all over the place" and he can't keep it at 1600 MHz. That suggests a problem with the FS score he posted, no matter the state of the drivers.

Or the problem is that Vega FE can't maintain its max boost clock with a blower cooler, and we have, yet again, disingenuous advertising by AMD on clock speeds.
 

Head1985

Golden Member
Jul 8, 2014
1,864
689
136
It's just TDP limited; that's why Vega can't hold 1600 MHz.
The Firestrike GPU score is just bad. Very bad.
An overclocked GTX 1070 can manage a 21000 GPU score.

This is worse than the R600/2900 XT or the GeForce FX. It probably also eats 300 W for around GTX 1080 performance.
 

Mopetar

Diamond Member
Jan 31, 2011
7,911
6,178
136
No surprise. And please spare us the "drivers not ready" nonsense. We know the thing has been working since last December...

If the drivers were ready, AMD would be releasing RX Vega as well. Even if you want to claim that HBM2 supply is the limiting factor, RX Vega will only ship with 2 or 4 GB stacks, which I would imagine are much easier to produce than the full 8 GB stacks being used on the FE card. Based on rumors and reports surrounding Ryzen, the yields should be good enough to get enough working silicon, and I expect RX Vega will offer 48 and 56 CU versions as well to salvage some of the defective parts, so even if yields aren't as great as anticipated, there should be a lot of those binned chips too.

When they first showed off Vega, they said it was running on older Fury drivers. Just because you've got enough of a driver that it can function without crashing doesn't mean that it's optimal. Under the hood it's still GCN. Personally I don't think another month will make a lot of difference, and Vega drivers are going to take much longer to get the most out of the card. I suppose you can look at it as meaning the cards will probably age better, but I'm not going to pay for potential future performance, so AMD will need to sell it on whatever performance it has when it ships.
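The salvage-binning argument above can be sketched with a simple Poisson yield model. All inputs here are hypothetical, illustrative numbers, not AMD data:

```python
import math

def zero_defect_yield(d0, area_cm2):
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-d0 * area_cm2)

# Hypothetical inputs: a large ~486 mm^2 die, 0.2 defects/cm^2.
area = 4.86          # die area in cm^2 (assumed)
d0 = 0.2             # defect density per cm^2 (assumed)
lam = d0 * area      # expected defects per die

full_dies = zero_defect_yield(d0, area)   # sellable as fully enabled parts
one_defect = lam * math.exp(-lam)         # dies with exactly one defect:
                                          # if the defect lands in a CU, the CU
                                          # can be fused off for a salvage SKU
```

With these assumed numbers, nearly as many dies have exactly one defect as have none, which is why cut-down 56/48-CU bins can meaningfully boost the number of sellable chips.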
 
  • Like
Reactions: Phynaz

Glo.

Diamond Member
Apr 25, 2015
5,726
4,604
136
I think this so-called "game mode" is nothing but a toggle to allow the usual 3D benchmarking software and games to just work in a serviceable way. It is basically a modified Pro driver.

Note that my statement doesn't imply that somehow the 'gaming' Vega would offer radically different performance.
I suggest reading up on game mode. It is designed for testing a game's performance during development, not for gaming on this GPU.
So you say they can make it magically 30% faster within one month purely from driver optimizations? Why didn't they do that during the last 3+ months if it were that easy?
Because the GPU architecture is so different. Secondly, it depends on whether the application uses the features that increase the architecture's performance: Tile-Based Rasterization, the Memory Paging System, Primitive Shaders.

Driver optimization can help with load balancing, but applications have to be rewritten to use the architecture's features, especially Primitive Shaders.
Or the problem is that Vega FE can't maintain its max boost clock with a blower cooler, and we have, yet again, disingenuous advertising by AMD on clock speeds.
Or that the guy has a 550W PSU, which he mentioned, and that is the reason the GPU is not able to maintain its clocks?
 
  • Like
Reactions: DarthKyrie