AMD Vega (FE and RX) Benchmarks [Updated Aug 10 - RX Vega 64 Unboxing]


IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
The Vega demo on Star Wars Battlefront from 6 or so months ago was done using Fury drivers put into a debug mode. That was straight from AMD themselves.
I understand that, but what implications does that have for its performance? 5%, 10%, 50% less than it should? What specifically will it gain from new drivers?

Plus, as far as I know, there's a launch driver for Vega that all the tests so far have used. Is this still the modified Fury driver?
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
Obviously none, which is why I'm sure you are still running (pre)launch drivers for your GPU, right? :rolleyes:

They are using game ready drivers from January.
Personally, I've never noticed a dramatic performance difference from new drivers; at best it's a 10-15% gain in certain situations over the course of months. You're acting as though there's some kind of massive driver coming that will turn Vega into a GP102 competitor. I want to know why you believe that.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Personally, I've never noticed a dramatic performance difference from new drivers; at best it's a 10-15% gain in certain situations over the course of months. You're acting as though there's some kind of massive driver coming that will turn Vega into a GP102 competitor. I want to know why you believe that.

I said it's missing out on a lot of performance. I'd consider even 10-15% to be a fairly big gain, wouldn't you?

https://www.computerbase.de/thema/grafikkarte/rangliste

I mean, the difference between a 1070 and a 1080 is ~20%, and between a 1080 and a 1080 Ti it's ~20% as well.

The testing we've seen is all over the place, with no "control". You can't compare two different test systems, tested in different places, and assume the numbers are comparable; you have to control your tests. We're also seeing them use drivers that are very out of date, and we've known for a while that they would be, ever since Raja said in an AMA that the drivers were taking a while and that RX would be faster and cheaper than FE.
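
Just to make the tier-gap math explicit, here's a tiny sketch with made-up FPS numbers (purely illustrative, not measured data):

Code:
/* The tier-gap math with made-up FPS numbers, purely for illustration
 * (not measured data). */
#include <stdio.h>

/* Percentage gain of card b over card a. */
static double gain_pct(double a, double b) {
    return (b - a) / a * 100.0;
}

int main(void) {
    double gtx1070   = 60.0; /* hypothetical avg FPS */
    double gtx1080   = 72.0; /* ~20% above the 1070 */
    double gtx1080ti = 86.4; /* ~20% above the 1080 */

    printf("1070 -> 1080:    +%.1f%%\n", gain_pct(gtx1070, gtx1080));
    printf("1080 -> 1080 Ti: +%.1f%%\n", gain_pct(gtx1080, gtx1080ti));
    /* A 10-15% driver uplift is therefore over half a product tier. */
    return 0;
}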
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Obviously none, which is why I'm sure you are still running (pre)launch drivers for your GPU, right? :rolleyes:

They are using game ready drivers from January.
The GPU MAY be using drivers from January; that's not certain.

Nothing else makes sense in terms of performance when you look at the architecture and the effect of its features on performance.
Could you link to this please? I'd really like to know how they are doing it without emulation.
They did a similar thing with previous versions of GCN when the drivers were not ready at launch for particular games.

Hence the occasionally sub-par performance on some GPUs (Fiji, for example), which gets solved with the next driver release.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
10-15% would be nice, but even that seems doubtful to me.

Unless RX Vega is going to use a different chip from FE (which it isn't), I doubt the performance differences will be much to write home about, especially since the FE is already clocked to the hilt.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
They did a similar thing with previous versions of GCN when the drivers were not ready at launch for particular games.

Hence the occasionally sub-par performance on some GPUs (Fiji, for example), which gets solved with the next driver release.

That was GCN, though; all versions of GCN use the same base instruction set, registers, etc. If Vega is truly new, then I don't see how that would work. Perhaps Vega isn't all that new and is just a revision of GCN, then?
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
The Vega demo on Star Wars Battlefront from 6 or so months ago was done using Fury drivers put into a debug mode. That was straight from AMD themselves.

Do you know how that works if Vega is truly a new architecture? It would be like trying to run an Nvidia driver on an AMD card; everything would be different.

Perhaps whoever said that misspoke, or was drawing an analogy, or maybe the Vega driver is a branch of the Fury driver.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
That was GCN, though; all versions of GCN use the same base instruction set, registers, etc. If Vega is truly new, then I don't see how that would work. Perhaps Vega isn't all that new and is just a revision of GCN, then?
It's still the GCN architecture at its core, but it contains some of the biggest changes and additions of all the revisions.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I said it's missing out on a lot of performance. I'd consider even 10-15% to be a fairly big gain, wouldn't you?

Did you buy your Furys at launch? How much did, let's say, a month or two of newer drivers impact Fury performance?
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
You have to make a distinction between graphics capabilities and compute capabilities, @Phynaz.

On compute, previous versions of GCN were the same, differing only marginally (the same is true for Vega). The graphics pipeline was also the same across all previous versions of the GCN architecture: the pixel engine was tied to the memory controller.

Vega has completely different graphics capabilities and a different memory controller. That alone requires reworking the drivers to extract the GPU's performance.

It can run like Fiji; that is not a problem. But then you lose all of the differences that would affect performance, which may explain why, in the current state of the gaming drivers, the GPU behaves like a heavily overclocked Fiji.

As a side note:
It may also just be a bug somewhere, but it appears that, in the current state, the software reports only 16 ROPs for the Vega architecture.
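
For context on how a "16" could even show up: ROP counts are usually derived rather than read directly; a tool multiplies the render-backend count the driver reports by the pixels each backend outputs per clock. A minimal sketch of that math (struct and field names are mine, not any real API):

Code:
/* Sketch of how a tool might derive a ROP count from what the driver
 * reports. Struct and field names are hypothetical, not a real API. */
#include <stdint.h>
#include <stdio.h>

struct gpu_info {
    uint32_t num_render_backends; /* RBs, as reported by the driver */
    uint32_t pixels_per_rb;       /* pixels each RB outputs per clock */
};

int main(void) {
    /* Vega 10 should come out as 16 RBs x 4 pixels = 64 ROPs. If the
     * driver (or the tool's query) returns the RB count where the tool
     * expects ROPs, you would see "16" instead of 64. */
    struct gpu_info vega10 = { .num_render_backends = 16, .pixels_per_rb = 4 };
    printf("ROPs: %u\n", vega10.num_render_backends * vega10.pixels_per_rb);
    return 0;
}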
 
Last edited:
  • Like
Reactions: kawi6rr and Bacon1

french toast

Senior member
Feb 22, 2017
988
825
136
If it is not drivers, then what is it? It's as if Nintendo circa 2004-2015 designed the GPU: Moore's law in reverse.
AMD can be stupid, very stupid at times, but that's usually marketing or bad, rushed decisions made under financial pressure; the engineering is usually top notch. This would be new territory for them.

This must be an old, cobbled-together gaming driver from soon after they got first silicon back from the lab. I would expect the real driver to bring a good 10% or so. Is that enough? In my opinion, no, and even that is wishful thinking, but it would certainly help.

I just hope Raja has something up his sleeve here.
 

iBoMbY

Member
Nov 23, 2016
175
103
86
Sounds like wishful thinking to me. What performance improvements is it missing, and why do you believe that?

It's possible, for example, that the Draw-Stream Binning Rasterizer isn't active if there are still problems with it. That could explain a lot. The rasterizer mode can be set at the driver level, as can be seen in one of the Linux patches.
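
Roughly this kind of thing, going from memory of the public GFX9 register headers (the names and encodings below may not be exact, so treat it as illustrative only):

Code:
/* Sketch of a driver-level rasterizer-mode toggle, loosely modeled on
 * the public GFX9 (Vega) register headers. The register/field names
 * and encodings here are from memory and may be wrong -- illustrative
 * only, not the actual amdgpu code. */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical encodings for the binning-mode field. */
enum binning_mode {
    BINNING_ALLOWED               = 0, /* DSBR active */
    DISABLE_BINNING_USE_LEGACY_SC = 2, /* legacy (Fiji-style) raster path */
};

/* Pack the mode into a (hypothetical) PA_SC_BINNER_CNTL_0 value. */
static uint32_t pa_sc_binner_cntl_0(enum binning_mode mode) {
    return (uint32_t)mode; /* real field offset/width may differ */
}

int main(void) {
    /* A driver still debugging the DSBR could simply ship with the
     * legacy path selected, which would behave exactly like Fiji. */
    uint32_t reg = pa_sc_binner_cntl_0(DISABLE_BINNING_USE_LEGACY_SC);
    printf("PA_SC_BINNER_CNTL_0 = 0x%08x\n", reg);
    return 0;
}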
 
  • Like
Reactions: Bacon1 and Yakk

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
That was GCN, though; all versions of GCN use the same base instruction set, registers, etc. If Vega is truly new, then I don't see how that would work. Perhaps Vega isn't all that new and is just a revision of GCN, then?

I'd view NCU as the x86-64 of AMD's GCN architecture. There are more memory features and some extra compute/render features, but it'll still run the old code on the traditional older architecture... it just has some new stuff bolted on top.
 
May 11, 2008
22,351
1,438
126
This is just a guess, since I am not up to date with all the relevant information.
However, it would be funny if the driver checked the date and refused to enable features until the NDA date had passed.
It would be a great way to thwart people who ignore the NDA.
And it would create a wrong impression of Vega's capabilities and performance as a deflection, which would be a great marketing stunt if they pulled off something similar to the Ryzen release (40% >> 52%).
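
Purely hypothetically, a gate like that would be trivial to write (nothing suggests AMD actually does this):

Code:
/* Purely hypothetical sketch of an NDA date gate in a driver --
 * nothing suggests AMD actually ships anything like this. */
#include <stdbool.h>
#include <stdio.h>
#include <time.h>

static bool nda_has_passed(void) {
    struct tm nda = {0};
    nda.tm_year = 2017 - 1900;  /* hypothetical RX Vega NDA date */
    nda.tm_mon  = 7 - 1;        /* July */
    nda.tm_mday = 31;
    nda.tm_isdst = -1;          /* let mktime figure out DST */
    return difftime(time(NULL), mktime(&nda)) > 0;
}

int main(void) {
    /* Features stay off until the (made-up) NDA date. */
    bool enable_dsbr = nda_has_passed();
    bool enable_primitive_shaders = nda_has_passed();
    printf("DSBR: %s, primitive shaders: %s\n",
           enable_dsbr ? "on" : "off",
           enable_primitive_shaders ? "on" : "off");
    return 0;
}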
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
If it is not drivers, then what is it? It's as if Nintendo circa 2004-2015 designed the GPU: Moore's law in reverse.
AMD can be stupid, very stupid at times, but that's usually marketing or bad, rushed decisions made under financial pressure; the engineering is usually top notch. This would be new territory for them.

This must be an old, cobbled-together gaming driver from soon after they got first silicon back from the lab. I would expect the real driver to bring a good 10% or so. Is that enough? In my opinion, no, and even that is wishful thinking, but it would certainly help.

I just hope Raja has something up his sleeve here.
Another possibility is that Primitive Shaders will increase the GPU's geometry throughput, but only when they are applied.

Raja said in a Q&A that developers do not need to do anything special to use them, but game devs are saying that, regardless, applications would have to be rewritten to utilize this feature and increase this GPU's performance.

It may be that without primitive shaders, geometry throughput is the same as Fiji's, but with the load-balancing issues solved, etc.

I'm only theorizing about the possible reasons why the GPU is underperforming.

Edit: PCPer got their Frontier Edition: https://www.pcper.com/live/
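
Back-of-the-envelope, assuming the commonly cited 4 triangles/clock peak for the legacy geometry path on both chips and rough clocks:

Code:
/* Back-of-the-envelope geometry throughput for the legacy path, using
 * the commonly cited 4 triangles/clock peak for both chips. Clocks are
 * rough, and these are peak rates, not measured throughput. */
#include <stdio.h>

int main(void) {
    const double tris_per_clock = 4.0; /* legacy geometry path, both chips */
    const double fiji_mhz = 1050.0;    /* Fury X reference clock */
    const double vega_mhz = 1600.0;    /* Vega FE boost, roughly */

    printf("Fiji legacy path: %.1f Gtris/s\n", tris_per_clock * fiji_mhz / 1000.0);
    printf("Vega legacy path: %.1f Gtris/s\n", tris_per_clock * vega_mhz / 1000.0);
    /* Without primitive shaders, Vega's edge over Fiji is just clock
     * speed -- consistent with the "overclocked Fiji" behaviour. */
    return 0;
}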
 
  • Like
Reactions: Bacon1

french toast

Senior member
Feb 22, 2017
988
825
136
Another possibility is that Primitive Shaders will increase the GPU's geometry throughput, but only when they are applied.

Raja said in a Q&A that developers do not need to do anything special to use them, but game devs are saying that, regardless, applications would have to be rewritten to utilize this feature and increase this GPU's performance.

It may be that without primitive shaders, geometry throughput is the same as Fiji's, but with the load-balancing issues solved, etc.

I'm only theorizing about the possible reasons why the GPU is underperforming.

Edit: PCPer got their Frontier Edition: https://www.pcper.com/live/
I think they should have increased the shader engines and ROPs, and perhaps gone with GDDR5X, which I'm sure would have helped get the card out sooner; but then again, I'm no graphics engineer :)
I'm not sure where this leaves GCN, to be honest.
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
It seems that, in its current state, this card is indeed a turd for gaming, if we're being completely objective. The only thing more accurate than that statement is that AMD's launches are even more turd-like than this card's gaming performance.

If I were a betting man, I'd say they only released this card to meet the 1H 2017 promise made at investor day. At the beginning of this year AMD was talking up Vega, with Prey as the first title optimized for the architecture; six months later, they still don't have a driver that enables the major performance-improving features (yes, AMD released a driver two days ago, but it could still carry flags that disable the tile-based raster engine, primitive shaders, and the geometry pipeline improvements). All signs point to a lack of manpower on the software side. Clocks are up, just as AMD said they would be, and they also claimed improved IPC over Polaris and Fiji, but that hasn't been demonstrated yet. If you've been following this launch, Raja has said at least three different times that his software team was having fits rewriting things for Vega, and from this card running like a 1600 MHz Fury it's pretty apparent that none of the uarch improvements are working right now.

Which software-side fix is going to bring out Vega's true performance? Is this a driver-side problem, or are the changes so drastic that games will need a Vega patch to take advantage of the new architectural improvements, à la Ashes on Ryzen?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That's why AMD was very stupid not to give the card to professional websites first. We knew that some random dude would get the card and publish a hurried number; it will be the best day of his life!
At least with reputable websites, you can damage-control the situation with a reviewer guide and direct support from AMD.
But no, all we will remember from the Vega launch is a single Firestrike number, with the catastrophic conclusion in the average Joe's mind: "Meh... Vega is barely matching a 1080."
But why, AMD? :(:confused::rolleyes:
Actually, almost nobody will ever be aware of some random nobody posting numbers. Those who are, if they're the people this card is designed for, won't pay it much credence and will wait for pro results.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
The RX Vega gaming cards are still not released, and information is sparse.
AMD has already told us that there will be more information and the release of RX Vega at SIGGRAPH, at the end of July. I assume reviewers will be getting RX Vega samples shortly before or at SIGGRAPH.
 