AMD Radeon RX Vega 64 and 56 Reviews [*UPDATED* Aug 28]


raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
According to ComputerBase.de, Primitive Shaders and the High Bandwidth Cache Controller are inactive ("inaktiv") in the drivers.
The High Bandwidth Cache Controller can still be switched on manually in the drivers.

Quotes, translated, about Primitive Shaders: The problem: this function is also currently disabled in the driver. Vega currently only uses the traditional pipeline. When does this change? AMD has not given a date.

About the High Bandwidth Cache: The problem: the HBCC is currently still switched off by default, because the experience gathered so far is not sufficient to enable it across the board. However, this can be changed in the Radeon Settings menu. There, on the Radeon RX Vega, you can configure how much memory the High Bandwidth Cache Controller is to use as extended GPU memory - a maximum of 64 gigabytes is possible.
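For the curious, the idea behind the HBCC is essentially hardware-managed paging: the local HBM2 acts as a cache over a much larger virtual address space, with pages pulled in from system memory on demand. A toy Python sketch of that idea (an analogy only, not AMD's actual implementation):

```python
# Toy demand-paging cache: an analogy for HBCC, not AMD's implementation.
from collections import OrderedDict

class ToyHBCC:
    def __init__(self, vram_pages):
        self.vram = OrderedDict()            # page id -> page data
        self.capacity = vram_pages           # how many pages fit in HBM2

    def access(self, page, fetch_from_system_ram):
        if page in self.vram:                # hit in local HBM2
            self.vram.move_to_end(page)
            return self.vram[page]
        data = fetch_from_system_ram(page)   # miss: page in over the bus
        self.vram[page] = data
        if len(self.vram) > self.capacity:   # evict least-recently-used page
            self.vram.popitem(last=False)
        return data
```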

So in the end: it is not a fully functional GPU yet, with very important features still disabled. Great release by AMD.

This makes it the worst GPU launch ever. If AMD cannot develop proper driver software for probably their most important GPU launch in five years and cannot even utilize the hardware features they designed, what is the point? One thing is obvious: RTG needs better execution. Nvidia will get to >80% market share once Volta launches.
 

french toast

Senior member
Feb 22, 2017
988
825
136
Come to think of it, we are missing one notorious member who relishes times such as these, *********; it's unlike him to miss such a succulent opportunity as this.

No call outs.

AT Moderator ElFenix
 
Last edited by a moderator:

SpaceBeer

Senior member
Apr 2, 2016
307
100
116
I would like to know what the hell is wrong with it. Something has to be wrong, some fault or major bottleneck.
I think this is what went wrong:

[Attached image: 2017_08_14_18_27_24_Advanced_Micro_Devices_Resea.png]
 

FatherMurphy

Senior member
Mar 27, 2014
229
18
81
Goodness gracious. From the perspective of a high-end gamer, this launch is disappointing for the future's prospects. Vega is going to act like an anchor around the neck of the entire high-end gaming segment for the next year to year and a half.

There is nowhere for AMD to go up from Vega's gaming performance outside of driver improvements. Sure, unlike Fiji, AMD could scale up Vega with a bigger chip--it isn't reticle limited. And, sure, AMD and its partners can strap on fancy coolers to keep down temps and noise. But AMD has already blown its power budget with this chip, and it is only 486 mm². That gets you mid-2016 performance.

Vega does not bring performance or price competition to the high-end. Nvidia need not change the MSRP of the 1080 or the 1080ti. The next time you see price drops on the 1080 or 1080ti, you will know that Nvidia and its partners are trying to clear the channel for Volta. When Volta hits with additional performance, Nvidia will not have any reason to maintain the same prices. Instead, we'll see price tiers for the underlying GPUs shift up again on Nvidia's side. The 2080 will return to $600 - $700. The 2080ti needn't even exist outside a $1200 Titan Volta.

When can AMD get back in the high-end game? Late 2018 to 2019. They need a new process or a new architecture because of the power issues. The new process won't come until 2019 at the earliest, right? And the Navi "architecture," from what I have read, is probably a Vega-like chip scaled down in physical size (chiplet?) but chained together with Infinity Fabric to scale up performance. That's a late 2018 or 2019 product.

That's assuming that AMD even attempts to compete in the ultra-high end segment down the line. Fiji at least took aim at the highest end of the market and did reasonably well, although its pricing did it no favors. Two years later, Vega is a junior varsity player trying to compete on the varsity team.

Look for Nvidia's prices and margins to go up over the next year, ladies and gentlemen.
 

Bouowmx

Golden Member
Nov 13, 2016
1,150
553
146
For the future, AMD has Vega on 14 nm+, a process optimization. Whether anybody would be interested in it, idk.

Afterwards comes Navi on 7 nm. Predictions on whether it will use the same 4096 cores :eek: ayy
 

french toast

Senior member
Feb 22, 2017
988
825
136
FatherMurphy said: Goodness gracious. From the perspective of a high-end gamer, this launch is disappointing for the future's prospects. [...] Look for Nvidia's prices and margins to go up over the next year, ladies and gentlemen.
Navi is on the 7 nm HP process, with scalability and next-gen memory.
If we see a Vega-like improvement, expect 20% more performance and 30% more power :p
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
This makes it the worst GPU launch ever. If AMD cannot develop proper driver software for probably their most important GPU launch in five years and cannot even utilize the hardware features they designed, what is the point?

My guess is that AMD figured that by the time they had all the features of the GPU fully functional in the drivers, Volta would most likely have been released. And if the rumours about that chip are accurate, chances are the performance gap between Pascal and a crippled Vega is smaller than the one between Volta and a properly working Vega.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
So sad that out-of-the-box performance-per-watt is worse than the 980 Ti's. If AMD had accepted that Vega was not a proper GTX 1080 competitor, it would actually almost look like a decent architecture.

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/32.html
https://tpucdn.com/reviews/AMD/Radeon_RX_Vega_64/images/power_average.png

Look at the PwrSave mode in the above links. This shows performance-per-watt improvements over Polaris, and it manages to fall between the 980 Ti and 1080 Ti in terms of efficiency. And these Power Save modes are only 5-10% behind the GTX 1080 in performance:

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/31.html

If clocked properly, Vega looks like it's only half a generation behind Nvidia. But then look at those Turbo modes and wow, it falls behind Maxwell in efficiency, becomes a space heater pumping out 300 W+, and ekes out single-digit performance gains that just manage to tie the 1080. And now all of a sudden it looks a generation behind Nvidia in efficiency. Ugh. I wonder how a max-OC 980 Ti compares in wattage and performance. But hey, now it "trades blows" with the 1080 so they can charge more.

This card would have looked a lot better if it were Power Save (second BIOS) by default and $50 less than the GTX 1080. Slightly worse performance (only 6% less at 4K on TPU), only moderately more power consumption (34 W more gaming average on TPU), and as a result it's a little cheaper. But they need to sell these for as high as possible, so raw performance it is.
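Back-of-envelope on those TPU deltas (the 1080's absolute wattage below is an assumed round number; only the -6% / +34 W deltas come from the review):

```python
# Relative perf/W of Vega 64 Power Save vs. GTX 1080, using the deltas above.
gtx1080_perf, gtx1080_power = 100.0, 180.0   # assumed baseline (perf index, W)
vega_perf = gtx1080_perf * (1 - 0.06)        # 6% slower at 4K
vega_power = gtx1080_power + 34              # 34 W more gaming average

eff_1080 = gtx1080_perf / gtx1080_power      # ~0.56 perf/W
eff_vega = vega_perf / vega_power            # ~0.44 perf/W
print(f"Vega 64 Power Save: {eff_vega / eff_1080:.0%} of 1080 efficiency")
```

Even in its best mode, under those assumptions it lands around 80% of the 1080's efficiency; respectable, not disastrous.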

Anyway, even doing that wouldn't erase the fact that AMD continues to shamelessly regress in performance-per-TFLOP to an extent that makes me want to throw up. But I would have liked some lipstick on the pig. As it stands at current prices, this card cannot be recommended. An inferior card needs an inferior price.
 

leoneazzurro

Golden Member
Jul 26, 2016
1,114
1,867
136
I really hoped, for the sake of competition (i.e. lower prices for consumers), that AMD could have improved on Vega FE; unfortunately it seems it has not by much.
DX12 performance is not bad at all, and DX11 lags a bit, but all in all it is on the 1080 (non-Ti) level, as the AT review shows. But man, the power consumption is really too high. And it makes me wonder how it is possible to increase geometry throughput, cache sizes, and clocks by 50%+ and not get a bigger improvement over Fiji. Someone at AMD should tell me where the bottleneck is (memory bandwidth should also have less of an impact on Vega due to the new features). Anyway, anyone who does not care about power draw and can use a FreeSync monitor can buy it; with time it will also improve performance a little, being a new architecture, and more modern game engines could use the new features.
Let's only hope that after this AMD can pull out an RV770 again.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Have any of these reviews tried downclocking the memory? I am curious how Vega scales with memory bandwidth, and downclocking the HBM might give some insight into whether/how Vega is bandwidth limited.

Gamers Nexus has a nice review with Vega 56 HBM overclocking. They say adding +50% to the power limit increases performance by 12% or so. Overclocking the HBM2 by a whopping ~19% from 800 MHz to 950 MHz nets a mere 3.6% increase on top of that.

That suggests it's not memory bandwidth limited.
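Quick check of that conclusion with the Gamers Nexus numbers above: if the card were purely bandwidth-bound, FPS should track the HBM2 clock roughly 1:1.

```python
# Scaling sensitivity: measured perf gain per unit of added bandwidth.
hbm_gain = 950 / 800 - 1     # 0.1875, i.e. ~19% more memory bandwidth
perf_gain = 0.036            # the ~3.6% gain Gamers Nexus measured
print(f"bandwidth sensitivity: {perf_gain / hbm_gain:.2f}")  # ~0.19 of 1.0
```

A sensitivity of ~0.19 means only a small fraction of the extra bandwidth turns into frames, so the bottleneck is likely elsewhere.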
Polaris/Vega/Navi are all named after stars. Stars burn bright so it's just a play on the naming.

It's not just about that. They make themselves sound biased towards AMD by saying that. Read their conclusions though: "In short, the Vega launch has the potential to be AMD’s brightest day."

Please tell me, dear writers at AnandTech: how did the performance and power usage numbers correlate with the title? It may be an exaggeration to say it's their darkest day, but it's certainly not their brightest. Someone was saying somewhere that reviewers are scared that they won't receive samples next time, so they are being careful. But this isn't really being careful; it's bias.
 

guachi

Senior member
Nov 16, 2010
761
415
136
Is Vega really enough perf for you at 4k?

I rarely game. My 480 can handle most of the games I play at 4K. It can't do TW: Warhammer or the latest X-Com without reducing settings, but all my other games play fine because they are old.

So, yes, Vega 64 is enough with FreeSync. And there's no reason anyone should game without FreeSync or G-Sync. I don't game so much that I even need Vega, though (which is why the power numbers don't bother me). But if I can get a non-reference card for $550 or something I'll probably buy one and sell my 480 for whatever I can get.

If nVidia wants to gouge me on GSync my answer to them is "no, thanks".
 

zinfamous

No Lifer
Jul 12, 2006
111,992
31,551
146
How does 56 look good consuming so much more power than 1070?

The power delta between 56 and 1070 isn't as great as that between 64 and 1080, especially when the actual performance comparisons come into play. ...at least looking at the few charts that I noticed. There is clearly something going on when Vega scales up from the 56 to the 64, with a rather meager performance increase but at a drastic power cost.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Yeah, this isn't quite what I hoped for, but I took the plunge anyway because I'm a sucker for sexy reference AIOs and I have a FreeSync monitor.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
Any review with undervolting?
One of the first reviews I posted this morning on the first page - from Gamers Nexus using the Vega 56. Pretty good results!

I think that's going to be key for Vega. AIB partners with superior cooling setups (2 or 3 fans, water blocks, etc.) combined with voltage tweaks and ever-maturing drivers will help Vega stay cool and quiet while cranking up the clocks. Already there appear to be some real-world gains of 5-15% over Vega FE, which is quite surprising.

If (big IF) any games get Vega-optimized patches/updates, then who knows. Still not going to close that 30% performance gap to the 1080 Ti. Then there's Volta... :-/
 
Last edited:

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
The power delta between 56 and 1070 isn't as great as that between 64 and 1080, especially when the actual performance comparisons come into play. ...at least looking at the few charts that I noticed. There is clearly something going on when Vega scales up from the 56 to the 64, with a rather meager performance increase but at a drastic power cost.

It's very obvious that Vega 64 is clocked beyond the optimal point on the voltage/frequency curve for the GF 14LPP process. AMD's problem is that Nvidia's Pascal designs on TSMC 16FF+ are effortlessly hitting 1.6-1.7 GHz at stock and 2 GHz overclocked. Pascal is a very efficient and scalable architecture with a highly optimized physical design on the best foundry FinFET process with the best yields (within the foundry industry). That is an unbeatable combination. AMD loses on each and every one of those factors, and the net result is a massive lead for Nvidia in perf, perf/watt, and perf/mm².
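To see why overshooting the voltage/frequency curve costs so much: dynamic power scales roughly as C·V²·f, and the extra clocks demand extra voltage, so power grows super-linearly. A rough sketch (illustrative numbers, not measured Vega data):

```python
# P_dynamic ~ C * V^2 * f: power grows with the square of voltage.
def rel_dynamic_power(f_scale, v_scale):
    return f_scale * v_scale ** 2

print(rel_dynamic_power(1.10, 1.10))  # +10% clocks needing +10% V -> ~1.33x power
print(rel_dynamic_power(0.90, 0.90))  # -10% clocks at -10% V -> ~0.73x power
```

Which is exactly why the Power Save results elsewhere in this thread look so much better than Turbo.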
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,730
136
If someone had told me a few months ago that an out-of-the-box 14 nm FinFET Vega 64 delivers worse performance per watt than a 2.5-year-old 28 nm GTX 980 Ti, I'd have called that person crazy. And the competition is certainly looking at ways to convince those Pascal owners to upgrade in a few months. Based on GV100 FP32 specs, they might achieve another 30-50% perf/watt bump, which could enable GTX 1080 performance levels (hence matching/beating Vega 64) for a mainstream GV106 at 100-125 W TDP.
Your math is off. A 40% gaming perf/watt improvement would land a 120 W GV106 just slightly ahead of a 1070 FE. FP32 teraflops, the way they're calculated, are rubbish. You're delusional if you think the GV100 would actually give you 15 TFLOPS in an actual single-precision computation.
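Showing my work, with assumed round baselines (perf index 100 = 1070 FE, the 1080 at ~125, wattages approximate; the +40% perf/W uplift is the hypothetical from the quoted post):

```python
# Does a +40% perf/W GV106 at 120 W reach GTX 1080 performance?
p1070, w1070 = 100.0, 150.0        # GTX 1070 FE: perf index, TDP (assumed)
p1080 = 125.0                      # assume the 1080 is ~25% faster than a 1070
eff_gv106 = (p1070 / w1070) * 1.4  # hypothetical +40% perf/W over Pascal
perf_gv106 = eff_gv106 * 120.0     # at a 120 W TDP
print(perf_gv106)                  # ~112: ahead of a 1070, short of a 1080
```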
 
Last edited:

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
http://radeon.com/_downloads/vega-whitepaper-11.6.17.pdf

Vega Whitepaper. One of the quotes from it:
Next-generation geometry engine: To meet the needs of both professional graphics and gaming applications, the geometry engines in "Vega" have been tuned for higher polygon throughput by adding new fast paths through the hardware and by avoiding unnecessary processing. This next-generation geometry (NGG) path is much more flexible and programmable than before. To highlight one of the innovations in the new geometry engine, primitive shaders are a key element in its ability to achieve much higher polygon throughput per transistor.

Previous hardware mapped quite closely to the standard Direct3D rendering pipeline, with several stages including input assembly, vertex shading, hull shading, tessellation, domain shading, and geometry shading. Given the wide variety of rendering technologies now being implemented by developers, however, including all of these stages isn't always the most efficient way of doing things. Each stage has various restrictions on inputs and outputs that may have been necessary for earlier GPU designs, but such restrictions aren't always needed on today's more flexible hardware.

Interesting read overall.
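Reading between the lines, the claimed win of primitive shaders is doing cheap position-only work first so invisible triangles can be culled before the full, expensive attribute shading ever runs. A conceptual Python sketch of that idea (my own model of the description above, not AMD's actual NGG hardware path):

```python
# Conceptual only: cull with cheap position work, then shade survivors.
def position_only(v):
    return v["pos"]                  # cheap: enough to test visibility

def full_shade(v):
    return {**v, "shaded": True}     # expensive: all vertex attributes

def primitive_shader_path(triangles, is_visible):
    surviving = []
    for tri in triangles:
        # back-facing/off-screen triangles never reach full shading
        if is_visible([position_only(v) for v in tri]):
            surviving.append([full_shade(v) for v in tri])
    return surviving
```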
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
I think maybe the Vega 64 should have been canned and those dies used for the more profitable WX & Instinct lines. Also, to those with issues with the "Burning Bright" tag-line: I agree, "Shooting for the Stars" would have been more appropriate, although burning is something Vega isn't far from doing.

I think the only Vega I will be buying is in APU form, assuming it works as a HTPC chip better than a 1050ti.
 
Mar 10, 2006
11,715
2,012
126
It's not just about that. They make themselves sound biased towards AMD by saying that. Read their conclusions though: "In short, the Vega launch has the potential to be AMD’s brightest day."

Please tell me, dear writers at AnandTech: how did the performance and power usage numbers correlate with the title? It may be an exaggeration to say it's their darkest day, but it's certainly not their brightest. Someone was saying somewhere that reviewers are scared that they won't receive samples next time, so they are being careful. But this isn't really being careful; it's bias.

Let me explain to you what I see going on.

Hardware review sites, especially in this day and age (shift to mobile, YouTube, and Facebook gobbling up all the ad revenue), are struggling to remain profitable/viable.

AMD stuff tends to get far more clicks/views than articles about products from any other component maker, and in this industry having reviews ready for launch day is absolutely critical.

Getting to do a launch-day review means that you NEED cooperation from the IHV. Intel will sample sites pretty much no matter what as long as they're big enough, ditto NVIDIA (unless it's some halo product like Titan Xp, but let's be honest -- the people buying this stuff aren't messing around with reviews).

AMD, on the other hand, seems to play nice with you if you agree to play nice back. If you want the hardware, you do what AMD says, you kiss the ring, and you don't get too negative about their products.

Remember when Kyle at HardOCP was harshly critical of AMD/AMD products and then AMD started cutting off his review samples? All of a sudden, once GTX 1060 came out to do battle with Polaris, his site's test suite became narrow/AMD-friendly (see: https://www.hardocp.com/article/2016/07/19/nvidia_geforce_gtx_1060_founders_edition_review).

Then, of course, you saw Kyle recently doing the "blind taste test" with RX Vega ahead of the launch, after going on stage at an AMD event.

Don't think that the other hardware reviewers/review sites haven't noticed what's going on here.

The review sites want the clicks and viewership to feed their businesses, and AMD has a lot of power to help them out. Pure and simple.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I am thinking that once Vega 56 comes out, I'll get one if prices are decent, undervolt it some, and it should be great. It's CLEAR that these cards are pushed beyond where they should be. That's very noticeable when you run it in power saving mode, which greatly increases the perf-per-watt.

Really depends on what AIBs charge for their cards. Mining numbers do not look great on it, so maybe it won't have its price destroyed by miners.
 

moonbogg

Lifer
Jan 8, 2011
10,734
3,454
136
I bet someone at AMD got fired for backing themselves into this ridiculous HBM2 corner. What a worthless implementation. Whatever the 1070/1080 cost right now, these Vega cards should cost $75-$100 less. That should be AMD's pricing rule for these cards.
Let it float $100 below the competition, because these cards are hot, slow, and hungry. For some reason Vega reminds me of Krusty the Clown. Slow, hot, power hungry; people waited forever and finally got trolled with a joke product at a joke price. That's clown shoes. God, AMD blew it.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
I have to say that NV still retains quite the lead. The 1070 is over a year old, uses old GDDR5, and it's basically tied with the brand-new Vega 56 while having a much lower TDP.
 