AMD Radeon RX Vega 64 and 56 Reviews [*UPDATED* Aug 28]


Elixer

Lifer
May 7, 2002
Well Vega 56 isn't compelling enough against the GTX 1070 based on the criteria you've laid out. It isn't a better performer, the price isn't competitive (mining craze notwithstanding), and it consumes a heck of a lot of power merely to trade blows with the GTX 1070.
Actually, V56 is faster than the 1070 pretty much across the board. V64 trades blows with the 1080.

This is why most review sites gave V56 "editor's choice" or "gold" or whatever else they want to call it.
AnandTech review said:
Vega 56’s power consumption also looks better than Vega 64’s, thanks to binning and its lower clockspeeds. Its power consumption is still notably worse than the GTX 1070’s by anywhere between 45W and 75W at the wall, but on both a relative basis and an absolute basis, it’s at least closer. Consequently, just how well the Vega 56 fares depends on your views on power consumption. It’s faster than the GTX 1070, and even if retail prices are just similar to the GTX 1070 rather than cheaper, then for some buyers looking to maximize performance for their dollar, that will be enough.

The only thing it has against it is the 45-75W higher power draw than the 1070, and that isn't really on the minds of most people when they go out looking for video cards. They just want the best bang for the $$$, and at $399, the V56 was it.
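To put that 45-75W in perspective, here's a back-of-the-envelope sketch; the usage hours and electricity rate are my own assumptions, not figures from any review:

```python
# Rough yearly cost of Vega 56's extra power draw vs. a GTX 1070.
# Assumptions (mine, not from the reviews): ~50 W extra at the wall,
# 4 hours of gaming per day, $0.12/kWh electricity.
extra_watts = 50
hours_per_day = 4
rate_per_kwh = 0.12

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_per_kwh
print(f"~{kwh_per_year:.0f} kWh/year, ~${cost_per_year:.2f}/year")
# -> ~73 kWh/year, ~$8.76/year
```

Even at the top of the 45-75W range, that works out to maybe $13 a year for a heavy gamer - real money, but not the kind of number that decides a $399 purchase.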
 

stockolicious

Member
Jun 5, 2017
NVidia pretty much owns the datacenter GPU HPC market, and it's only about 10% of NVidia's revenues, so that isn't where the money is (yet). A big part of this market is AI/machine learning; note the tensor cores on NVidia's GV100. I doubt simply having more compute performance on Vega is really going to translate into increased market share against NVidia in datacenter HPC.

It doesn't seem very wise to deprecate the market where the vast majority of GPU revenue comes from to chase one that is 1/10th the size and where you have no presence.

Unless by compute, you mean coin miners. That market really makes no sense as a priority, as it can bust in no time.

"It doesn't seem very wise to deprecate a market where the vast majority of GPU revenue comes from, to chase one, where that is 1/10th the size that you have no presence in"

They are not deprecating a market - they are just reading the writing on the wall, and those GPU revenues never made them $1 in profit. They will still grow share thanks to Ryzen and the attach rate with that CPU. The important point is that they are doing the right thing: they built Ryzen for datacenters and servers first, not for gamers, because that is where the margin is. They are now using Ryzen's leverage to push into the more lucrative datacenter market, since they can supply both the CPU and the GPU. Vega is not an attempt at a great gaming card - everyone knows that. They are just trying to make "margin" on some GPU products, which they have not in the past.

"Unless by compute, you mean coin miners"

No - I meant that graphics cards are no longer dedicated to gaming the way they were at some point in the past. Things are moving very quickly for GPUs these days. You mentioned some of those things above.
 

PeterScott

Platinum Member
Jul 7, 2017
The only thing it has against it is that power draw of 45-75W more than the 1070, and that isn't really on the minds of most people when they are going out looking for video cards. They just want something that is best bang for the $$$, and at $399, the V56 was.

I would say "trades blows" is a more realistic assessment of Vega 56 as well. It really depends on the selection of games (see the sketch at the end of this post). Heck, even Tech Report said the Vega 56 was even with the GTX 1070:
http://techreport.com/review/32391/amd-radeon-rx-vega-64-and-rx-vega-56-graphics-cards-reviewed/12

And the guy who owns Tech Report works for AMD.

That $399 price for Vega 56 is no more likely to mean anything than the official MSRP of $349 for the GTX 1070.

If we are going by MSRP, I would say the GTX 1070 is the better deal.

When you can actually buy a Vega 56, then we can compare street pricing.
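On the games-selection point above, here's a minimal sketch of how the usual geometric-mean aggregation swings with the test suite. The per-game numbers are made up for illustration, not taken from any review:

```python
from math import prod

# Hypothetical Vega 56 / GTX 1070 relative frame rates, per game.
# Illustrative values only - not from any review.
results = {
    "Game A": 1.10,  # a Vega-friendly title
    "Game B": 1.05,
    "Game C": 0.92,  # an NVIDIA-friendly title
    "Game D": 0.95,
}

def geomean(values):
    # Geometric mean: the usual way reviewers aggregate relative performance.
    return prod(values) ** (1 / len(values))

print(f"All four games: {geomean(list(results.values())):.3f}x")
print(f"Without Game C: {geomean([v for g, v in results.items() if g != 'Game C']):.3f}x")
# ~1.002x vs ~1.031x: the same two cards go from "even" to a clear
# Vega win purely on game selection, which is why reviews disagree.
```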
 

IRobot23

Senior member
Jul 3, 2017
Man, Vega is a failure for gaming: too large a die, too much power, and too slow.

For other stuff it's great, but for gaming... failure. The end!

Why is it so hard to admit? Just like Intel fanboys showing off their ST performance, when they know that many multiplayer games scale very well with more threads...
 

AtenRa

Lifer
Feb 2, 2009
Well Vega 56 isn't compelling enough against the GTX 1070 based on the criteria you've laid out. It isn't a better performer, the price isn't competitive (mining craze notwithstanding), and it consumes a heck of a lot of power merely to trade blows with the GTX 1070.

In every review I have read, Vega 56 is faster than the GTX 1070.

http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review

http://www.eurogamer.net/articles/digitalfoundry-2017-amd-radeon-rx-vega-56-review

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/31.html

https://www.techspot.com/review/1468-amd-radeon-rx-vega-56/

And power is only 40-50W more (with Chill enabled) for a desktop GPU, which many will not even care about. I'm not saying power consumption doesn't matter; I'm saying that most people don't treat an extra 50W on a desktop gaming system as their first purchase criterion.
 

PeterScott

Platinum Member
Jul 7, 2017
That is incorrect. He left for AMD, no connection to Tech Report anymore.
http://techreport.com/blog/29390/into-a-new-era

My bad, I should have looked into it deeper. But the point remains that it is not universal that Vega 56 beats the 1070. It really depends on the choice of games.


That is why I said "was".
I have no idea if it still will be that price.

No, you said it wins on bang for the buck, and made no statement about the future.
 

Elfear

Diamond Member
May 30, 2004
Man, Vega is a failure for gaming: too large a die, too much power, and too slow.

For other stuff it's great, but for gaming... failure. The end!

Why is it so hard to admit? Just like Intel fanboys showing off their ST performance, when they know that many multiplayer games scale very well with more threads...

I admit I was disappointed in the final performance of Vega, but I don't see it as a failure. My next monitor WILL have some adaptive sync option, so if my choices are Vega + FreeSync or 1070/1080 + G-Sync + $300, Vega actually makes a compelling option.
 

IRobot23

Senior member
Jul 3, 2017
I admit I was disappointed in the final performance of Vega, but I don't see it as a failure. My next monitor WILL have some adaptive sync option, so if my choices are Vega + FreeSync or 1070/1080 + G-Sync + $300, Vega actually makes a compelling option.

I agree, if the price is right. I was hoping for GTX 1080 Ti +10-15% gaming performance...
 

IntelUser2000

Elite Member
Oct 14, 2003
The launch drivers for Vega do not even enable features like primitive shaders or HBCC.

Not sure about primitive shaders, but HBCC is already enabled in the drivers. Computerbase.de has a comparison: HBCC increases performance by 5-7%, so with it disabled Vega performs 5-7% worse. I think it's available in AMD Settings as something you can enable or disable.

There was no reason to expect any sort of big gains from HBCC once it was revealed what exactly it does. It's more for future games, where larger worlds would need memory in excess of VRAM sizes.

The sad part is without features like HBCC, Vega performs even worse.
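For anyone wondering what HBCC actually does: conceptually, it lets the GPU treat local VRAM as a hardware-managed cache over a larger pool of system memory, paging data in on demand. Here's a toy software analogy; it is purely illustrative, since the real HBCC manages memory pages in hardware:

```python
from collections import OrderedDict

# Toy analogy for the HBCC idea: a small, fast memory (VRAM) acting as
# an LRU cache over a much larger backing store (system RAM).
class ToyHBCC:
    def __init__(self, vram_pages):
        self.vram_pages = vram_pages          # how many pages fit in "VRAM"
        self.resident = OrderedDict()         # page id -> data in "VRAM"

    def access(self, page, backing_store):
        if page in self.resident:             # hit: already resident
            self.resident.move_to_end(page)
        else:                                 # miss: page it in over the bus
            if len(self.resident) >= self.vram_pages:
                self.resident.popitem(last=False)  # evict least recently used
            self.resident[page] = backing_store[page]
        return self.resident[page]

# A "game" whose assets are far bigger than VRAM:
store = {p: f"asset-{p}" for p in range(100)}
gpu = ToyHBCC(vram_pages=8)
for p in [0, 1, 2, 0, 50, 51, 0, 99]:
    gpu.access(p, store)
print(list(gpu.resident))  # the working set that stayed resident
```

That's also why the gains in today's games are small: as long as the working set fits in 8GB of VRAM, there's rarely anything to evict.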
 

railven

Diamond Member
Mar 25, 2010
I admit I was disappointed in the final performance of Vega, but I don't see it as a failure. My next monitor WILL have some adaptive sync option, so if my choices are Vega + FreeSync or 1070/1080 + G-Sync + $300, Vega actually makes a compelling option.

Judging by reddit and some NeoGAF posts, right now, until prices settle, you're looking at barely over $100 difference between a GTX 1080 + G-Sync monitor and a Vega 64 + FreeSync monitor, assuming you're comparing similar monitors. Vega 64 prices are godawful. Even Fury X didn't get gouged this heavily.

On the topic of the G-Sync tax: I guess because I got my X34 for <$500, I didn't really look at the price difference. But G-Sync monitors seem to maintain their price, and vendors/manufacturers don't want to budge. These things are still expensive, haha. Trying to get the wife a G-Sync monitor, but $200 or more extra in some cases becomes a hard pill to swallow. Oh well, you get what you pay for. Guess this household is staying in the Nvidia camp for the foreseeable future.

EDIT: Will be interesting to see once the new HDMI spec rolls out. If NV adopts it, the only real argument for buying a Vega right now is basically thrown out the window. The good thing for AMD is that it won't happen for at least another year. But if Volta brings it along, woof.
 

tential

Diamond Member
May 13, 2008
In every review I have read, Vega 56 is faster than the GTX 1070.

http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review

http://www.eurogamer.net/articles/digitalfoundry-2017-amd-radeon-rx-vega-56-review

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/31.html

https://www.techspot.com/review/1468-amd-radeon-rx-vega-56/

And power is only 40-50W more (with Chill enabled) for a desktop GPU, which many will not even care about. I'm not saying power consumption doesn't matter; I'm saying that most people don't treat an extra 50W on a desktop gaming system as their first purchase criterion.
Did the AA issue get cleared up? Why isn't that factoring into how people judge this GPU?
 

tential

Diamond Member
May 13, 2008
Judging by reddit and some NeoGAF posts, right now, until prices settle, you're looking at barely over $100 difference between a GTX 1080 + G-Sync monitor and a Vega 64 + FreeSync monitor, assuming you're comparing similar monitors. Vega 64 prices are godawful. Even Fury X didn't get gouged this heavily.

On the topic of the G-Sync tax: I guess because I got my X34 for <$500, I didn't really look at the price difference. But G-Sync monitors seem to maintain their price, and vendors/manufacturers don't want to budge. These things are still expensive, haha. Trying to get the wife a G-Sync monitor, but $200 or more extra in some cases becomes a hard pill to swallow. Oh well, you get what you pay for. Guess this household is staying in the Nvidia camp for the foreseeable future.

EDIT: Will be interesting to see once the new HDMI spec rolls out. If NV adopts it, the only real argument for buying a Vega right now is basically thrown out the window. The good thing for AMD is that it won't happen for at least another year. But if Volta brings it along, woof.
It's not a true tax, because G-Sync and FreeSync aren't true equivalents. We first need to factor in that G-Sync is more robust. That's definitely worth some premium, though the difference can be insane or small. Given that most people here target 1440p or less, I'd consider G-Sync no problem.

The 4K $1000+ monitors are another story. For me it's a $700 premium - the cost of the GPU itself. But since Nvidia is the only vendor that really offers the performance to make my monitor truly enjoyable, is it really a premium, as opposed to Nvidia actually letting me enjoy my monitor instead of never getting to play games at high settings at 4K and always having to compromise?

G-Sync isn't as much of a tax as people think it is. Nvidia gives you high-end performance that is unmatched for long stretches of time. People need to factor in the full situation instead of framing it to make FreeSync look amazing and G-Sync the worst evil ever inflicted on consumers.
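Put as arithmetic, the comparison is really total platform cost, not GPU cost. A quick sketch with placeholder prices - illustrative only, not current street prices:

```python
# Total platform cost, GPU + adaptive-sync monitor.
# All prices are placeholders for illustration, not real quotes.
amd_build = {"Vega 64": 600, "FreeSync 1440p monitor": 400}
nv_build = {"GTX 1080": 550, "G-Sync 1440p monitor": 700}

amd_total = sum(amd_build.values())
nv_total = sum(nv_build.values())
print(f"AMD platform:    ${amd_total}")   # -> $1000
print(f"NVIDIA platform: ${nv_total}")    # -> $1250
print(f"Delta: ${nv_total - amd_total}")  # -> $250
# Whether that delta is a "tax" or fair payment for the more robust
# module depends on how you weigh the monitor, not the benchmark bars.
```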
 

railven

Diamond Member
Mar 25, 2010
It's not a true tax, because G-Sync and FreeSync aren't true equivalents. We first need to factor in that G-Sync is more robust. That's definitely worth some premium, though the difference can be insane or small. Given that most people here target 1440p or less, I'd consider G-Sync no problem.

The 4K $1000+ monitors are another story. For me it's a $700 premium - the cost of the GPU itself. But since Nvidia is the only vendor that really offers the performance to make my monitor truly enjoyable, is it really a premium, as opposed to Nvidia actually letting me enjoy my monitor instead of never getting to play games at high settings at 4K and always having to compromise?

G-Sync isn't as much of a tax as people think it is. Nvidia gives you high-end performance that is unmatched for long stretches of time. People need to factor in the full situation instead of framing it to make FreeSync look amazing and G-Sync the worst evil ever inflicted on consumers.

When you factor in that your monitor is going to stay with you a long time, the overall cost isn't that big of a deal (at least to me). I spent years playing on a dinky Westinghouse LCD. It cost me like $80 and I was super excited. Then I got my first IPS monitor and realized the bargain-bin monitors were just an excuse to save "money." Reading the reviews of that Samsung monitor AMD originally picked for the promo, it seemed like the worst possible introduction to FreeSync for new users.

Anyway, $800 today and I know it will last my wife 4-5 years, no biggie. But still, comparing it to my $450 Acer X34... well, I got a good deal, and it's throwing a wrench into my reasoning, haha.
 

tential

Diamond Member
May 13, 2008
When you factor in that your monitor is going to stay with you a long time, the overall cost isn't that big of a deal (at least to me). I spent years playing on a dinky Westinghouse LCD. It cost me like $80 and I was super excited. Then I got my first IPS monitor and realized the bargain-bin monitors were just an excuse to save "money." Reading the reviews of that Samsung monitor AMD originally picked for the promo, it seemed like the worst possible introduction to FreeSync for new users.

Anyway, $800 today and I know it will last my wife 4-5 years, no biggie. But still, comparing it to my $450 Acer X34... well, I got a good deal, and it's throwing a wrench into my reasoning, haha.
So you're not ready for the $1400+ 4K 144Hz HDR G-Sync monitor?

I haven't even heard of a FreeSync one being announced, although I can't imagine it'd have as many use cases. Hence why I'm just not too interested in FreeSync: it dead-ends for me right now, while Nvidia will continue to progress.
 

Malogeek

Golden Member
Mar 5, 2017
Did the AA issue get cleared up? Why isn't that factoring into how people judge this GPU?
Factoring in how? As in not buying it? Some just can't wait, or money isn't an issue. Amazingly, anti-aliasing isn't something some people care about. For me it's imperative.

There hasn't been extensive testing of Vega at all yet, considering the short time AMD decided to give reviewers, let alone extensive testing of the MSAA issue. I certainly won't purchase one yet, and wasn't planning to until at least the AIB Vega 56s are out anyway.
 

EXCellR8

Diamond Member
Sep 1, 2010
As much as I like AIB models, I think I might end up with a reference Vega 64 LE, air cooled. I don't really see how much better the aftermarket could do with Vega, but I'm still willing to wait it out and see. If there are any cards worth buying, expect low stock and an additional early-adopter tax.
 

Malogeek

Golden Member
Mar 5, 2017
As much as I like AIB models, I think I might end up with a reference Vega 64 LE, air cooled. I don't really see how much better the aftermarket could do with Vega, but I'm still willing to wait it out and see. If there are any cards worth buying, expect low stock and an additional early-adopter tax.
Reference coolers are usually considerably worse than aftermarket ones, often just due to the blower design; open coolers that exhaust into the case are more efficient.
 

Elixer

Lifer
May 7, 2002
As much as I like AIB models, I think I might end up with a reference Vega 64 LE, air cooled. I don't really see how much better the aftermarket could do with Vega, but I'm still willing to wait it out and see. If there are any cards worth buying, expect low stock and an additional early-adopter tax.
The blower cards throttle, and the AIBs won't.
It isn't only about the throttling, though; it is about the noise level as well, and again, the AIBs will be much quieter.

About the only reason to get a reference design is if you absolutely need a card that exhausts its heat out of the case, or you are going to do water cooling.
 

JDG1980

Golden Member
Jul 18, 2013
Not sure about primitive shaders, but HBCC is already enabled in the drivers. Computerbase.de has a comparison: HBCC increases performance by 5-7%, so with it disabled Vega performs 5-7% worse. I think it's available in AMD Settings as something you can enable or disable.

It's disabled by default, if I'm not mistaken.

There was no reason to expect any sort of big gains from HBCC once it was revealed what exactly it does. It's more for future games, where larger worlds would need memory in excess of VRAM sizes.

I don't think it's designed for gaming at all, at least not at this level. AMD seems to have been pushing HBCC for professional applications that might require very large data sets.

Where the HBCC might really shine is on the Raven Ridge iGPU. If it's able to share the processor's L3 as a high-bandwidth cache, that could lead to real performance gains, much like we saw with the eDRAM on Intel's Crystal Well parts.
 