AMD Radeon RX Vega 64 and 56 Reviews [*UPDATED* Aug 28]

Status
Not open for further replies.

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Look, I'm not going to get into a long drawn out argument over my personal preferences. I use 2xAA and don't require more. I game at 1080p with a 25 inch monitor. I've seen reviews where they turned off MSAA in favor of CSAA and the card ate the 1070 and rivaled the 1080.

I have always preferred downsampling (DSR/VSR) over all the different flavors of AA. Of all the visual improvements to a game that can be made, I think AF has always been most important to me.

EDIT: I don't even use AA when using DSR/VSR, I also don't use any of the smoothing/sharpening options. Currently, I'm using 1600x1200 (3200x2400 DSR) @ 100Hz on my Sony CRT. It's nucking futs.
 
Last edited:

Muhammed

Senior member
Jul 8, 2009
453
199
116
almost no *gaming* reason to buy a 1070 right now.
Well, there are plenty of reasons, like the scores of DX11 titles where the 1070 trounces the 56. Chief among them: GTA V, PlayerUnknown's Battlegrounds, Ghost Recon Wildlands, Rainbow Six Siege, etc. (even Crysis 3). These are games that people actually play right now.

There are also NVIDIA-exclusive features that dwarf anything AMD offers right now: Ansel, PhysX, VXAO, HFTS, and the armada of GameWorks effects that only work on NVIDIA hardware. There is also better performance in VR games, and better image quality in multi-screen setups through the new reprojection technique NVIDIA developed.
I've seen reviews where they turned off MSAA in favor of CSAA and the card ate the 1070 and rivaled the 1080.
That's only in Dirt 4. The game runs best on AMD hardware.
 
Last edited:
  • Like
Reactions: tviceman

Peicy

Member
Feb 19, 2017
28
14
81
Look, I'm not going to get into a long drawn out argument over my personal preferences. I use 2xAA and don't require more. I game at 1080p with a 25 inch monitor.

I've seen reviews where they turned off MSAA in favor of CSAA and the card ate the 1070 and rivaled the 1080.
Your preferences are yours only and not for discussion.

What games are you talking about specifically? Except for Dirt which was already mentioned of course.
 
Last edited:

Muhammed

Senior member
Jul 8, 2009
453
199
116
You're more than welcome to show other titles and reviews to support your claim.
I will do that right after you show me links for your false claim that a V56 is much faster than a 1070 using non-MSAA methods.

The only example you gave is CMAA, which I addressed by pointing you to the only game that has a CMAA implementation: Dirt 4.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
What games are you talking about specifically? Except for Dirt which was already mentioned of course.
Why not show me some reviews done by pros and you can shut me up? You can show me any review with CMAA (NOT CSAA, Tapatalk)

Grid 2
FFXI MMO
Dirt 4
Arma 3
WoW
 

guachi

Senior member
Nov 16, 2010
761
415
136
The first review I read was HardOCP's, where Vega's performance was described as subpar (and MSAA really killed it).
https://www.hardocp.com/article/2017/08/14/amd_radeon_rx_vega_64_video_card_review/17

http://techreport.com/review/32391/amd-radeon-rx-vega-64-and-rx-vega-56-graphics-cards-reviewed/12
Shows a clear lead for the 1080 over the V64, and a tie between the 1070 and V56.

http://www.pcgamer.com/the-amd-radeon-rx-vega-56-and-vega-64-review/
Shows a clear lead for the 1080 over the V64, and a tie between the 1070 and V56 (yeah, same as Tech Report).

Techspot showed a best-case 2% win over the 1070 across 25 games; given the behavior at Tech Report and PCGamer, I think they would almost certainly show a bigger win in favor of the 1080.
https://www.techspot.com/review/1468-amd-radeon-rx-vega-56/page8.html

Techspot has a nice graph that really explains the state of what is going on:
https://techspot-static-xjzaqowzxaoif5.stackpathdns.com/articles-info/1468/bench/GameComparison.png

Basically, big swings depending on which card a game favors. So reviews can swing back and forth depending on the included games, but it gets closer to a tie with a larger number of games, with the 1080 winning its battle overall and the Vega 56 winning its contest. Overall it looks like the 1080's edge is bigger to me, but really it's moot: the individual game swings are huge in both directions, swamping the minuscule average in one direction or the other.

PC Gamer doesn't show the individual game scores (that I could see), and all the other results you linked were 1440p scores.

Therefore, nothing you've shown me disproves that at 4K the 1080 and Vega 64 are equal. If you own a 1440p monitor, the results are probably of value. Decent 4K monitors, even with Freesync, are so cheap and so much better than a 1440p monitor for my daily use that I'll never go back. If you can afford a Vega 64/1080/1080 Ti, you can afford a 4K monitor (well, some of the G-Sync models are quite pricey).
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I have always preferred downsampling (DSR/VSR) over all the different flavors of AA. Of all the visual improvements to a game that can be made, I think AF has always been most important to me.

EDIT: I don't even use AA when using DSR/VSR, I also don't use any of the smoothing/sharpening options. Currently, I'm using 1600x1200 (3200x2400 DSR) @ 100Hz on my Sony CRT. It's nucking futs.
I either use MSAA or VSR/DSR/downsampling. They are all forms of AA; those downsampling methods are basically a form of SSAA. Whether one or the other is better is game-dependent for me.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
We don't have enough information yet, so I would hold off on judgement; could be a memory-controller issue, but HBCC is helping at higher resolutions?
Or it might be a geometry or fillrate issue? An unbalanced design? A hardware fault? Or maybe the hardware is as intended and they are waiting for game optimisations and better drivers over a three-year period? A long play, so to speak, since they do not have the resources to do otherwise?

Who knows, I just want them to sort it out by Navi :)
In a situation where you don't use the full 8GB, I would think a new protocol like HBCC would actually give a slight hit. It's a trade-off against other benefits, as far as I can tell?

Let's see how the primitive shaders pan out over time, say half a year, because to me they simply need to address the basic efficiency of their building blocks. The bricks are bad. Talking architecture here is the wrong focus, imo.

It might seem stupid to dump tech on the market when it's not ready, but to me their basic problem seems to be too much new and advanced tech, too fast; it's not just the rush, it goes far beyond that. And the wrong focus, with far too little emphasis on efficiency. NV learned from Fermi. And look at Ryzen: a very lean arch, while Intel is stuck with AVX2 and even 512-bit-wide FPU vectors bloating registers and whatnot, dragging the entire boat down.
When I listen to Raja, it still seems to me that he wants to do it all. Because it's just a few extra people to address that huge AI and deep-learning market and... well, then the beancounters agree. We know the result.

My advice is to set expectations low and realistic.
 

Peicy

Member
Feb 19, 2017
28
14
81
Why not show me some reviews done by pros and you can shut me up? You can show me any review with CMAA (NOT CSAA, Tapatalk)

Grid 2
FFXI MMO
Dirt 4
Arma 3
WoW
I don't intend to shut you up.
Please back up your claim by linking to a couple of reviews comparing MSAA and CMAA (in a previous post you wrote CSAA) on Vega. Prove it.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
I either use MSAA or VSR/DSR/downsampling. They are all forms of AA; those downsampling methods are basically a form of SSAA. Whether one or the other is better is game-dependent for me.
No, it's not perfect. The only downsides I have ever experienced are performance hits (fixed with more GPU performance) and GUI shrinkage (game dependent).

There's certainly room for subjective preference, so don't take this as an attack, but I think downsampling is objectively better across the board. Whatever you want to call it (downsampling/SSAA/FSAA before it was DSR/VSR), the results don't adversely affect textures, meshes, transparencies, etc. Likewise, games that have bad AA options or no AA options always work with downsampling. Most other AA methods take a low resolution image (or parts of it) and just make it blurrier or create pixels that otherwise should not be in the scene (artifacts). Same concept with upscaling DVDs and other videos. Trying to manifest/interpolate detail where it doesn't exist always looks bad/incorrect. To get the best image with limited artifacts, it's always best to start with the highest possible detail/resolution and then sample it down to your native resolution. This concept is how everything else in life works - from vision to photography to audio recordings.
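The "render high, then sample down" idea above can be sketched as a toy box filter. This is a minimal illustration with made-up pixel values, not any driver's actual DSR/VSR resolve (which uses smarter filters):

```python
import numpy as np

# Hypothetical 4x4 "rendered" frame at 2x the 2x2 target resolution.
hi_res = np.arange(16, dtype=np.float64).reshape(4, 4)

def box_downsample(img, factor):
    """Average each factor x factor block of samples into one output pixel."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Each output pixel is the mean of a 2x2 block of real rendered samples,
# so edges get smoothed using detail that actually exists in the scene.
lo_res = box_downsample(hi_res, 2)
```

The point of the sketch: every output pixel is built from genuinely rendered subpixel detail, which is why downsampling avoids the invented-pixel artifacts of post-process filters.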
 
  • Like
Reactions: Kuosimodo

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Before you guys go any further in the debate, I just want to point out that it's CMAA, not CSAA, that's at the root of the performance delta. I don't think CSAA is a thing (yet).
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
No, it's not perfect. The only downsides I have ever experienced are performance hits (fixed with more GPU performance) and GUI shrinkage (game dependent).

There's certainly room for subjective preference, so don't take this as an attack, but I think downsampling is objectively better across the board. Whatever you want to call it (downsampling/SSAA/FSAA before it was DSR/VSR), the results don't adversely affect textures, meshes, transparencies, etc. Likewise, games that have bad AA options or no AA options always work with downsampling. Most other AA methods take a low resolution image (or parts of it) and just make it blurrier or create pixels that otherwise should not be in the scene (artifacts). Same concept with upscaling DVDs and other videos. Trying to manifest/interpolate detail where it doesn't exist always looks bad/incorrect. To get the best image with limited artifacts, it's always best to start with the highest possible detail/resolution and then sample it down to your native resolution. This concept is how everything else in life works - from vision to photography to audio recordings.

The original AA, SSAA, was not literally running the game at a higher resolution and downsampling, though that is a common misconception.

SSAA just multisampled every pixel on the screen to improve quality. It was the gold standard, with a very big performance hit.
MSAA is the optimized version of the above, concentrating samples on edges and visible pixels. It became the new gold standard at a lesser, but still quite significant, performance hit.

DSR/VSR is literally running the game at a higher resolution and downsampling. It's nice, but a VERY big performance hit, only suitable for lower-resolution monitors, since you run 1440p or 4K internally for best results on a 1080p monitor. So you basically need a 1080 Ti to run DSR fully enabled at 1080p, because the performance hit is the same as running at 4K.

Of these, MSAA is, IMO, the best bang for the buck.

After that, there is an endless supply of post-processing filters that just blur everything. They are SHIT IMO.
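The cost gap between SSAA and MSAA described above can be put in rough numbers. This is an illustrative back-of-the-envelope model only, assuming shading work dominates and ignoring the extra bandwidth MSAA's sample storage costs:

```python
# Rough cost model: pixel-shader invocations per frame for a
# 1920x1080 target at 4x anti-aliasing (illustrative numbers only).
pixels = 1920 * 1080

ssaa_invocations = pixels * 4   # supersampling shades every one of the 4 samples
msaa_invocations = pixels * 1   # MSAA shades once per covered pixel, storing
                                # 4 coverage/depth samples to resolve edges
ratio = ssaa_invocations / msaa_invocations
```

Under this simplified model SSAA does 4x the shading work of 4x MSAA, which is why MSAA became the better bang for the buck.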
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
WoW is useless to benchmark. It's completely limited by single-threaded draw-call submission and relies solely on CPU IPC and DX11 driver optimization. My poor Ryzen is just bored trying to do anything in the game. Anti-aliasing is almost free on modern GPUs as a result.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
Of these MSAA is IMO, the best bang for the buck.
IMO temporal anti-aliasing methods are what should be focused on, as they provide the best output over a series of frames and work really well with other post filters. Temporal AA is also required for techniques like checkerboard rendering, which should hopefully see more use on PC soon.
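The core of temporal AA is accumulating samples across frames. A toy sketch of that history blend, leaving out the motion-vector reprojection and neighborhood clamping real TAA needs:

```python
import numpy as np

def taa_blend(history, current, alpha=0.1):
    """Exponential moving average over frames: out = a*current + (1-a)*history."""
    return alpha * current + (1 - alpha) * history

# Noisy (jittered) samples of a constant signal settle toward it over frames,
# which is how TAA converges to a supersampled result over time.
rng = np.random.default_rng(0)
signal = 0.5
frame = signal + rng.normal(0.0, 0.2)      # first noisy frame seeds the history
for _ in range(200):
    frame = taa_blend(frame, signal + rng.normal(0.0, 0.2))
```

With a small `alpha`, each frame's jittered sample only nudges the accumulated history, so the per-frame noise averages out across many frames.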
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
I don't intend to shut you up.
Please back up your claim by linking to a couple of reviews comparing MSAA and CMAA (in a previous post you wrote CSAA) on Vega. Prove it.

I'm glad you don't intend to shut me up (I never asked you to), and I don't really care what your preference is for an AA solution. I prefer CMAA; AA testing shows CMAA is faster, and Dirt shows it.
 

EXCellR8

Diamond Member
Sep 1, 2010
4,133
943
136
Vega 64 stock is starting to trickle back in, but pricing is still all over the place... yeesh.

The Sapphire base model is available on Newegg, but it's almost 200 dollars over MSRP.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
PC Gamer doesn't show the individual game scores (that I could see), and all the other results you linked were 1440p scores.

Therefore, nothing you've shown me disproves that at 4K the 1080 and Vega 64 are equal. If you own a 1440p monitor, the results are probably of value. Decent 4K monitors, even with Freesync, are so cheap and so much better than a 1440p monitor for my daily use that I'll never go back. If you can afford a Vega 64/1080/1080 Ti, you can afford a 4K monitor (well, some of the G-Sync models are quite pricey).


PC Gamer does have individual game scores, though why would you need those, except to cherry-pick in support of a biased point of view? They also tested 4K across the board, as did Techspot.

HardOCP was the only site without any 4K results, as they aim for "Highest Playable Settings". IOW none of the cards is really suitable for 4K, which is true.

Tech Report only tested GTA at 4K. Here is what they said:
"I had hoped moving to 4K with GTA V would help show the virtues of the Vega cards a bit better versus the competition, but that turned out not to be the case. The GTX 1080 is the only card of this bunch that delivers what I would deem a playable 4K experience on a traditional 4K monitor. "
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
PC Gamer does have individual game scores, though why would you need those, except to cherry-pick in support of a biased point of view? They also tested 4K across the board, as did Techspot.

HardOCP was the only site without any 4K results, as they aim for "Highest Playable Settings". IOW none of the cards is really suitable for 4K, which is true.

Tech Report only tested GTA at 4K. Here is what they said:
"I had hoped moving to 4K with GTA V would help show the virtues of the Vega cards a bit better versus the competition, but that turned out not to be the case. The GTX 1080 is the only card of this bunch that delivers what I would deem a playable 4K experience on a traditional 4K monitor. "
Yeah, 4K is still a fever dream at max settings, in my opinion.

Next generation, maybe? I want closer to 100fps than 40-60fps in my games.

Then again, I mostly play FPS games.
 
  • Like
Reactions: Kuosimodo

guachi

Senior member
Nov 16, 2010
761
415
136
PC Gamer does have individual game scores, though why would you need those, except to cherry-pick in support of a biased point of view? They also tested 4K across the board, as did Techspot.

HardOCP was the only site without any 4K results, as they aim for "Highest Playable Settings". IOW none of the cards is really suitable for 4K, which is true.

Tech Report only tested GTA at 4K. Here is what they said:
"I had hoped moving to 4K with GTA V would help show the virtues of the Vega cards a bit better versus the competition, but that turned out not to be the case. The GTX 1080 is the only card of this bunch that delivers what I would deem a playable 4K experience on a traditional 4K monitor. "

I like seeing individual scores to see how each card performs in certain games and to calculate what I believe is a better overall average; a straight average isn't as useful. I like the sites that show the relative difference (using %) instead of the absolute difference, and then use that relative difference to calculate the average gap between the cards. You can also see whether the difference in any particular game enters the realm of "irrelevant": if you have a 60Hz monitor and the 99.9th-percentile frame rate is over that for both cards being compared, then the difference is "irrelevant" even if one card gets 61 fps and the other 101.
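The relative-difference averaging described above amounts to taking a geometric mean of per-game fps ratios. A quick sketch with made-up fps numbers (these are not from any review):

```python
import math

# Hypothetical per-game fps for two cards across four games (made-up data).
card_a = [61, 90, 45, 120]
card_b = [101, 80, 50, 100]

# Per-game ratio, then the geometric mean of the ratios: unit-free and not
# dominated by one high-fps outlier the way a straight fps average would be.
ratios = [a / b for a, b in zip(card_a, card_b)]
geomean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
```

A `geomean` below 1.0 means card A is slower on balance, even though a raw fps average could tell a different story if one high-fps game dominated the sum.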

Also, because I've seen individual scores, I already know that GTA V is typically the weakest, or one of the weakest, games for AMD cards; it's about 20% faster on the 1080. Tech Report getting the scores they did is completely unsurprising. Even so, I think (I'd have to look again) that even the sites that showed the 64 equal to the 1080 tested GTA V, so they had a worst-case game in the mix.

With Freesync (or Gsync) you often don't (or never depending on the games you like) need high fps. As long as you can keep the game off the Freesync floor (40 fps) you can game at 4k just fine. The question for me as a user is "what settings do I need to get 40+ fps at the 99th percentile?"
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
With Freesync (or Gsync) you often don't (or never depending on the games you like) need high fps. As long as you can keep the game off the Freesync floor (40 fps) you can game at 4k just fine. The question for me as a user is "what settings do I need to get 40+ fps at the 99th percentile?"

Freesync does seem to be the main rationalization for going with AMD.
 
  • Like
Reactions: tviceman

tential

Diamond Member
May 13, 2008
7,348
642
121
Is it acceptable yet to complain about how AMD handled this?
So many people on here defended those bundles, no matter how bad the info got on how it was being handled.
 
  • Like
Reactions: tviceman


Rifter

Lifer
Oct 9, 1999
11,522
751
126
I love how some people are still playing the "wait for xxx" game with AMD here.

Two things I see posted a lot, not just on this forum but in general, regarding this Vega launch.

1. Wait for drivers! They need to fix primitive shaders, or this or that instruction isn't enabled and will add xx% performance overnight, etc.

2. The prices are inflated due to miners! And the mining boom is about to crash, so prices will stabilize soon.

The issue with the "wait for AMD" line is that this time they really do not have the time to wait. They have now fallen so far behind Nvidia that they are close to a full generation behind. They have exactly until Volta is on shelves to sell any Vega cards, because after that they will need to drop prices below their own costs to sell any cards at all — to gamers, anyway; who knows what mining will do.

Think about this for a second: Nvidia has consistently released new series with about a one-model-level performance gain, as in a new-generation xx60 will be as fast as or faster than last gen's xx70, the new xx70 as fast as last gen's xx80, and so on. If this holds true for Volta, and they release a 2060 that's as fast as or faster than a 1070 while keeping xx60-level pricing, it is going to beat a Vega 56 for under $300, and the 2070 is going to beat a Vega 64 for under $400, likely with less power usage unless Nvidia pulls another Fermi. This is what AMD fans seem not to realize: Nvidia is not standing still; its next gen is ready and is going to come out in the next 6-8 months, and it's going to crush any chance AMD has to make money off Vega. The time for Vega to shine is now. They need to get prices under control now, they need to get the driver situation (if there even is a major issue with drivers) figured out now, and they need to release a fix for the low MSAA performance now. They very simply do not have the time to wait.

I was holding out for Vega; I had hoped AMD would not release a complete flop. Vega just sounded too good on paper, and the RX 480/580 did great against Nvidia's midrange, so I had hope AMD could match that on the high end. And I really didn't want to pay the G-Sync monitor tax. I even wrote off a lot of the Vega leaks as BS, because I did not want to believe they could not decisively beat a 1080 over a year later (I was expecting performance right in the middle of the 1080 and 1080 Ti, because, you know, AMD was marketing this as a high-end gaming GPU and using words like "poor Volta," and the chip is huge and should have more muscle than it does). But it's clear to me now that Vega is a flop for gamers looking for a high-end GPU, and that AMD is not going to be able to ramp Navi up fast enough to get it out Q1 next year to compete with Volta. So Volta will go unanswered for likely as long as Pascal did, and Navi will be competing against what follows Volta.

"Poor Volta," my ass. Pascal handily destroys Vega, with the top two models untouched (1080 Ti/Titan), and really the 1080 as well when you take power and MSAA into consideration. Volta has nothing to worry about.
 