
AMD Radeon RX Vega 64 and 56 Reviews [*UPDATED* Aug 28]

Wow, it's basically on the same level as the Fury X. What a huge failure; one could think it's just a rebadge!

It's beyond understanding!
 
Let me explain to you what I see going on.

Hardware review sites, especially in this day and age (shift to mobile, YouTube, and Facebook gobbling up all the ad revenue), are struggling to remain profitable/viable.

AMD articles tend to get, by far, more clicks/views than articles about products from any other component maker, and in this industry having reviews ready for launch day is absolutely critical.

Thanks, that makes sense. News sites thrive on controversy, or on anything out of the ordinary. A small company that has been behind for years but seems to have potential provides exactly that.

I did notice this "bias", or whatever you want to call it. But they don't say it outright; it probably has to do with the need to be politically correct or something.

It doesn't make it right to do so; money is no excuse for it.
 
Let me explain to you what I see going on.

Hardware review sites, especially in this day and age (shift to mobile, YouTube, and Facebook gobbling up all the ad revenue), are struggling to remain profitable/viable.

AMD articles tend to get, by far, more clicks/views than articles about products from any other component maker, and in this industry having reviews ready for launch day is absolutely critical.

Getting to do a launch-day review means that you NEED cooperation from the IHV. Intel will sample sites pretty much no matter what as long as they're big enough, ditto NVIDIA (unless it's some halo product like Titan Xp, but let's be honest -- the people buying this stuff aren't messing around with reviews).

AMD, on the other hand, seems to play nice with you if you agree to play nice back. If you want the hardware, you do what AMD says, you kiss the ring, and you don't get too negative about their products.

Remember when Kyle at HardOCP was harshly critical of AMD/AMD products and then AMD started cutting off his review samples? All of a sudden, once GTX 1060 came out to do battle with Polaris, his site's test suite became narrow/AMD-friendly (see: https://www.hardocp.com/article/2016/07/19/nvidia_geforce_gtx_1060_founders_edition_review).

Then, of course, you saw Kyle recently doing the "blind taste test" with RX Vega ahead of the launch, after going on stage at an AMD event.

Don't think that the other hardware reviewers/review sites haven't noticed what's going on here.

The review sites want the clicks and viewership to feed their businesses, and AMD has a lot of power to help them out. Pure and simple.
Or, to say this in far fewer words:
Reviewers don't want to lose their early-access review samples.

Considering you could purchase all the relevant GPUs and CPUs for $5-10k a year, I just don't get it.

It's why I wanted to review GPUs, but I'm just not the person for it. I do want to get into YouTube reviews, though. With Vega no longer being an option, I may get the RX 100 Mark 5 (or wait for the Mark 6) and start my YouTube adventure.
 
Of the many questions I have, I keep returning to a design choice AMD made. AMD has said that its transistor budget was spent mostly on ramping up clock speeds. From the reviews, it is clear that AMD has pushed Vega 64's clock speeds further along the voltage/clock curve than the architecture/foundry-process combination would like to go.

Did AMD expect more from GF's process (e.g. lower voltages at x clock)?
Was AMD unable to ramp clock speeds as high as they projected?
Why not build a lower clocked but wider GPU?

Knowing the electrical properties of the 14LPP process and GCN's stubbornness at higher clock speeds, it just seems that AMD was perhaps too ambitious in seeking clock speeds more than 50% above Fiji's.

Makes me wonder if there was (1) a miscalculation in the design/simulations at the beginning of the process or (2) a failure in implementation or both.

I'd love a retrospective story on AT similar to what's been done (a long time) before.
 
Well, I guess we wait to see what the AIBs can do, but those aren't even expected until freaking September. Fury didn't do this badly, and AMD still ditched that name. I can only imagine how they'll try to distance themselves from Vega.

Perhaps they chose to call it Vega so they can can it. They knew it wasn't going to meet expectations so they used the code name as the official branding so it doesn't tarnish the Radeon brand. AMD may be ahead of the curve on this one.
 
The review sites want the clicks and viewership to feed their businesses, and AMD has a lot of power to help them out. Pure and simple.

Hardware.fr changed most of their gaming benchmark suite in favor of Gaming Evolved titles and Vega 64 still didn't beat GTX 1080, though Vega 56 had a minor lead over GTX 1070. Basically the same performance per watt as Fiji as well, despite being a 14nm FinFET product - in line with TPU's review.

- GTX 1080 Ti Review

Battlefield 4
Crysis 3
DiRT Rally
DOOM
Dying Light
Fallout 4
Far Cry Primal
Ghost Recon Wildlands
Grand Theft Auto V
Hitman
Project Cars
Rise of the Tomb Raider
Star Wars Battlefront
The Division
The Witcher 3 Wild Hunt

- Vega Review

Battlefield 1
Hitman
Rise of the Tomb Raider
Ashes of the Singularity
Deus Ex: Mankind Divided
DOOM
Sniper Elite 4
Civilization VI
Total War: Warhammer
 
Of the many questions I have, I keep returning to a design choice AMD made. AMD has said that its transistor budget was spent mostly on ramping up clock speeds. From the reviews, it is clear that AMD has pushed Vega 64's clock speeds further along the voltage/clock curve than the architecture/foundry-process combination would like to go.

Did AMD expect more from GF's process (e.g. lower voltages at x clock)?
Was AMD unable to ramp clock speeds as high as they projected?
Why not build a lower clocked but wider GPU?

I think it's much simpler than that. Halo products bring in the money. If pushing for the last 5% takes it from 2nd place to 1st, they will do it. Most people are saying it's OK because power consumption doesn't matter and Vega 56 provides the performance at a low enough cost.

AMD likely does the same with CPUs; that's why their products have no overclocking headroom. Plus, with a smaller market share it's more likely you can afford to do so: less volume means you can spec the part closer to its maximum without getting a bunch of RMA calls.
 
I stopped picking up Nvidia cards when I couldn't afford them and they'd have very little gain over the previous generation. Kinda sad how AMD now has a fantastic card, but they're already out of stock. I guess it's good, 'cause then I can save money. 🙁 On the other hand, I want shiny new tech to go with my Ryzen setup. I should probably wait for the partners to come out with their versions anyway.
 
I bet someone at AMD got fired for backing themselves into this ridiculous HBM2 corner. What a worthless implementation. Whatever the 1070/1080 cost right now, these Vega cards should cost $75-$100 less. That should be AMD's pricing rule for these cards.
Let them float $100 below the competition, because these cards are hot, slow, and hungry. For some reason Vega reminds me of Krusty the Clown: slow, hot, power hungry. People waited forever and finally got trolled with a joke product at a joke price. That's clown shoes. God, AMD blew it.

😀
I really want to know: who is responsible? The China team?
 
Hardware.fr changed most of their gaming benchmark suite in favor of Gaming Evolved titles and Vega 64 still didn't beat GTX 1080, though Vega 56 had a minor lead over GTX 1070. Basically the same performance per watt as Fiji as well, despite being a 14nm FinFET product - in line with TPU's review.

You mean that same Hardware.fr which, along with the rest of Europe, didn't get sampled with Skylake-X by Intel?
Hardware.fr changed most of their gaming benchmark suite in favor of Gaming Evolved titles (which I believe was to show AMD in a better light, though that's something I cannot prove), and Vega 64 still didn't beat the GTX 1080, though Vega 56 had a minor lead over the GTX 1070.
 
I think it's much simpler than that. Halo products bring in the money. If pushing for the last 5% takes it from 2nd place to 1st, they will do it. Most people are saying it's OK because power consumption doesn't matter and Vega 56 provides the performance at a low enough cost.

AMD likely does the same with CPUs; that's why their products have no overclocking headroom. Plus, with a smaller market share it's more likely you can afford to do so: less volume means you can spec the part closer to its maximum without getting a bunch of RMA calls.

Yeah, I agree that there is value in pushing the cards hard for the halo effect. But (1) I don't think AMD targeted GTX 1080-level performance when they designed Vega; put another way, I don't think AMD planned for Vega to be merely 30% faster than the two-year-old Fiji. And (2) I don't think AMD wanted to bring out a product for which they had no choice but to push the power limit balls-to-the-wall right off the bat.

These factors lead me to believe that AMD targeted a more efficient, higher performing part but fell short by 20-30% and 50 to 100W.
 
Or, to say this in far fewer words:
Reviewers don't want to lose their early-access review samples.

Considering you could purchase all the relevant GPUs and CPUs for $5-10k a year, I just don't get it.

It's why I wanted to review GPUs, but I'm just not the person for it. I do want to get into YouTube reviews, though. With Vega no longer being an option, I may get the RX 100 Mark 5 (or wait for the Mark 6) and start my YouTube adventure.

It's not about the $ so much as it is about launch day reviews.

If you don't get your review out on Day 1, nobody's going to read it.
 
Why not build a lower clocked but wider GPU?

That doesn't really work, especially nowadays.

First, everyone else would have thought of that idea, including your competitor. Second, CPUs and GPUs can't scale down much in voltage anymore. You can lower it, but it results in a steep drop-off in attainable clock speed. So not only might you gain nothing from a wider design at lower clocks and voltage, you may even lose efficiency by doing so. The key is always to strike the right balance.
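The trade-off above can be sketched with the classic CMOS dynamic-power approximation P ≈ C·V²·f: a wider chip raises effective switched capacitance, and the hoped-for V² savings only materialize if the process actually lets voltage drop at the lower clock. A minimal illustration in Python; all numbers here are hypothetical, chosen only to show the shape of the argument, not measured Vega figures.

```python
# Sketch of the "wide and slow vs. narrow and fast" power trade-off,
# using the standard dynamic-power approximation P ~ C * V^2 * f.
# All values below are illustrative placeholders, not real GPU data.

def dynamic_power(capacitance, voltage, frequency):
    """Classic CMOS dynamic power approximation: P = C * V^2 * f."""
    return capacitance * voltage**2 * frequency

# A "narrow, fast" design: fewer units (lower effective C), high V and f.
narrow = dynamic_power(capacitance=1.0, voltage=1.20, frequency=1.6)

# A "wide, slow" design: 30% more units at a lower clock. If the process
# scales voltage down nicely, the V^2 term delivers real savings...
wide_v_scales = dynamic_power(capacitance=1.3, voltage=1.00, frequency=1.25)

# ...but if voltage is stuck near its floor at that clock, the wider chip
# barely saves anything despite the much lower frequency.
wide_v_stuck = dynamic_power(capacitance=1.3, voltage=1.15, frequency=1.25)

print(f"narrow/fast:           {narrow:.2f}")   # 2.30
print(f"wide, voltage scales:  {wide_v_scales:.2f}")  # 1.62
print(f"wide, voltage stuck:   {wide_v_stuck:.2f}")   # 2.15
```

With good voltage scaling the wide design wins clearly; with poor scaling it lands almost back at the narrow design's power, which is the "you may even lose efficiency" scenario once you account for the extra die area and leakage a wider chip brings.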
 
Asking $699 for a GTX 1080 performance equivalent with 200W higher power consumption is just insane... Not even $450 would cut it... it should be $400, and then someone might pick it over a GTX 1080, which costs $499...
 
These factors lead me to believe that AMD targeted a more efficient, higher performing part but fell short by 20-30% and 50 to 100W.

Simply put, they messed up. Everything else is detail we may never know. When that happens, though, many undesirable things like this follow. They might have found out a year ago that performance wasn't up to par, which would have forced the team to push out silicon revisions to get higher clocks, resulting in much higher power consumption.

But saying that Vega can be clocked lower to save a lot of power is a crutch. Polaris can do that. Pascal can do that. Fury can do that. They released the non-X Fury with minimal performance loss and the Nano with a small performance loss. Clearly it's because they felt they needed that extra bit.

Actually, if they had only Vega 56, or if they had clocked Vega 64 lower, they might have ended up with the perception that Vega is a GTX 1070 competitor. The extra 10% allows them to price it $100 higher, and to offer one that costs a further $100 more and say "look at us, we have water-cooled cards!".

Hopefully Vega will get better with time; that's usually the case with AMD.

As a business that absolutely does not make sense. Video cards have a short lifespan, and the highest revenue comes from front-loading sales. "Fine Wine" would only make sense for the #1 company, because they could say even the competitor's next-gen cards are barely at the level of their previous-gen cards.
 
In my opinion AMD needs to revert to the small-die strategy that made the RV770 possible back in the day and make perf/watt their biggest priority. GCN scales very poorly when blown up to the size of Fiji and Vega, as we have seen from their respective performance. The problem is that the people who built the RV770 - Carrell Killebrew et al. - have been laid off. I'm not hopeful that the current RTG leadership can execute on these parameters in a timely fashion.
 
In my opinion AMD needs to revert to the small-die strategy that made the RV770 possible back in the day and make perf/watt their biggest priority. GCN scales very poorly when blown up to the size of Fiji and Vega, as we have seen from their respective performance. The problem is that the people who built the RV770 - Carrell Killebrew et al. - have been laid off. I'm not hopeful that the current RTG leadership can execute on these parameters in a timely fashion.

AMD have to do what they did with Zen. Design a highly efficient architecture and die and use multi die with Infinity Fabric to scale up the product stack. The key here is that the basic die (like Zeppelin) must be very competitive in terms of perf/watt and perf/sq mm.
 
AMD have to do what they did with Zen. Design a highly efficient architecture and die and use multi die with Infinity Fabric to scale up the product stack. The key here is that the basic die (like Zeppelin) must be very competitive in terms of perf/watt and perf/sq mm.
There are a lot of issues with making a NUMA-like GPU work. AMD needs to set specific performance targets and build at a specific die size to ensure the best harvesting capability while aiming for GTX x80-level performance. A marketing goal like 4K@60FPS isn't an engineering target. Read Anand's articles on RV770 and see for yourself what execution means.
 
AMD have to do what they did with Zen. Design a highly efficient architecture and die and use multi die with Infinity Fabric to scale up the product stack. The key here is that the basic die (like Zeppelin) must be very competitive in terms of perf/watt and perf/sq mm.

Breaking a GPU into multiple chips really only benefits chip yield; it would actually increase power consumption and decrease performance. So this isn't a performance solution, it's a yield solution, and the problem here is NOT chip yield.
 