AMD Vega (FE and RX) Benchmarks [Updated Aug 10 - RX Vega 64 Unboxing]


Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Vega doesn't hash great, is terribly inefficient and will cost more than more effective alternatives. That and mining profitability has been and will be dropping month after month. We have 18 months of GPU mining tops.
ETH skyrocketed again lately. If Vega does 30-35 MH/s and costs $400, then miners with cheap electricity will buy out Vega and we won't see a single Vega in stores for half a year minimum. Probably we won't see it at all because of HBM2 shortages...
 

Elixer

Lifer
May 7, 2002
10,371
762
126
Vega needs more memory bandwidth, not ROPs or IPC. BTW, if miners start using Vega, we won't see Vega in stores for the next 6-12 months, probably.
This mining craze won't crash this time. It will probably be here forever.
Well, actually all three of those are needed.
The memory compression doesn't seem to be enough.

It looks like almost the same specs as the Fury, except they added a few things (that aren't needed by today's AAA engines), neutered the bandwidth, went out of spec on HBM2 voltages, and ran another terrible marketing campaign alongside Vega.

It depends on the type of mining being done. Monero loves Vega. Ethereum is going down fast, and it is much harder to compute.
AMD would much rather just sell cards to miners; they aren't really vocal about things like the gaming crowd is, so let them have it at $500 a pop.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Well, actually all three of those are needed.
The memory compression doesn't seem to be enough.

It looks like almost the same specs as the Fury, except they added a few things (that aren't needed by today's AAA engines), neutered the bandwidth, went out of spec on HBM2 voltages, and ran another terrible marketing campaign alongside Vega.

It depends on the type of mining being done. Monero loves Vega. Ethereum is going down fast, and it is much harder to compute.
AMD would much rather just sell cards to miners; they aren't really vocal about things like the gaming crowd is, so let them have it at $500 a pop.
Yes, but ETH really isn't going down at all. It gained almost 100% in the last few days, from $140 to $251.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
ETH skyrocketed again lately. If Vega does 30-35 MH/s and costs $400, then miners with cheap electricity will buy out Vega and we won't see a single Vega in stores for half a year minimum.

At the current price ($247), Vega's break-even with electricity is six months and $137.

At $300, Vega's break-even with electricity is seven months and $199.

At $400, Vega's break-even with electricity is nine months and $333.

At $500, Vega's break-even with electricity is eleven months and $444.

At $600, Vega's break-even with electricity is sixteen months and $658.



All of the above is predicated on starting mining today, which won't be the case, which means all those numbers will be significantly lower. ETH has to hit $750 or higher to make any sort of compelling argument for Vega mining given the power requirements involved. When total system cost (huge PSU requirements) is factored in, I just don't see Vega being huge for more than a month unless it can crush ALTs.
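If anyone wants to rerun this kind of break-even math with their own numbers, here's a rough sketch of the calculation in Python. The function name and every input below (hashrate, power draw, ETH payout rate, electricity price) are placeholder assumptions rather than measured RX Vega figures, and rising difficulty is ignored entirely:

```python
# Rough GPU mining break-even sketch. Every input is an assumption/placeholder,
# not a measured RX Vega number, and rising network difficulty is ignored.

def break_even_days(card_cost_usd, hashrate_mhs, power_w, eth_price_usd,
                    payout_eth_per_mhs_per_day=0.0003,  # assumed payout rate
                    electricity_usd_per_kwh=0.10):      # assumed power price
    """Days until the card pays for itself, or None if it never does."""
    revenue_per_day = hashrate_mhs * payout_eth_per_mhs_per_day * eth_price_usd
    power_cost_per_day = (power_w / 1000.0) * 24 * electricity_usd_per_kwh
    profit_per_day = revenue_per_day - power_cost_per_day
    if profit_per_day <= 0:
        return None  # never breaks even with these inputs
    return card_cost_usd / profit_per_day

# Example: a hypothetical 35 MH/s card drawing 300 W, bought for $500, ETH at $247.
days = break_even_days(card_cost_usd=500, hashrate_mhs=35, power_w=300,
                       eth_price_usd=247)
print(f"Break-even in roughly {days:.0f} days" if days else "Never breaks even")
```

Plugging in cheaper electricity or a higher ETH price shortens the payback quickly, which is why the whole argument hinges on where ETH goes from here.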
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Well, I think China/New Zealand miners have almost free electricity, and 30-35 MH/s is the same as a 1080 Ti, but Vega will cost almost half as much.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Well, I think China/New Zealand miners have almost free electricity, and 30-35 MH/s is the same as a 1080 Ti, but Vega will cost almost half as much.

I fully agree, but that just further reinforces my point. As the big farms with low overhead start coming online, mining will be dead in short order.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
If you have solar then all the power issues are out of the way.

Heat and the cost of power supplies say differently. Trust me, I have a basement full of very efficient 1060s and 1070s, underclocked/undervolted, and that heat is a killer. Little guys with a handful of cards, sure, but the big guys that are purchasing in bulk won't want Vega when Polaris and 10XX cards are superior.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
If you have solar then all the power issues are out of the way.
You would have to have an insane panel farm (and batteries) for that, to generate the amount of power needed.

This is getting way, way OT though.

For the comment about "You won't be able to tell the difference but one costs 300 dollars more" that someone said the AMD rep told him on the tour: if, and that is a HUGE if, the AMD guy was talking about the card price difference, then 1080 MSRP $599 / Founders $699 would make the RX Vega $299-$399.
The more likely scenario is that Freesync costs $150 less than G-Sync, and RX Vega will cost $450 or $550.
 

Armsdealer

Member
May 10, 2016
181
9
36
1080 MSRP $599 / Founders $699 would make the RX Vega $299-$399.
The more likely scenario is that Freesync costs $150 less than G-Sync, and RX Vega will cost $450 or $550.

The 1080 MSRP is actually $500 now. There was a $100 price cut on the 1080 around the 1080 Ti launch and $30 on the 1070. Difficult to find at that price, but that's what it is.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
Some G-Sync models have a $300 premium over their FreeSync counterparts, so it's highly likely that is the price difference they're referring to. I don't see them pricing RX Vega below $500.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Most of the major titles are both Xbox and Playstation.


It really comes down to something being an AMD- or Nvidia-based feature and how they push (or don't push) adoption. Nvidia is much more aggressive and consistent, while AMD is mostly passive, with short campaigns lacking persistence.

Waiting for DX12 to finally let GCN shine is likely wishful thinking. Most DX12/Vulkan games aren't going to be Doom and will optimize around Maxwell/Pascal.

AMD not having a console-to-PC program that gives some level of optimization by default, after all this time, is neglect. They haven't leveraged those platforms to any real degree.

Oh, I'm aware they are on both consoles, but if what I've read is accurate, the PS4 is currently the lead console for development (or rather the "target") because of its bigger userbase, whereas during the Xbox 360 days it was the reverse. AMD was at the forefront of DX9 with their unified shaders, and the huge success of the Xbox 360 got DX9 adopted rather quickly (and even caused stagnation).

A lot of hope was placed on DX12/Mantle, and it seems they've yet to truly pay off. By the time we do get there, NV will have closed the performance gap if not leapfrogged AMD, and the last few years have not favored AMD in any form.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
C'mon guys, RX Vega won't cost more than a card it's slower than unless it trades blows in popular games or holds some special advantage. Even then, 10% higher price?
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
C'mon guys, RX Vega won't cost more than a card it's slower than unless it trades blows in popular games or holds some special advantage. Even then, 10% higher price?

Someone was able to figure out which monitor was which at the Budapest demo. The price difference between the Freesync and G-Sync monitors was $300... the same as the price difference between the systems that AMD's representative claimed. In other words, RX Vega is going to be just as expensive as GTX 1080, be a bit faster in titles that favor AMD and probably slower in most other games, and will use 1.6x-2x as much power.

I still don't understand why they are even releasing a gaming version of Vega at all, when it clearly can't do the job. The handful of low-margin sales can't be worth the terrible publicity they will get. (Are they even going to sample them for reviews? At this point I suspect the answer is no.) They should cancel RX Vega and release just the Instinct and Pro cards, which might actually be able to compete in their segments without embarrassing AMD too badly. And there needs to be accountability for RTG's persistent failure to meaningfully improve since late 2013.
 
Mar 11, 2004
23,444
5,849
146
Someone was able to figure out which monitor was which at the Budapest demo. The price difference between the Freesync and G-Sync monitors was $300... the same as the price difference between the systems that AMD's representative claimed. In other words, RX Vega is going to be just as expensive as GTX 1080, be a bit faster in titles that favor AMD and probably slower in most other games, and will use 1.6x-2x as much power.

I still don't understand why they are even releasing a gaming version of Vega at all, when it clearly can't do the job. The handful of low-margin sales can't be worth the terrible publicity they will get. (Are they even going to sample them for reviews? At this point I suspect the answer is no.) They should cancel RX Vega and release just the Instinct and Pro cards, which might actually be able to compete in their segments without embarrassing AMD too badly. And there needs to be accountability for RTG's persistent failure to meaningfully improve since late 2013.

I really don't get this "well, Vega failed, AMD should just completely write it off and not even release it" nonsense. It is so outright stupid that I can't fathom why there are so many people who keep stating it. We've consistently seen that mining means they can probably sell every card they can make, even if it's not price competitive with Nvidia in games. Heck, even if things go the opposite way, say some new mining development comes along that Nvidia excels at but AMD somehow sucks at, then all the Nvidia cards get bought up, Nvidia's already premium pricing goes to new levels, and AMD cards become the only option for people buying new. It seems very possible that the changes will play out like GCN in general, where AMD relies on sustained software improvement to bring out the full performance (so, no, there almost certainly won't be a 40% improvement at release or even by the end of the year, but give it time, say 1-2 years, and I bet we could see a consistent 20-33% improvement, possibly pushing higher under ideal conditions such as DX12 games that utilize certain features well; I have a hunch that we'll start to see more of the tricks used in console games utilized in the PC space, and One X 4K could benefit Vega especially well on some titles, though it will take a while before developers wring out what improvements the One X will enable). That won't happen if there aren't cards to work with.

Not to mention, not releasing, especially after they've already done PR for it, would be a disaster, bigger than releasing a subpar card (and it very well could be that their initial implementation of Vega as a chip is a dud, and that it requires serious tweaking, more than just a respin, to bring out its potential, much like we saw with the evolution from the 2000 to the 4000 series, with Navi possibly being how they realize the kind of gain made from the 4000 to the 5000 series, effectively doubling the GPU as well as offering gains similar to what Crossfire delivered around that time by using more of them). And they almost certainly wouldn't shelve Vega and publicly state that it is a failure, since then people would say that proves AMD's architecture itself is a failure, killing its future potential (including use in APUs) and setting off a firestorm of BS; not to mention it would effectively hand the people who have been claiming for years that GCN is a total failure unwarranted validation, and it could even hurt console sales. People are already calling for AMD to just kill their graphics business; imagine what would happen if they were to scrap a major release entirely. People seem to actively want them to go out of business for some bizarre reason.

This place gets more and more ridiculous. Prior to every single AMD release, this place gets worked up into a frenzy over rampant speculation (both ways, positive and negative) and hyperbole. Then the release happens and it pretty much goes as expected (which is to say, it roughly splits the rampant speculation: typically decent but not spectacular performance at worse-than-Nvidia power usage, improving over time just not fast enough to keep pace with Nvidia's steady execution of releases), and as ever people have to decide on performance for the price (including whether mining throws perf/$ way out of whack for gaming). And then we get the extended cesspit of nonsensical arguments (one I particularly like is the person who went around saying that AMD needs to stop making good mining cards, then, when Nvidia allegedly started working on versions of cards more suitable for mining than their consumer ones, said AMD completely failed by not providing miner cards earlier, but either way they deserve to go out of business because AMD hasn't been releasing the magical unicorn Bitchin Fast 3D 3000 that does 16K 1000Hz VR in infinite colorspace for $200, in quantities such that people can buy two: one to mine on 24/7, where it pays for itself after the first minute, and the other so they can game). Only instead of people learning from how things have gone, we get people taking all the rampant speculation and treating it as reality, further distorting their expectations and arguments. Seems like this place is due for another meltdown.
 
Mar 11, 2004
23,444
5,849
146
Some G-Sync models have a $300 premium over their FreeSync counterparts, so it's highly likely that is the price difference they're referring to. I don't see them pricing RX Vega below $500.

That would be my guess. Depending on the performance and power use, I could see them undercutting the comparable Nvidia card by $50. So say it's around the 1080: they offer the standard air-cooled card for $50 less, and if there's a reference water-cooled version, that goes for $100 more than that. I would expect Nvidia not to get aggressive on pricing (since they have no need; heck, with their "mini" versions of their cards AMD won't even have that relative advantage like Fury did); rather, they'll probably react with bundles if necessary.

I think a smart move by Nvidia would be to do VR bundles, especially if they could get HTC to offer a good discount on the Vive (if they did a 1080 + Vive for $1000, that would help both companies). It would be enough that people looking at the recent Oculus sale would be willing to spend up for the Vive, plus it would ensure they have very good VR performance. And if they could do a 1060 + Rift for $500, that would get a lot of people onboard as well. Plus it would let Nvidia tout their better VR performance, setting the consumer mindset about that as well. And there it wouldn't carry the G-Sync tax, so AMD couldn't use that argument. They could even do card + G-Sync monitor bundle discounts, which would put an end to that argument.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Someone was able to figure out which monitor was which at the Budapest demo. The price difference between the Freesync and G-Sync monitors was $300... the same as the price difference between the systems that AMD's representative claimed. In other words, RX Vega is going to be just as expensive as GTX 1080, be a bit faster in titles that favor AMD and probably slower in most other games, and will use 1.6x-2x as much power.

I still don't understand why they are even releasing a gaming version of Vega at all, when it clearly can't do the job. The handful of low-margin sales can't be worth the terrible publicity they will get. (Are they even going to sample them for reviews? At this point I suspect the answer is no.) They should cancel RX Vega and release just the Instinct and Pro cards, which might actually be able to compete in their segments without embarrassing AMD too badly. And there needs to be accountability for RTG's persistent failure to meaningfully improve since late 2013.
Maybe they're thinking miners will buy them anyway, because the same price as a GTX 1080, more than a year later, with 2x the power draw, is pretty much suicide at this point. Pretty much the worst GPU ever made if the price is true.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
Vega is only a disaster if you wanted something faster than a 1080 Ti; that is just not gonna happen, and I am sure, by now, everyone agrees on this.

Vega still has uses for mining & very specific workloads (so far), so for those people, it works out well for them.

I do agree that it can't be the same price as a stock 1080, IF they want to sell it to gamers.

Whether they want to sell it to gamers is a good question; if they do, my guess is they will throw in some games. If not, well, those other guys will buy the cards.
 

Gideon

Platinum Member
Nov 27, 2007
2,013
4,992
136
Where are all these claims that miners will buy Vega coming from? Vega FE's hashrate, 30-35 MH/s, is atrocious for the TDP. A GTX 1070 does 25 MH/s stock and can easily reach 28-30 with a memory OC, as Ethereum is very much memory-bandwidth bound (and the memory, yet again, won't OC well on Vega).

Miners aren't even buying cards with similar hashrates but lower TDPs, like the 1080 or 1080 Ti, yet they will suddenly buy Vega?

And TDP matters, even if you disregard the power bill. Miners run rigs with as many cards as possible. You can run six 150 W GTX 1070s on a 1200 W power supply in a single rig. Good luck doing that with Vega :p

IMO the sudden rage here of "no worries, miners will still buy them!" is wishful thinking.
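To put the perf-per-watt and PSU argument in rough numbers, here's a quick sketch in Python. The 1070 figures are the ones quoted above; the Vega hashrate and board power are assumptions based on the rumors in this thread, not official specs:

```python
# Rough MH/s-per-watt and cards-per-PSU comparison. The 1070 numbers come from
# the post above; the Vega numbers are assumed from rumors, not official specs.

cards = {
    # name: (hashrate in MH/s, board power in watts)
    "GTX 1070 (memory OC)": (28, 150),
    "RX Vega (rumored)":    (33, 300),  # board power assumed at ~300 W
}

psu_watts = 1200
usable = psu_watts * 0.8  # keep ~20% headroom; CPU/motherboard draw ignored

for name, (mhs, watts) in cards.items():
    per_watt = mhs / watts
    max_cards = int(usable // watts)
    print(f"{name}: {per_watt:.2f} MH/s per W, "
          f"{max_cards} cards per {psu_watts} W PSU = {max_cards * mhs} MH/s per rig")
```

Under those assumptions a 1200 W rig of tuned 1070s ends up with roughly twice the hashrate of a rig of Vegas, which is exactly the point being made above.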
 

3DOSH

Junior Member
Jun 26, 2016
5
4
36
Where are all these claims that miners will buy Vega coming from? Vega FE's hashrate, 30-35 MH/s, is atrocious for the TDP. A GTX 1070 does 25 MH/s stock and can easily reach 28-30 with a memory OC, as Ethereum is very much memory-bandwidth bound (and the memory, yet again, won't OC well on Vega).

Miners aren't even buying cards with similar hashrates but lower TDPs, like the 1080 or 1080 Ti, yet they will suddenly buy Vega?

And TDP matters, even if you disregard the power bill. Miners run rigs with as many cards as possible. You can run six 150 W GTX 1070s on a 1200 W power supply in a single rig. Good luck doing that with Vega :p

IMO the sudden rage here of "no worries, miners will still buy them!" is wishful thinking.
Vega FE undervolts like a beast, down from 1.2 V to around 1.08-1.012 V, and RX Vega is gonna have the same hash rate if not better, for a lower price.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
If AMD/RTG are smart - and I believe that they are - they will use the testing and feedback from Vega FE to dial in voltages, frequencies, and cooling in order to make RX Vega more appealing, especially AIB partner cards. In addition, of course, to better BIOS, driver, and software support.

RX Vega has to be a better card than Vega FE, doesn't it?
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,153
136
If AMD/RTG are smart - and I believe that they are - they will use the testing and feedback from Vega FE to dial in voltages, frequencies, and cooling in order to make RX Vega more appealing, especially AIB partner cards. In addition, of course, to better BIOS, driver, and software support.

RX Vega has to be a better card than Vega FE, doesn't it?
There's nothing consumers discovered in Vega FE that AMD wouldn't have already known about, with far more precision and understanding.

There are always voltage safeguards; you don't run cards near their minimum voltage. NVIDIA doesn't do it either, contrary to popular belief. You can in fact undervolt Pascal cards.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
There's nothing consumers discovered in Vega FE that AMD wouldn't have already known about, with far more precision and understanding.
I'm not sure that's true. No amount of internal lab testing compares to samples taken from real-world use. I'll agree that AMD/RTG knows much more than Vega FE customers/reviewers (they built the damn thing), but having feedback from hundreds or thousands of individuals devoted to tweaking and benchmarking is going to influence how they prioritize updates, marketing, and pricing.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,153
136
I'm not sure that's true. No amount of internal lab testing compares to samples taken from real-world use. I'll agree that AMD/RTG knows much more than Vega FE customers/reviewers (they built the damn thing), but having feedback from hundreds or thousands of individuals devoted to tweaking and benchmarking is going to influence how they prioritize updates, marketing, and pricing.
Of course, marketing and pricing are influenced by consumers. But things like picking a voltage come down to looking at a ton of their own samples and choosing the value that meets their yield targets.

AMD knows what the bell curve of samples looks like; they know what the best samples look like, and they know what the worst look like. There's nothing on that front that we can tell them.
 