AMD Vega (FE and RX) Benchmarks [Updated Aug 10 - RX Vega 64 Unboxing]

Page 8
Status
Not open for further replies.

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I don't think it's that bad. Depends upon how it's priced.

The NeoGAF posts I read said something along the lines of "big power hog, but great perf/price." So I'm guessing that if it's faster than a GTX 1080, it'll be $500-$550. But this makes me think even more that this card will be rare at that price point if AMD can get more from professionals.
 

[DHT]Osiris

Lifer
Dec 15, 2015
17,370
16,645
146
The NeoGAF posts I read said something along the lines of "big power hog, but great perf/price." So I'm guessing that if it's faster than a GTX 1080, it'll be $500-$550. But this makes me think even more that this card will be rare at that price point if AMD can get more from professionals.
I got an aftermarket 1080 for $530 *seven months ago*. I bought five 1080 Tis a month ago for $665 each. You can't shine that turd without some major discounts.
 
  • Like
Reactions: tviceman

[DHT]Osiris

Lifer
Dec 15, 2015
17,370
16,645
146
Eh, I'm picking up a C32HG70 with Freesync for, if rumors are to be believed, $300 less than what the G-Sync version will cost.

Can't polish that turd.
Fair enough, G-Sync's pricing is funny, but my 1440p panel will probably last me through 2-3 cards, I'll wager, and the 1080 Tis are headless, so....
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
The NeoGAF posts I read said something along the lines of "big power hog, but great perf/price." So I'm guessing that if it's faster than a GTX 1080, it'll be $500-$550. But this makes me think even more that this card will be rare at that price point if AMD can get more from professionals.
How does one calculate perf/price without a price?
 
  • Like
Reactions: 3DVagabond

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Woof. Welp, we've still got the caveat of "wait for RX Vega," but... it's not looking good. Not sure how AMD (or we as consumers) can expect improvements if this thing is, say, still slower than a GTX 1080 Ti with "proper" drivers, probably costs more to make, and will possibly (as one poster said) sell for less or just not exist on the "gamer" side, with all dies going to the "professional" side.

Well, with it almost being 2018, I'd give a little more value to the "wait for DX12 games" propaganda that bolstered Fury back in 2015.

EDIT:



If AMD were smart, they'd have switched focus to miners back in 2014. Gamers would be upset, but when you have money coming in to fund better R&D, you can win gamers back later (maybe).

But for a good portion of the last 5 years AMD needed MONEY. They had a surefire way to make that money, but they didn't even bother.
"Sure, we'll sell you this GPU for $200. What, you're just going to resell it for $500? Well, we still got our cut."
Bitcoin mining was taken over by ASICs, making GPU mining dead.

Ethereum difficulty has skyrocketed, and soon it's dead too.

It's risky to make cards for mining when it can all suddenly come to a full stop.
The way this card has been released (the secrecy, no benchmarks, etc.), I'm not expecting anything but disappointing gaming results.

AMD didn't even give professional users reviews of FE Vega, the very users FE Vega was supposed to be interesting to.

AMD certainly seems insecure about the card. Just think of all the events where Raja has been hyping Vega like it was an amazing architecture.

I hope, for AMD's and Raja's sake, that RX Vega is for whatever reason much better than FE Vega.
 
  • Like
Reactions: tviceman

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Eh, I'm picking up a C32HG70 with Freesync for, if rumors are to be believed, $300 less than what the G-Sync version will cost.

Can't polish that turd.
Yeah, I think that's a big part of it. People looking at top-of-the-line cards will likely be pairing them with top-of-the-line monitors, so you can look at the full ecosystem cost. Looking at the 144 Hz 1440p IPS monitors (currently the sweet spot for quality gaming monitors), G-Sync adds about a $200-$300 premium over FreeSync. Now, G-Sync certification means you don't have to do as much research to spot bad implementations (I'm looking at you, Samsung CF791), and the dedicated hardware can give better results, so there are some advantages to the price premium (then again, some recent FreeSync products, mostly from Nixeus, are apparently bringing specialized hardware to FreeSync while maintaining the cost advantage).

It certainly isn't a slam dunk for either side, but the monitor-plus-GPU ecosystem smooths out some of the performance discrepancies. It also creates some vendor lock-in that may create problems for AMD going forward if they continue to appear uncompetitive.
 

GoodRevrnd

Diamond Member
Dec 27, 2001
6,801
581
126
Eh, I'm picking up a C32HG70 with Freesync for, if rumors are to be believed, $300 less than what the G-Sync version will cost.

Can't polish that turd.
Sorry to sidetrack... They're releasing a G-Sync version of that exact monitor? Also, please post detailed impressions of the monitor after using it.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
How does one calculate perf/price without a price?

The posts I'm referring to were allegedly from AIBs. So I'm assuming they know the final price, thus their comment.

Bitcoin mining was taken over by ASICs, making GPU mining dead.

Ethereum difficulty has skyrocketed, and soon it's dead too.

It's risky to make cards for mining when it can all suddenly come to a full stop.

The cards are going to be made; the only difference is the MSRP. You aim at miners, sell as many as you can while you can at an inflated price, and bring in more money. When the bubble bursts, you've made your money, and you relist your cards at a decent value. You don't get burned as badly as AMD did. They were selling their cards at a good MSRP, but retailers/etailers were jacking them up, sometimes to nearly double MSRP, and AMD didn't see a dime of that. What AMD did see, once the bubble burst, was all those cards (including the ones bought during the small window before prices skyrocketed) selling used for at times half the cost of brand-new cards. So AMD had to reduce prices to compete against themselves.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
AMD has repeatedly stated that they are "focused on developers and content creators" and they have some special driver optimizations specifically for those use cases. Other than offering a "game mode" in the drivers, they are not focusing on that for reasons that only appear obvious. In benchmarks using those pro applications, it is "up to 70% faster than the GTX Titan Xp for less money" and that's the narrative they want to own.

The problem is that NVIDIA markets the Titan Xp as a halo gaming card, where AMD has created this artificial bubble in which they can beat the Titan Xp in select situations where a Quadro or FirePro card would be most appropriate and effective, but still lose gaming benchmarks to the Titan Xp (based only upon limited FireStrike scores). The other problem is that gamers have been foaming at the mouth for Vega and now it shows up with possible issues and so-so FS benchmarks. AMD needs to get control of the narrative again before Vega is written off as a failure in its entirety - some of you are already banging that gong.

If this marketing move by AMD proves successful and Vega FE starts to make a dent, NVIDIA can just release equivalent driver updates for the GTX Titan Xp to accelerate the same applications while also keeping its performance advantage in games. That's a sucky place for AMD to be, I think. For now, I'm going to assume we're looking at poor driver optimization and very unscientific benchmarks.

Hey, if Vega truly ends up as a disappointment at least Ryzen was a home run. Every once in a while, you have to bunt. :D
 

KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
I disagree. Vega was supposed to be the first major architectural update to GCN. If Vega 10 cannot even get within 10% of a stock 1080 Ti, it's an unmitigated disaster.
This may be so, but the two biggest products for AMD this year are EPYC and the Zen APUs (that neither mainstream Ryzen nor HEDT Threadripper is the important product needs repeating in most Ryzen threads over on the CPU forums).

So while the performance of the Zen APUs with Vega cores is currently unknown, it will be the real test of whether Vega is a successful architecture or not.

Well, that's what a sensible AMD would have aimed for, as neither gaming GPUs nor HPC are big, profitable markets for them. Of course, in the case of HPC, pride and an almost desperate drive to break into that market might have hurt their designs in other areas.

Of course, it is possible that the graphics portion of the APUs performs poorly and the only major enhancement comes from swapping to Zen-based cores, with the GPU architecture being no better than the current GCN Gen 3 one.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
The card loses to a slightly OC'd (1150 MHz) Fury X in Fire Strike graphics score, which makes absolutely no sense whatsoever. I think many surprising turns have yet to come.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The card loses to a slightly OC'd (1150 MHz) Fury X in Fire Strike graphics score, which makes absolutely no sense whatsoever. I think many surprising turns have yet to come.

They better come quick because it seems this info is now starting to hit the "mainstream" forums I frequent. And people are getting off the ride.

I just want a good Radeon product :( Just cut RTG off and sell them to Intel! :D

EDIT:

Yeah, it seems it's the "Gaming Mode" that's creating the stink (did AMD advertise this before? I don't recall hearing about it until yesterday). If Gaming Mode works as AMD worded it (switch to Radeon mode and get all the same effects!), it's killing the perception of this product. If Vega FE in "gaming mode" is competing with an OC'd GTX 1080, I dunno what to expect from RX Vega.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
The card loses to a slightly OC'd (1150 MHz) Fury X in Fire Strike graphics score, which makes absolutely no sense whatsoever. I think many surprising turns have yet to come.
Are you looking at the graphics score or the overall score?
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
It can't compete in gaming (the 1080 Ti is faster, cheaper, cooler, overclocks better, ...).

It can't compete in pro benchmarks because even overpriced proper pro cards are competitively priced against it (e.g. the P4000). That's aside from the fact that no one is going to spend $1000 on a card that doesn't have proper certified workstation drivers.

It's too expensive for mining (at best it'll be twice as fast as a 580 or equivalent, but it costs a lot more than two of them).

It's not going into supercomputers or blades or anything like that; this is clearly a consumer card.

It doesn't have class-leading performance like the Titan Xp (i.e. fastest in gaming), so you can't sell it to the class of people who will spend silly money just to have the best.

Really, the Frontier card has no market that I can see?

On another forum someone said its market is lawyers. This was merely to get the H1 2017 launch that investors were told about.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Sorry but those scores don't make much sense.

23120 Fury X @ 1125

18767 Fury X @ 1050

That's a 23% gain from only a 7% OC.

Was there a big memory OC on the first run? I remember the Fury X getting some sizeable increases in certain benchmarks from overclocking the memory.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Memory OC.
Was there a big memory OC on the first run? I remember the Fury X getting some sizeable increases in certain benchmarks from overclocking the memory.

It only says 530, which is a 30 MHz OC for the memory, or 6%.

I just tested my Fury Nitros @ 1000 and 1100 and got ~9% faster with a 10% OC

Nitro @ 1000: 15032
Nitro @ 1100: 16352
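The scaling question the last two posts are chewing on comes down to a few lines of arithmetic. Here's a quick sketch (the `pct_gain` helper is just for illustration; the clocks and scores are the ones posted above):

```python
def pct_gain(new, old):
    """Percent increase of `new` over `old`."""
    return (new / old - 1) * 100

# Elfear's Fury X Fire Strike graphics scores vs core clock
clock_oc   = pct_gain(1125, 1050)    # ~7.1% core clock bump
score_gain = pct_gain(23120, 18767)  # ~23.2% score bump, way above clock scaling

# Bacon1's Fury Nitro run, which tracks the clock much more closely
nitro_clock = pct_gain(1100, 1000)   # 10% core OC
nitro_score = pct_gain(16352, 15032) # ~8.8% score gain

print(f"Fury X: {clock_oc:.1f}% clock -> {score_gain:.1f}% score")
print(f"Nitro:  {nitro_clock:.1f}% clock -> {nitro_score:.1f}% score")
```

A roughly 1:1 (or slightly sub-linear) score-to-clock ratio like the Nitro run is what you'd expect from a core OC alone, which is why a 23% score gain from a 7% clock bump points at something else (memory OC, driver, or settings) in the first result.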
 
  • Like
Reactions: Phynaz