Vega FE in hand - info source

DaQuteness · Senior member · Joined Mar 6, 2008 · #1
I've just received the Vega FE. If anyone is curious about performance or has specific tests or software in mind, now's the time to ask away; this is about as close as you'll get to fast responses to Vega FE related questions. I have never mined a day in my life, so I can't give reliable answers on that front, though I can try if need be.

Now, this is a purely informative thread. It's for people looking for information, like I've been, who got fed up with all the shit opinions spewed all over the web. It's not a place for fanboys, banter or criticism; pure, informed opinions and requests are what I hope will come out of this.

I got the FE for £900, which is not bad considering it's a Pro card. And before you say I'm an AMD fanboy: I had Intel CPUs until Ryzen came out, and my main card is an EVGA 1080Ti FTW3 that overclocks to high heaven and was £130 cheaper. However, in pro apps (Catia, Revit, Maya, to name a few I've used), it sucks big time, and the equivalent card from the green team has an extra digit on the price. Even more incentive to buy the FE: if it's a fast renderer, and I believe it is faster than the FTW3 (I'll confirm that soon), the time saved pays the card back within one or two projects.

Having said all that, I'm open to any questions or curiosities anyone might have regarding the Vega FE :)

I'll also have a shot at undervolting, just FYI.

I'll be posting benchmarks as I get results starting tomorrow.
 
Joined Nov 29, 2006 · #2
I'd like to see Rendering performance in Adobe Premiere Pro CC and Adobe After Effects.
 

DaQuteness · Senior member · Joined Mar 6, 2008 · #3
Apologies for taking so long to respond, but there's a good reason for that.

Vega FE seems to be in a bit of a limbo at the moment in terms of drivers, and I'm unable to get consistent results because I was limited to the inaugural 17.6 driver. The 17.8.2 beta driver for the FE was bugged to hell, so I just let it go, and with 17.6 WattMan is not stable, the clocks are not stable, and for the same settings I could see as much as 20% variance in results between restarts. But I can tell you this much: it will outperform the 1080Ti in terms of compute power; it already did so consistently in Luxmark, even with shoddy drivers.

Anyway, the few tests I've managed to get through are summed up here:

----------------- 17.8.2 FE Beta6 Drivers

Luxmark 3.1 - 4886 Hotel / 16336 Mic / 22505 Ball
Blenchmark - 01m:21s - horrible...

------------------ 17.6 FE Drivers

Luxmark 3.1 - 4717 Hotel / 16205 Mic / 23940 Ball
Blenchmark - 32.7s (1080Ti FTW3 - 52s) / 26.8s @512x512 tile size (EVGA 1080Ti FTW3 - 24s)

------------------ 17.6 FE Drivers, Undervolted, +50% Power

Luxmark 3.1 - 5695 Hotel / 19359 Mic / 29525 Ball (for reference, my FTW3 only managed ~21k on the Ball scene)
Blenchmark - 32.7s default tile size / 24.2s @512x512 tile size (matching my EVGA 1080Ti FTW3 render times)

It really does depend immensely on the type of workload. Optimise what you're working on a bit and you get at least 1080Ti-level performance in non-gaming environments. Another thing I've noticed is that undervolting while allowing the power limit to go up increases top-end performance.
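For the curious, here's how the undervolt gains on 17.6 work out in percentage terms; a quick sketch, with the Luxmark numbers taken straight from the runs above:

```python
# Stock vs undervolted (+50% power) Luxmark 3.1 scores on the 17.6 driver.
stock = {"Hotel": 4717, "Mic": 16205, "Ball": 23940}
undervolted = {"Hotel": 5695, "Mic": 19359, "Ball": 29525}

for scene in stock:
    gain = (undervolted[scene] / stock[scene] - 1) * 100
    print(f"{scene}: +{gain:.1f}%")
# Roughly +20.7% Hotel, +19.5% Mic, +23.3% Ball - a sizeable free lunch.
```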

The new drivers, when they come out, will act more like a hub for switching between PRO and RX drivers, allowing for one Pro driver and two RX drivers (if you wanted a stable and a beta at the same time, for instance).

I'll post updated results for those interested in the FE when the new drivers come out; for now I've parked the card due to the performance instability.

Cheers!
V.
 

Soulkeeper · Diamond Member · Joined Nov 23, 2001 · #5
I just wish undervolting/underclocking were possible in Linux.
A custom BIOS being supported would be ideal, or at least an app or sysfs file for setting BOTH options would be nice.

Until then my Vega FE will continue to sit, unused, on a shelf collecting dust.

It hit 110°F in my area today... this card isn't even stable on days when it hits 80°F. Insane stuff.
 

DaQuteness · Senior member · Joined Mar 6, 2008 · #6
A new set of drivers should be out on the 13th of this month, with the launch of the WX 9100 and SSG. Maybe then things will change. I know a few people in the same boat as us, basically wanting their FE to shred charts but getting absolutely shite driver support. Give it a few weeks until the new drivers come out and features hopefully get enabled; I'm using my 1080Ti FTW3 in the meantime. Having a £900 brick is not much fun.
 

DaQuteness · Senior member · Joined Mar 6, 2008 · #7
Not many people seem to care about the FE anymore, so I'm going to leave some words here for those who do, either on the forum or externally. I honestly don't care if this thread gets locked after this opinion exposé.

Also, before calling me an AMD fanboy or whatever you want, consider that my daily driver is an EVGA 1080Ti FTW3, and Ryzen is the first AMD CPU I've had since the first Sempron series... This is just my summed-up opinion on AMD, and specifically on the Frontier Edition.

Right... so, for the irrationally ambitious among us who purchased an FE without working in a sterile dungeon research facility: I was on the AMD community forum yesterday chatting with someone, and we both realised that the changes coming in the new driver are not completely in line with what was presented in the first driver.

Only three Vega-based cards will be supported by this new update: the FE, the SSG and the WX 9100. Now, with the exception of the FE, the other two are decisively NOT gaming cards (neither is the FE, yet it performs at least as well as the 64 when drivers were equal, and the PCBs are nearly identical), and therefore this update will not include the single most beneficial tool for boosting a card's efficiency: WattMan. Yep, that's right: everything else from the RX side is there, but no out-of-the-box OC solution for the FE. I guess that's down to the other two, which are meant to be 100% stable above all else, since they'll be worked 24/7 in a pro environment. Maybe WattTool will still work on the FE; we'll wait and see.

Having said that, Gaming Mode is now replaced by "Driver Options". So instead of actually switching to a full-on RX driver, in reality you'd likely only load the specific parts that make the FE compatible with games. I won't say "optimised for gaming", since we know damn well it's at least as good as an RX 64, yet we're being robbed of WattMan, which would ultimately have offered increased efficiency. AMD have decided to take away the possibility of turning the FE into a better card. Brilliant...

But there is a silver lining: since the FE is part of the Pro series, it will benefit from increased support, for a longer period of time and theoretically at a higher cadence, and hopefully it will be included in the quarterly enterprise drivers, since it's in the same bag as the WX 9100 and SSG.

I'm getting a bit tired of this whole driver mess, but I have to admit I won't return the FE despite the stupid price tag, and here's why. First of all, and I want to get this out of the way: I just honestly love the design of it and love the bipolar pro/gaming attitude, even considering the disappointing, sub-par initial performance due to running old (I assume Fiji-derived) drivers on Vega. Despite the stupid amount of heat, AMD have actually pulled off something interesting this year: they've taken a good hard swing at Nvidia and Intel. Not enough to rob them of their crowns, by any means, but still a punch from the depths, aimed very, very high!

Look at the bigger picture: if AMD refine their production and driver support, Nvidia will have a problem from the mid-range downwards, and Intel already feels the burn from the new Zen architecture, which offers far and away the best price/performance ratio out there along with excellent support (the same motherboard will last you at least a couple of generations of CPUs). And while Ryzen got off to a stuttering start, it is now a genuinely capable contender for content creators.

I'm not defending AMD; make no mistake, Vega was a tragic launch to say the very least, and Volta will make it look pathetic in gaming performance, that's for sure. And to put the final nail in the coffin, it doesn't live up to the expectations of Vega's core audience: gamers. Actually, apart from brand loyalty, there's no reason at all to buy Vega purely for gaming. None... not at the current price tag, anyway!

But the FE itself offers something no other company has done before: gaming AND pro benefits on the same card. It's an interesting concept that I really think will prove invaluable for creative professionals such as myself, and for that alone I think it's worth keeping. Before some of you get annoyed and invoke the "placebo mode" described by Gamers Nexus: the next iteration should be a different animal, we'll just have to wait and see. Even AMD sort of admit it was poor in the features chart on the Crimson Pro main page. Not to mention the card has quite a few extremely important features disabled which, once enabled, should further boost performance; more on that in the Vega whitepaper.

The part most FE owners seem to miss is its ultimate Pro aspect: it shares the same architecture and instruction features as the Instinct series, meaning you can prototype a GPGPU or AI workload on it (playing fast and loose with the term AI here) and then mass-deploy on Instinct compute frameworks, which is invaluable for research scalability in sectors like medicine and finance. All while having a decent gaming GPU.

If all you're looking for is GPU rendering, go for the RX series... the 64, if your scene is correctly optimised, will top any other card at the same price with the same settings, guaranteed, even at inflated prices. Just as a reference, the Blender BMW27 scene was rendered in 1m10s by 4x GTX 1080 (since that's what it's being pitted against)... and in 59s by only two RX 64s (tested by members of the official AMD community forum).

All in all, for gamers specifically, Vega is somewhat of a failure at its current price point, except maybe the 56, and honestly redundant compared to upcoming Volta. For content creators and developers/researchers, however, it is actually extremely appealing. I'm happy there is finally competition and innovation in my sector, and that's why Vega is such an important landmark despite its failed first steps. These are good times to build systems.

Again, I'm not trying to convince myself or anybody else of anything. Just laying down my thoughts in this forum, that's all.

Sorry for the long post, here's a few potato-quality pics of the FE :)

 
Joined Apr 27, 2000 · #8
If Vega FE has access to the Pro feature stack, does that mean it gets 1/2 rate FP64 using the Pro drivers?
 

tential · Diamond Member · Joined May 13, 2008 · #9
What makes this GPU good for AI? I'm just starting to explore this and am curious. I thought you could use a vega gaming GPU to the same effect anyway?
 

DaQuteness · Senior member · Joined Mar 6, 2008 · #10
tential said:
What makes this GPU good for AI? I'm just starting to explore this and am curious. I thought you could use a Vega gaming GPU to the same effect anyway?
There's no straight answer to this, but the best I can offer is that its compute pipeline and massive memory make it more suitable for deep learning. I'm personally not an expert in GPU architecture or deep learning, although I'm deeply interested in and fascinated by both.

Vega FE is a weird and heavily debated card because it aims for almost all capabilities at once: gaming, production, CAD, machine learning. It has PRO/WX-series capabilities but is also the stepping stone to the Instinct series. To an extent it shares the same architecture, instruction sets and compute pipeline, all geared towards fast, complex maths operations, with the large memory and bandwidth needed by algorithms that process data through interconnected nodes (hence the terms "neural network" and "deep learning"). For the most part, this field's use of GPUs is heavily shaped by the drivers and which instructions they expose, which is why a massive amount of deep-learning material is based on CUDA, while Instinct is brand new and uses the ROCm platform on Linux.

You CAN use almost any commercially available dedicated GPU with CUDA or OpenCL support to mess about with deep learning, but performance will limit you as complexity increases, or your simulations may not run at all depending on what you're trying to achieve. Gaming GPUs are built for read-write in a much more linear fashion and have less capacity: here's the memory, fetch what you need, and spit out a result based on the user's input. In a neural network, just like in our brains, data travels A LOT before producing an output, and the network gets better at recognising features with each training session; it memorises traits and stores them in long-term memory after filtering everything through short-term memory, hence the massive memory requirements of this field. Deep learning is an amazing field to look into, but you have to be extremely dedicated and geared towards some funky maths.
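To make that "data travels a lot before producing a result" point concrete, here's a toy two-layer network learning XOR in plain NumPy. It's a hypothetical illustration of the forward/backward training loop, nothing Vega-specific; real workloads run the same maths on the GPU through CUDA, OpenCL or ROCm:

```python
import numpy as np

# Tiny two-layer network learning XOR: forward pass, error, weight update.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8))   # input -> hidden weights
W2 = rng.normal(size=(8, 1))   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward: data flows through the layers.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # Backward: the error propagates back and adjusts the weights,
    # which is the "training" that stores what the network has learned.
    err = y - out
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 += h.T @ d_out
    W1 += X.T @ d_h

print(np.round(out).ravel())  # after training, close to [0, 1, 1, 0]
```

Every step is a matrix multiply, which is exactly the kind of massively parallel arithmetic these cards are built for; scale the layers and data up a few orders of magnitude and the memory demands explain the FE's 16 GB of HBM2.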

If you want to learn more about deep learning try this for starters https://uk.mathworks.com/help/nnet/deep-learning-training-from-scratch.html

I hope this helps a bit, sorry for the long and delayed answer :)
 

DaQuteness · Senior member · Joined Mar 6, 2008 · #11
Quoting #8: If Vega FE has access to the Pro feature stack, does that mean it gets 1/2 rate FP64 using the Pro drivers?
It has 1/16 rate from what AMD have said, although I've only found it mentioned here: http://techreport.com/news/32163/updated-radeon-vega-frontier-edition-launches-today-for-999-and-up
I don't know where they got that from, as I haven't had time to look it up. It seems fairly low to me, and I'm willing to bet that rate will go up with better drivers, which seem to be taking forever to arrive. I am STILL on 17.6, which is getting a bit frustrating.
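As a back-of-the-envelope check of what that 1/16 figure means (assuming the FE's 4096 stream processors and ~1.6 GHz boost clock, with 2 FLOPs per SP per clock via fused multiply-add):

```python
# Rough theoretical throughput for Vega FE.
sp = 4096          # stream processors
clock_ghz = 1.6    # approximate boost clock

fp32_tflops = sp * 2 * clock_ghz / 1000   # ~13.1 TFLOPS single precision
fp64_at_1_16 = fp32_tflops / 16           # ~0.82 TFLOPS at the reported 1/16 rate
fp64_at_1_2 = fp32_tflops / 2             # ~6.6 TFLOPS if 1/2 rate were enabled

print(f"FP32:        {fp32_tflops:.1f} TFLOPS")
print(f"FP64 @ 1/16: {fp64_at_1_16:.2f} TFLOPS")
print(f"FP64 @ 1/2:  {fp64_at_1_2:.1f} TFLOPS")
```

So at 1/16 the FE's double-precision sits under 1 TFLOPS, which is why the rate matters so much for anyone eyeing it for scientific compute.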
 
