AMD Vega (FE and RX) Benchmarks [Updated Aug 10 - RX Vega 64 Unboxing]


chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
That is just silly, hyperbole extravaganza spam. You can have a GPU that consumes 1000 W and stays cool if you have beefy enough cooling.

Sure, more hot air is produced, but again it's not going to make any real difference to the ambient temperature of the room. For example, 350 to 400 W is nothing, especially with effective cooling. You need a 1.5 kW heater to heat a 5x5 room properly in the winter, and that's a freaking heater, designed to heat.

Wattage only matters when looking at the electric bill. 100 W isn't even a big difference at all, especially for a GPU that won't be at 99-100% load most of the time.

So a difference of 100 W is insignificant in every single way, especially when we are talking about, say, 300 W vs. 400 W. If it's, say, 100 W vs. 200 W, you have to take into account the PSU, the cables, whether it has a 6-pin or 8-pin connector, etc.

Sorry, you have NO clue what you're talking about. 1000 W will keep a room cool with beefy cooling? LOL? Where does that 1000 W go, then? It doesn't matter how cool you keep your GPU (other than slightly lower power draw the cooler it runs): whether it's using 200 W, 400 W, or 1000 W, that energy converts to heat, and just about every bit of that heat goes into the room. Period. Sure, 50-100 W isn't much of a difference, but every bit of it still goes to heating up the room. The only "insignificance," if you can call it that, is to the power bill.
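For anyone doubting the physics, here's a minimal back-of-envelope sketch of the sealed-room upper bound (all figures illustrative; real rooms bleed heat into walls, furniture, and the rest of the house, so actual rises are far smaller):

```python
# Sealed-box upper bound: every watt the PC draws ends up as heat in the
# room air. Ignores walls, furniture, and leakage, all of which soak up
# heat in reality, so this wildly overstates the real rise; the point is
# only that the heat has nowhere else to go.

AIR_DENSITY_KG_M3 = 1.2      # air at roughly 20 C
AIR_CP_J_PER_KG_K = 1005.0   # specific heat of air

def sealed_room_temp_rise_c(power_w, hours, room_volume_m3):
    air_mass_kg = AIR_DENSITY_KG_M3 * room_volume_m3
    heat_j = power_w * hours * 3600.0  # all PC power becomes heat
    return heat_j / (air_mass_kg * AIR_CP_J_PER_KG_K)

room_m3 = 3.0 * 3.7 * 2.4  # about a 10x12 ft room with an 8 ft ceiling
print(f"~{sealed_room_temp_rise_c(400, 1, room_m3):.0f} C rise in 1 h (idealized)")
```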
 

Technotronic

Junior Member
Jul 12, 2017
23
78
41
Because of AVFS (Adaptive Voltage & Frequency Scaling, new with Polaris) and ACG (Advanced Clock Gating, new with Vega). ACG is completely disabled as of now, and AVFS isn't working properly.

Coming from a guy who works for AMD.
https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11421717#post11421717

Translation:
https://translate.google.com/translate?hl=de&sl=de&tl=en&u=https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11421717#post11421717
Is everyone just ignoring this? More and more leads point to Vega FE having some major features disabled for now.
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,012
136
The alleged leaks for RX had the same memory clock speeds, so AMD may be cranking the voltage just to ensure all of the launch parts are stable. They overvolted Polaris at launch, so we may see the same here. I think we'll see a lot of people being able to undervolt and get performance improvements once again.
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
The alleged leaks for RX had the same memory clock speeds, so AMD may be cranking the voltage just to ensure all of the launch parts are stable. They overvolted Polaris at launch, so we may see the same here. I think we'll see a lot of people being able to undervolt and get performance improvements once again.

Yeah, I mentioned this elsewhere, I think.

I have a strong suspicion most cards will be able to undervolt a bit at stock speeds.

Hell, my Fury took -60 mV at stock, my RX 480 did -50 mV on its stock OC, and my RX 470 can take -75 mV from stock.
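For a rough sense of why small undervolts pay off: at a fixed clock, dynamic CMOS power scales roughly with voltage squared (P ≈ C·V²·f). A sketch with placeholder stock figures (the 250 W / 1.15 V below are made up, not measured Polaris/Vega numbers):

```python
# Rough CMOS scaling: at a fixed clock, dynamic power ~ C * V^2 * f, so a
# small undervolt cuts power roughly quadratically. Leakage current falls
# even faster with voltage, so if anything this underestimates the savings.

def undervolted_power_w(stock_w, stock_v, delta_mv):
    new_v = stock_v - delta_mv / 1000.0
    return stock_w * (new_v / stock_v) ** 2

for delta_mv in (50, 60, 75):  # the offsets reported above
    est = undervolted_power_w(250.0, 1.15, delta_mv)
    print(f"-{delta_mv} mV: ~{est:.0f} W (from 250 W at 1.15 V)")
```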
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
That is just silly, hyperbole extravaganza spam. You can have a GPU that consumes 1000 W and stays cool if you have beefy enough cooling.

Sure, more hot air is produced, but again it's not going to make any real difference to the ambient temperature of the room. For example, 350 to 400 W is nothing, especially with effective cooling. You need a 1.5 kW heater to heat a 5x5 room properly in the winter, and that's a freaking heater, designed to heat.

Wattage only matters when looking at the electric bill. 100 W isn't even a big difference at all, especially for a GPU that won't be at 99-100% load most of the time.

So a difference of 100 W is insignificant in every single way, especially when we are talking about, say, 300 W vs. 400 W. If it's, say, 100 W vs. 200 W, you have to take into account the PSU, the cables, whether it has a 6-pin or 8-pin connector, etc.

Sorry, you have NO clue what you're talking about. 1000 W will keep a room cool with beefy cooling? LOL? Where does that 1000 W go, then? It doesn't matter how cool you keep your GPU (other than slightly lower power draw the cooler it runs): whether it's using 200 W, 400 W, or 1000 W, that energy converts to heat, and just about every bit of that heat goes into the room. Period. Sure, 50-100 W isn't much of a difference, but every bit of it still goes to heating up the room. The only "insignificance," if you can call it that, is to the power bill.

Exactly. In conditions where you are not actively cooling the room, the power difference is rather minimal. However, in the summer with the AC on, the effective power difference can be astronomical.

Your typical AC unit will at best run at around 30-35% of Carnot efficiency. Its effective efficiency will generally be quite a bit lower (you are not really exhausting the hot air efficiently); removing 200 W of heat can easily take a factor of 3-4x more energy from the AC.
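To put rough numbers on it, the usual way to cost this out is the AC's COP (watts of heat moved per watt of electricity). A sketch with assumed, typical COP values:

```python
# Wall-power cost of an extra 200 W of GPU heat once the AC has to pump it
# back outside. COP = heat removed per watt of electricity; 2.5-3.5 is a
# typical nameplate range for window/central units.

def total_wall_power_w(extra_heat_w, cop):
    ac_power_w = extra_heat_w / cop   # electricity the AC spends removing it
    return extra_heat_w + ac_power_w  # PC draw plus the AC's extra draw

for cop in (2.5, 3.0, 3.5):
    total = total_wall_power_w(200.0, cop)
    print(f"COP {cop}: 200 W of heat -> ~{total:.0f} W at the wall")
```

At nameplate COPs the overhead works out to tens of percent rather than multiples; a unit that is poorly ducted or fighting humidity will do worse, which is where the bigger effective factors come from.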
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,012
136
25% less or 33% more, depending on which card you use as the base.

That's why I think it's better to say one has 75% of the resources of the other instead of saying 25% fewer. It makes it far clearer what kind of arithmetic is being performed, and people don't get as mixed up.
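A quick sanity check of the two framings (the 4096 vs. 3072 counts below are just illustrative, not confirmed Vega configurations):

```python
# The same pair of cards yields 25%, 33%, or 75% depending on the base.
full, cut = 4096, 3072  # e.g. shader counts; illustrative numbers only

print(f"cut card has {cut / full:.0%} of the full card's resources")      # 75%
print(f"cut card has {(full - cut) / full:.0%} fewer than the full one")  # 25%
print(f"full card has {(full - cut) / cut:.0%} more than the cut one")    # 33%
```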
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
That is just silly, hyperbole extravaganza spam. You can have a GPU that consumes 1000 W and stays cool if you have beefy enough cooling.

Sure, more hot air is produced, but again it's not going to make any real difference to the ambient temperature of the room. For example, 350 to 400 W is nothing, especially with effective cooling. You need a 1.5 kW heater to heat a 5x5 room properly in the winter, and that's a freaking heater, designed to heat.

I beg to differ. In a 10x12 room, after an hour or two that room would be 4-5+ degrees Fahrenheit warmer with my old overclocked GTX 460 and Core 2 Duo running full tilt for that duration.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I beg to differ. In a 10x12 room, after an hour or two that room would be 4-5+ degrees Fahrenheit warmer with my old overclocked GTX 460 and Core 2 Duo running full tilt for that duration.

Depends on insulation. On weekends in winter (in Canada) I could leave my apartment heat off because I was running my computer (and lights, and TV, and stereo) all day.
 

Technotronic

Junior Member
Jul 12, 2017
23
78
41
SK Hynix claims 1.2 V for 4-Hi stacks only. Vega FE uses 8-Hi stacks... more stacks/layers obviously need more voltage.
Everyone's too busy arguing about the efficiency of air conditioners to discuss real points like this.

No one expected AMD to offer 16 GB from two stacks, so it's a reasonable assumption that these stacks may not be very efficient. Don't we typically see lower-capacity memory chips clock better? It's very possible the 8 GB RX Vega has faster memory.
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,012
136
No one expected AMD to offer 16 GB from two stacks, so it's a reasonable assumption that these stacks may not be very efficient. Don't we typically see lower-capacity memory chips clock better? It's very possible the 8 GB RX Vega has faster memory.

That's a fair assumption; it's just that the supposed leaks put it at the same ~940 MHz memory clock as the Frontier Edition. This was based on some 3DMark11 scores for an unknown AMD card, so there may be some credibility to it. Of course, this doesn't necessarily mean it's final hardware, but it's probably getting closer to that.
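If the ~940 MHz figure is real, the bandwidth arithmetic for two HBM2 stacks is straightforward (1024-bit bus per stack, two transfers per clock):

```python
# HBM2 bandwidth from the leaked clock. 940 MHz is the leaked figure
# discussed above; the stack count and bus width are standard HBM2.

STACKS = 2
BITS_PER_STACK = 1024

def hbm2_bandwidth_gb_s(clock_mhz):
    bus_bytes = STACKS * BITS_PER_STACK / 8  # 256 bytes per transfer
    transfers_per_s = clock_mhz * 1e6 * 2    # double data rate
    return bus_bytes * transfers_per_s / 1e9

print(f"~{hbm2_bandwidth_gb_s(940):.0f} GB/s")  # ~481 GB/s
```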
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Why is that?
Are you in the camp that thinks there is new silicon for RX?

New silicon? Not a new chip, but they could disable something in Vega FE that RX Vega does not need.

My thinking, though, was that the power management in the chip wasn't functioning properly for games.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
My 980 Ti pulling 300 W+ right now in the middle of summer in an upstairs bedroom already gets my ambient room temps up over 80°F after a few hours of gaming. I can't imagine having any more power than that; I would literally melt. This is of course in south Texas, where we have summers well over 100°F for weeks at a time.

I was okay with higher power draw at 1080 Ti-level performance because it would still be a perf/watt improvement over my 980 Ti, but if RX is going to hit 400 W at 1600 MHz, it's a no sale now.

I too have started to get really annoyed with heat. Aftermarket 1080 Tis are power-hogging furnaces (a big reason I didn't want one), and Vega looks to be worse. The non-Ti 1080 loses some efficiency with factory OCs, but at stock reference clocks the difference between Vega and the 1080 is going to be stark.
 

chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
Everyone's too busy arguing about the efficiency of air conditioners to discuss real points like this.

No one expected AMD to offer 16 GB from two stacks, so it's a reasonable assumption that these stacks may not be very efficient. Don't we typically see lower-capacity memory chips clock better? It's very possible the 8 GB RX Vega has faster memory.

Not really. People are being disingenuous, claiming hot computers don't heat up rooms. Just helping to clear up that misconception; it's relevant here.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,332
4,926
136
I just updated the first post with some stuff I felt was worth updating. Any other solid reviews/benchmarks I should add?

Thanks for doing this, OP. It's nice to have a member curating their thread with timely updates and keeping it as bias-free as possible.
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
Thanks, guys, saved me the trouble of responding. I know it's difficult for some people to understand that climate can vary wildly depending on where you live, but central AC does not keep up with the heat output of my PC (overclocked 4770K and 980 Ti @ 1500 MHz) combined with the 104°F+ temps we've had in July. I've basically had to quit gaming during the day because the room becomes unbearably hot after about 45 minutes.

Back to the whole point of my post: 300 W is my hard limit on a GPU, because I don't want to have to buy a window AC unit for my room just to be able to drive higher frame rates. I was willing to deal with a little more power at 1080 Ti levels of performance because I would be able to cap FPS at 60 in some of the games I play (WoW, Hearthstone, HotS, etc.) and keep the heat generation under control, but if RX Vega is going to need 400 W and LN2 to sustain 1600 MHz like FE does, it's a no-go. I would pay $200+ more for a G-Sync and GTX setup to NOT have to deal with insane power consumption. First-world problems and all, and I may very well end up getting a window unit for my room anyway, because has anyone here ever tried to sleep in a room that's 85°F+ at night? It's awful.
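For what it's worth, sizing a window unit against a gaming PC is simple arithmetic, since AC capacity is quoted in BTU/h and the PC's whole draw shows up as heat load (the component draws below are illustrative):

```python
# 1 W of continuous electrical draw = 3.412 BTU/h of heat load.
W_TO_BTU_PER_H = 3.412

def heat_load_btu_per_h(*component_watts):
    return sum(component_watts) * W_TO_BTU_PER_H

# Illustrative: ~300 W GPU plus ~150 W for the rest of the system.
print(f"~{heat_load_btu_per_h(300, 150):.0f} BTU/h of extra load")  # ~1500
```

Even a small 5,000 BTU/h window unit has that much headroom on top of the room's normal load, which is why a window unit handles it easily.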
 

OatisCampbell

Senior member
Jun 26, 2013
302
83
101
That is just silly, hyperbole extravaganza spam. You can have a GPU that consumes 1000 W and stays cool if you have beefy enough cooling.

Sure, more hot air is produced, but again it's not going to make any real difference to the ambient temperature of the room. For example, 350 to 400 W is nothing, especially with effective cooling. You need a 1.5 kW heater to heat a 5x5 room properly in the winter, and that's a freaking heater, designed to heat.

Wattage only matters when looking at the electric bill. 100 W isn't even a big difference at all, especially for a GPU that won't be at 99-100% load most of the time.

So a difference of 100 W is insignificant in every single way, especially when we are talking about, say, 300 W vs. 400 W. If it's, say, 100 W vs. 200 W, you have to take into account the PSU, the cables, whether it has a 6-pin or 8-pin connector, etc.

I've never been a big "power and efficiency" person for desktop GPUs either.

Same performance but 50 W more and $50 less? I'm probably buying the $50-less card.

I have a big case and don't mine; the CPU is on an AIO and only doing normal boost (no OC).

It's sort of disheartening that Vega is showing less performance than the much lower-power 1080, because that doesn't bode well for the APU and mobile markets, but I suppose it's to be expected, as AMD just doesn't have NV's R&D budget.
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
I've never been a big "power and efficiency" person for desktop GPUs either.

Same performance but 50 W more and $50 less? I'm probably buying the $50-less card.

I have a big case and don't mine; the CPU is on an AIO and only doing normal boost (no OC).

It's sort of disheartening that Vega is showing less performance than the much lower-power 1080, because that doesn't bode well for the APU and mobile markets, but I suppose it's to be expected, as AMD just doesn't have NV's R&D budget.

I would not worry about the power for APUs. They downclock those heavily, and the process AMD has these chips on runs very efficiently at lower clocks.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Thanks, guys, saved me the trouble of responding. I know it's difficult for some people to understand that climate can vary wildly depending on where you live, but central AC does not keep up with the heat output of my PC (overclocked 4770K and 980 Ti @ 1500 MHz) combined with the 104°F+ temps we've had in July. I've basically had to quit gaming during the day because the room becomes unbearably hot after about 45 minutes.

Back to the whole point of my post: 300 W is my hard limit on a GPU, because I don't want to have to buy a window AC unit for my room just to be able to drive higher frame rates. I was willing to deal with a little more power at 1080 Ti levels of performance because I would be able to cap FPS at 60 in some of the games I play (WoW, Hearthstone, HotS, etc.) and keep the heat generation under control, but if RX Vega is going to need 400 W and LN2 to sustain 1600 MHz like FE does, it's a no-go. I would pay $200+ more for a G-Sync and GTX setup to NOT have to deal with insane power consumption. First-world problems and all, and I may very well end up getting a window unit for my room anyway, because has anyone here ever tried to sleep in a room that's 85°F+ at night? It's awful.

[image: c6pbWxc.png]

You should be fine with at least some version of Vega, unless this information is completely off.
 

Technotronic

Junior Member
Jul 12, 2017
23
78
41
[image: c6pbWxc.png]

You should be fine with at least some version of Vega, unless this information is completely off.

Yes, this. It's become obvious that whatever design frequency a chip targets on 14nm, it typically has a sweet spot. This conversation will get a lot more exciting once we know actual RX Vega performance and can start to discuss perf/watt. For now, these abstract power consumption figures are basically meaningless to me.
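Agreed that perf/watt is the number that matters, and it's a single division once both operands exist. A placeholder sketch to fill in when RX Vega results land (the cards and figures below are hypothetical):

```python
# Perf/watt needs both a benchmark result and a measured board power.
def perf_per_watt(avg_fps, board_power_w):
    return avg_fps / board_power_w

# Hypothetical entries; replace with real RX Vega / GTX 1080 measurements.
cards = {"Card A": (100.0, 180.0), "Card B": (110.0, 300.0)}
for name, (fps, watts) in cards.items():
    print(f"{name}: {perf_per_watt(fps, watts):.2f} fps/W")
```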
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
I'm still holding out hope for the miracle of FE being different silicon from RX Vega. The money saved from going Vega and FreeSync would pay for my window unit after all, if power consumption isn't insane. :)
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
I'm still holding out hope for the miracle of FE being different silicon from RX Vega. The money saved from going Vega and FreeSync would pay for my window unit after all, if power consumption isn't insane. :)

It doesn't need to be different silicon to behave differently. How hardware behaves comes down to the configuration of the hardware as well as the software controlling it.
 