Review AMD Radeon VII review and availability thread


Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
Availability
------------------------------------------------------------------------------------------------------------------
As of the time of this post, in stock at Newegg (edit: now sold out).

Available straight from AMD.



Written Reviews
------------------------------------------------------------------------------------------------------------------
Techspot
See post #2 for more

Video Reviews
-------------------------------------------------------------------------------------------------------------------
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,630
136
It’s funny how the situations have reversed. AMD was raising margins on their GPUs, which were competitive, because they were getting thrashed in CPUs and needed the GPU revenue to help keep them afloat.

Now it’s the CPU division that is doing well and needs to help make up the slack for the graphics division. Maybe they’ll eventually get both cylinders firing at the same time. If nothing else it means they’ll have some really good APUs.
 

insertcarehere

Senior member
Jan 17, 2013
712
701
136
As a reminder, here are JHH's comments on the Radeon VII:

https://www.techpowerup.com/251400/...the-performance-is-lousy-freesync-doesnt-work

"The performance is lousy and there's nothing new. [There's] no ray tracing, no AI. It's 7nm with HBM memory that barely keeps up with a 2080"

He was exactly right: AMD had a big process-node advantage over RTX and still loses perf/watt badly to RTX (and to the GTX 10-series fabbed on 16nm). This is a very underwhelming product.
 

DrMrLordX

Lifer
Apr 27, 2000
22,692
12,637
136
Meanwhile, the card is sold out even at AMD.com, and they are selling for $999+ on eBay. Hmm! Oh well.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I wonder when AMD will stop lying to us? They showed fake benchmarks again where Vega beats the RTX 2080, when in reality it is slower than a GTX 1080 Ti :rolleyes:
https://www.computerbase.de/2019-02/amd-radeon-vii-test/3/

Are you trolling or what ??

Personal attacks are not allowed in the tech forums.
This includes calling others "trolls" or saying they
are "trolling".


AT Mod Usandthem




From your own link

[ComputerBase benchmark chart]



And for the other two games that are on the AMD list above

https://www.guru3d.com/articles_pages/amd_radeon_vii_16_gb_review,15.html
[Guru3D benchmark chart]


and

[second Guru3D benchmark chart]
 
Last edited by a moderator:

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Yes, by normal marketing standards that pretty much qualifies as stunning truth-telling :) Insane amounts of cherry-picking, but that's more or less par for the course.
 

vissarix

Senior member
Jun 12, 2015
297
96
101
False quote removed

Nice try, dude, but AMD was trying to show this GPU as faster than the RTX 2080; if you read this forum, everyone thought it was going to be faster based on those cherry-picked benchmarks. In the end you need to overclock it just to beat a two-year-old GTX 1080 Ti, which can be overclocked as well and beat this GPU once again ;)


You really need to learn to follow the
forum posting guidelines. You do not
falsely quote a user in the tech forums.
I suggest you re-read the rules while
you have some free time.


https://forums.anandtech.com/threads/anandtech-forum-guidelines.60552/

AT Mod Usandthem
 
Last edited by a moderator:

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Nice try, dude, but AMD was trying to show this GPU as faster than the RTX 2080; if you read this forum, everyone thought it was going to be faster based on those cherry-picked benchmarks. In the end you need to overclock it just to beat a two-year-old GTX 1080 Ti, which can be overclocked as well and beat this GPU once again ;)
I think you could just chill out a bit before you bash other people's opinion in caps lock next time
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
It’s funny how the situations have reversed. AMD was raising margins on their GPUs, which were competitive, because they were getting thrashed in CPUs and needed the GPU revenue to help keep them afloat.

Now it’s the CPU division that is doing well and needs to help make up the slack for the graphics division. Maybe they’ll eventually get both cylinders firing at the same time. If nothing else it means they’ll have some really good APUs.

The unfortunate truth seems to be that since AMD bought out ATI, their CPU and GPU departments have never both been operating at peak efficiency at the same time.

2006-2009 they were well behind Intel on the CPU front and Nvidia on the GPU front.

2010-2011 was probably the nearest we got to their CPU and GPU sides both kicking ass, as there was virtually zero reason to buy a Fermi over a 5800/6900-series card unless you needed the compute power that CUDA could offer, and the Phenom II X6 was a pretty solid CPU even if it couldn't quite hang with the hex-core i7s.

2012-2016 was when their CPU side was really floundering with Bulldozer and its descendants. The GPU side didn't have quite so clear-cut an advantage against Kepler and Maxwell, but the former's sometimes finicky drivers and the latter's abysmal DX12 performance kept AMD the #1 choice of most enthusiasts, even if Fiji was a bit of a letdown. Plus, there were two big design wins with the PS4 and Xbox One (and hey, I'm sure they earned enough from the Wii U for a decent office party or two).

2017 to present has seen their CPU side rebound spectacularly, but their GPU side has stumbled quite badly, with their top-end chips going from competing with the 980 Ti to only just managing to beat the 1070 and 2070.
 

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
And it's here already:
[photo of the PowerColor Radeon VII]


Compared to my previous Sapphire Nitro Vega 64:
  • It's a much more reasonable size (which is unsurprising, given that the Sapphire is the most hugenormousest card ever devised by human kind)
  • It uses about the same amount of power.
  • The performance increase is about what the reviews suggested, although the game I'm currently playing, X4 Foundations, sees much less improvement due to often being CPU limited.
  • It's quiet at idle, even though the fans don't turn off completely like the Sapphire's fans did.
  • It's very loud under load. 3K RPM, yet it still reaches 100°C!
  • Wattman options are sparse. I haven't tried undervolting yet.
  • It has a nice and simple aesthetic.
  • The packaging is almost entirely cardboard, and the outer layer, though printed in full colour, is not glossy. This should make it a little more sustainable than most packaging these days, which is good (This is from PowerColor. Other brands might be different).

Overall it's... adequate. A worthwhile upgrade over AMD's previous cards if you're willing to put up with the state of GPU pricing today, and the noise. At 4K, a 25% increase in performance is very desirable, so I'm happy with it for now.
 

EXCellR8

Diamond Member
Sep 1, 2010
4,039
887
136
Looks nice... not sure I agree with it being a worthwhile upgrade over 14nm Vega though. Still seems to run too hot for my liking and if it's loud, well, ew.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
they should've just slapped an AIO on it and been done with it, like the last two high-end Radeons
 
  • Like
Reactions: Feld

Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
And it's here already:
  • It's very loud under load. 3K RPM, yet it still reaches 100°C!

Wow, it got there fast! Congrats on the new card.

Out of curiosity, the 100C temperature you mentioned, is that GPU temp or junction temp?
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
Can you do UV (-100mV) with a 1700MHz clock? I need to know junction temp / fan RPM, thanks.
 

Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
Junction Temperature as reported by WattMan. The one labelled just Temperature peaked at about 75°C. Room temperature here is 22°C today.

Ok, thanks. Junction temperature is basically a hotspot reading from what I understand, and isn't comparable to what prior AMD cards reported as their temperature. Vega also had a hotspot temperature sensor but never displayed it within WattMan (I believe you can turn it on as an option in GPU-Z). In other words, what the Vega 64 listed as its GPU temperature is what the Radeon VII lists as just Temperature. With that said, the hotspot (or junction) temperature is what the card looks at for throttling (I think 110°C is the limit), but it shouldn't be used to compare against other cards in terms of how hot the Radeon VII gets. If you notice, the vast majority of reviewers just reported the "Temperature" results and not the Junction Temperature results.

(Just trying to add clarity for sake of discussion).
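Put another way, here's a minimal sketch of the throttling behaviour as I understand it (the 110°C limit is my recollection, not an official spec, so treat it as an assumption):

```python
# The card throttles on the junction (hotspot) sensor, not on the
# "edge" Temperature that most reviews quoted.
JUNCTION_THROTTLE_C = 110.0  # assumed throttle point, not an official spec

def throttle_headroom(junction_c: float) -> float:
    """Degrees C of headroom left before hotspot throttling kicks in."""
    return JUNCTION_THROTTLE_C - junction_c

# The readings above: ~75 C edge, ~100 C peak junction.
# Only the junction number matters for throttling.
print(throttle_headroom(100.0))  # 10.0 C of headroom
```

So a 100°C junction reading that looks alarming next to other cards' "GPU temp" numbers still leaves roughly 10°C of margin before the card pulls clocks.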
 
  • Like
Reactions: lightmanek

Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
I use Linux a lot for work and a decent amount for personal use as well, so I'm always interested in any Linux hardware testing. Anyway, reading through Phoronix's review, it looks like if you want to game on Linux with an AMD GPU, the Radeon VII is far and away the card to use, and its Linux gaming competitiveness actually matches its Windows competitiveness. Probably the first time that's ever happened, which is interesting.

It also has a very strong showing in machine learning, compute performance, and performance per watt, but I don't do anything in that area.

https://www.phoronix.com/scan.php?page=article&item=radeon-vii-linux&num=1
 

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
Can you do UV (-100mV) with a 1700MHz clock? I need to know junction temp / fan RPM, thanks.
Now this bodes well for when I come to undervolt the card properly. Bearing in mind that these are quick, rough numbers because I don't want to take the time to do properly controlled tests at the moment...

I dusted off Kingdom Come - Deliverance, since that's a graphically intensive game, and stood still in a random spot in the countryside. I left it for about 20 minutes to warm up before doing anything, then used WattMan to get the average and peak readings over 10 minutes at each setting. FPS is measured with the in-game counter, and power with my plug-in AC power meter.

First row is the default, second is underclocked and undervolted.

Code:
Set Frequency   Set Voltage    Fan Speed (avg/peak)  Junction Temp (avg/peak)  FPS    AC Power
1801MHz         1071mV         2911/2938rpm          109/112°C                 45     368W
1701MHz         970mV          1721/2912rpm          99/109°C                  44     318W

That's a pretty big improvement for an insignificant drop in performance. The difference in performance is small because the actual clock speed wasn't much different between the two.

Although the peak fan speed when undervolted was still nearly 3K, it only ramped up to that speed once. The rest of the time it was at a constant, and dare I say quiet, speed.
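As a rough back-of-the-envelope on what those two rows mean for efficiency (nothing here beyond the table's own numbers):

```python
# Perf-per-watt math from the two rows above:
# stock 45 fps @ 368 W wall power, undervolted 44 fps @ 318 W.
stock_fps, stock_w = 45, 368
uv_fps, uv_w = 44, 318

stock_eff = stock_fps / stock_w   # ~0.122 fps/W
uv_eff = uv_fps / uv_w            # ~0.138 fps/W

power_saved_pct = 100 * (stock_w - uv_w) / stock_w      # ~13.6%
fps_lost_pct = 100 * (stock_fps - uv_fps) / stock_fps   # ~2.2%
eff_gain_pct = 100 * (uv_eff / stock_eff - 1)           # ~13.2%

print(f"{power_saved_pct:.1f}% less wall power for {fps_lost_pct:.1f}% "
      f"fewer fps (~{eff_gain_pct:.1f}% better fps/W)")
```

So the undervolt trades about 2% of the frame rate for roughly 13% less power at the wall, which is why the fan can sit so much lower.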
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,501
1,342
136
It would be nice if they offered a Radeon VII with 8GB of RAM as well. I think people should consider a 120mm AIO cooler for this new card.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Now this bodes well for when I come to undervolt the card properly. Bearing in mind that these are quick, rough numbers because I don't want to take the time to do properly controlled tests at the moment...

I dusted off Kingdom Come - Deliverance, since that's a graphically intensive game, and stood still in a random spot in the countryside. I left it for about 20 minutes to warm up before doing anything, then used WattMan to get the average and peak readings over 10 minutes at each setting. FPS is measured with the in-game counter, and power with my plug-in AC power meter.

First row is the default, second is underclocked and undervolted.

Code:
Set Frequency   Set Voltage    Fan Speed (avg/peak)  Junction Temp (avg/peak)  FPS    AC Power
1801MHz         1071mV         2911/2938rpm          109/112°C                 45     368W
1701MHz         970mV          1721/2912rpm          99/109°C                  44     318W

That's a pretty big improvement for an insignificant drop in performance. The difference in performance is small because the actual clock speed wasn't much different between the two.

Although the peak fan speed when undervolted was still nearly 3K, it only ramped up to that speed once. The rest of the time it was at a constant, and dare I say quiet, speed.

Thanks for the data! It basically matches what we've seen from Polaris and Vega cards: stock voltage is way too high. A quick settings change and you go from loud to quiet.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
Traditionally a process node advantage like this would have sent the ATI card through the roof. It would have been a wrecking ball to Nvidia's performance. The problem here seems to be that this is a Vega card, and Vega sucks. The reason Vega sucks is precisely the same reason I have no faith in Intel's future GPUs either: they come from the same source. Vega and Intel's new gaming GPU are two different smells coming from the same hot trash can that's been baking in the sun all day. Jokes aside, is there any reason to expect that Raja will produce a better GPU than Vega now that he has some money working for him?
 
Feb 4, 2009
35,862
17,402
136
Meanwhile, the card is sold out even at AMD.com, and they are selling for $999+ on eBay. Hmm! Oh well.

I don't know what the deal is with everyone still overpaying for cards. $1K for a video card that has a high probability of crapping out within 3 years is insane.
Are most still being sold for mining?
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
I don't know what the deal is with everyone still overpaying for cards. $1K for a video card that has a high probability of crapping out within 3 years is insane.
Are most still being sold for mining?

The odds are that this SKU is an incredibly limited production run; I know of at least one fairly major store that got a grand total of two. Not per store, for the whole chain.

So it's presumably intended as somewhat of a halo card (the first 16GB 'gaming' card?), and a feather in the cap for old RTG while Navi bakes. It's also possible that the shared die with FirePro (or whatever their professional brand is) made it a pretty easy thing to plop out. It's a shame about the drivers so far, though.

For a handful of pro applications (it looked like about a 50/50 split according to the Linus video), it can run with the 2080 Ti or surpass it in a few things (while of course getting smashed in many others). If you need a workstation that mainly does one of the things it does well, I could see it being a relative bargain, assuming it lasts the duty cycle you expect. Given how many very nice medium/high-end workstations I see flood eBay at about 2-3 years old, the overlap with pros who only use units for that time frame makes sense.

I do think for gamers it's kind of dumb. A minority of titles really match up well, and once again the unbelievably irritating inclusion of Pascal Founders Edition models, to the exclusion of the vastly more common and often dramatically faster AIB cards, is the rule in all the comparisons I've seen. It's something that bothers me more and more: an apparent desire to make both RTX and the Radeon VII look better than they actually are compared to what many people may actually have bought over the past couple of years. The same thing happened at the Vega launch, with comparisons to FEs instead of contemporary 11Gbps AIBs.