News AMD Announces Radeon VII


Veradun

Senior member
Jul 29, 2016
Can someone confirm/deny 128 ROPs? That would be HUUUGE.
With 128 ROPs there would be 8 shader engines and also +100% geometry speed vs Vega 64 at the same clock. 4 shader engines have been a big bottleneck for AMD cards since Hawaii...

https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699
It's confirmed if it's being reported. They have a blurb in the article about it, so it's not just a table mistake.
The 128 ROPs really caught me by surprise, as I thought GCN had a hard limit of 64 CUs and 64 ROPs / 4096 shaders.
Looking at the high-level diagrams, it seems they have the same four clusters per pipeline, and those clusters have had 4 ROPs each since forever (since R600, the first TeraScale architecture).

https://i.postimg.cc/vByxh61z/Vega10.png
https://i.postimg.cc/Kz5kHvBy/Vega20.png
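The cluster math described here can be sanity-checked in a few lines (a sketch only; the per-cluster and per-engine counts are taken from the block diagrams linked above, not from an official spec sheet):

```python
# ROPs = ROPs per cluster x clusters per shader engine x shader engines.
ROPS_PER_CLUSTER = 4     # per the block diagrams, constant since R600
CLUSTERS_PER_ENGINE = 4  # four clusters per shader engine

def total_rops(shader_engines: int) -> int:
    """Total ROPs for a GCN-style part with the layout above."""
    return shader_engines * CLUSTERS_PER_ENGINE * ROPS_PER_CLUSTER

print(total_rops(4))  # 64  -- Vega 10 with its 4 shader engines
print(total_rops(8))  # 128 -- a 128-ROP count would imply 8 engines
```

On these assumptions, 128 ROPs only falls out of the math if the shader engine count doubled, which is why the number raises eyebrows.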

I'm still looking for confirmation, as 128 doesn't sound right.

An 8GB version would have half the memory bandwidth.
They could have used 2-Hi stacks, but that would have required new validation and a new line of memory production. Not worth the effort, tbh.
 

DrMrLordX

Lifer
Apr 27, 2000
Who would know? They just wouldn't release those benches, or they'd only release the ones that were successful.
Plus, they didn't test it with the current Zen CPU, either.
Good benchers do at least 3 runs and show either an average (3 runs) or a median (5-10 runs or more). Running on an ES could produce buggy behavior and outlier results . . . I wouldn't do it.

A 2700X would be an interesting choice, but again, the goal is to reduce the effects of CPU bottlenecks on GPU tests.
 

Carfax83

Diamond Member
Nov 1, 2010
Someone at Guru3D (HWgeek) says he saw more benches and posted this chart/summary:

These look the same as the official benchmarks from AMD. Here's a bar chart with the percentage increases. Fallout 76 gains the most, at a whopping 68% o_O

I wonder what on Earth it is about that game that could result in such a large performance increase on Vega 20? Must have been ROP bound :confused:

I can't deny, though, that these are some nice increases across the board, even though it's still the same fundamental architecture.

 

Timorous

Senior member
Oct 27, 2008
So, does this feel like what would have normally been a Radeon Pro being rebranded/marketed because Nvidia gave them an opportunity by pricing the RTX 2080 at $700-800?

FP64 support, 16 GB HBM2. Feels very much like a way to sell excess MI50s.
 

linkgoron

Golden Member
Mar 9, 2005
These look the same as the official benchmarks from AMD. Here's a bar chart with the percentage increases. Fallout 76 gains the most, at a whopping 68% o_O

I wonder what on Earth it is about that game that could result in such a large performance increase on Vega 20? Must have been ROP bound :confused:

I can't deny, though, that these are some nice increases across the board, even though it's still the same fundamental architecture.

Those numbers add up to an average of ~33%, which would place the Radeon VII at ~1080 Ti level and not 2080 level (which, according to TPU, is about 10% faster than the 1080 Ti).
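A quick sketch of that arithmetic (the ~33% average gain and the ~10% TPU gap are the figures quoted here; the normalized index values themselves are illustrative assumptions, not measurements):

```python
# Normalize Vega 64 to an index of 100 and apply the quoted deltas.
vega64 = 100.0                # illustrative baseline
radeon_vii = vega64 * 1.33    # ~33% average gain from AMD's own numbers
gtx_1080ti = 133.0            # roughly the same territory, per the post
rtx_2080 = gtx_1080ti * 1.10  # TPU: 2080 about 10% ahead of the 1080 Ti

print(round(radeon_vii))  # 133 -> roughly 1080 Ti level
print(round(rtx_2080))    # 146 -> the 2080 stays ~10% ahead
```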
 

jpiniero

Diamond Member
Oct 1, 2010
From AT's 2080 review:

Shadow of War:
V64 42.1
2080 Stock 56.9
2080 Ti Stock 71.2

Wolf 2:
V64 69.4
2080 Stock 95.8
2080 Ti 121.9

AT used a 7820X @ 4.3 GHz, however, instead of the 7700K mentioned.
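Those review numbers work out to the following gaps (a quick calculation over the figures quoted above, nothing more):

```python
# FPS figures from AT's 2080 review, as quoted in the post.
at_results = {
    "Shadow of War": {"V64": 42.1, "2080": 56.9, "2080 Ti": 71.2},
    "Wolfenstein II": {"V64": 69.4, "2080": 95.8, "2080 Ti": 121.9},
}

for game, fps in at_results.items():
    gap_2080 = (fps["2080"] / fps["V64"] - 1) * 100
    print(f"{game}: 2080 leads Vega 64 by {gap_2080:.0f}%")
# Shadow of War: ~35%; Wolfenstein II: ~38%
```

So in these two titles the 2080's lead over a stock Vega 64 is in the mid-to-high 30s, noticeably above the ~33% average gain discussed earlier.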
 

TypoFairy©

Member
Jul 29, 2003
So, does this feel like what would have normally been a Radeon Pro being rebranded/marketed because Nvidia gave them an opportunity by pricing the RTX 2080 at $700-800?

FP64 support, 16 GB HBM2. Feels very much like a way to sell excess MI50s.
Sorta. TBH, if they had called it a Frontier Edition it just might have been received a bit better, lol, since it will probably be a compute beast.
 

DrMrLordX

Lifer
Apr 27, 2000
Sorta. TBH, if they had called it a Frontier Edition it just might have been received a bit better, lol, since it will probably be a compute beast.
FE means getting the hybrid Pro/gamer driver stack, which, uh, doesn't work that well, unless you just load the Pro drivers by default and leave the gaming drivers alone.
 

EXCellR8

Diamond Member
Sep 1, 2010
Anyone else wondering if there will be a "gaming" version with all of the CUs unlocked? I kind of hope so, but that would mean a steeper price.
 

TypoFairy©

Member
Jul 29, 2003
Anyone else wondering if there will be a "gaming" version with all of the CUs unlocked? I kind of hope so, but that would mean a steeper price.
At this point I'd rather just see Navi come and replace the entire product stack, but if we're lucky, maybe someone will software-unlock some cards to their full glory, like in the old days with the 9500 -> 9700 or 9800 SE -> 9800 Pro. :cool:
 

EXCellR8

Diamond Member
Sep 1, 2010
I would think that the next product stacks are better reserved for PCIe 4.0, since it's on the horizon but isn't implemented yet. This might be all we get for now... I just hope Polaris is done.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
So last Christmas, for $699, I got a PowerColor Red Devil RX Vega 56 that in some games, like Grand Theft Auto V, is equivalent to the 1070 FE, and in others, like Tom Clancy's The Division, is equivalent to the 2070 FE.

Now for $699 we’ll be able to get:
  • 2X the RAM: 8 GB to 16 GB
  • 2.5X the memory bandwidth: 409 to 1024 GB/s
  • 2X the ROPs: 64 to 128
  • 4 more CUs: 56 to 60
  • 210 more peak MHz: 1590 to 1800 MHz

That’s not a bad upgrade. The only thing we don’t know is power consumption.

My card averages about 215 W. 4 more CUs at 210 more megahertz is about 25% more theoretical performance and power. Double the ~30 W of the 8 GB HBM2 (2048-bit) memory, and a theoretical VII would use about 305 W.

However, that is before any 7nm power savings, so I'll put it between 250 and 300 W, depending on how hard they push the voltage/performance curve.
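For what it's worth, that back-of-envelope estimate can be reproduced in a few lines (a sketch using the poster's own rough figures, not measured values):

```python
# Scale observed Vega 56 board power by the CU/clock uplift, then add
# roughly one more 8GB HBM2 stack's worth of memory power.
vega56_power = 215.0   # W, observed average board power
cu_clock_scale = 1.25  # ~25% more from 4 extra CUs and ~210 MHz higher clocks
extra_hbm2 = 30.0      # W, rough cost of doubling the HBM2

estimate = vega56_power * cu_clock_scale + extra_hbm2
print(round(estimate))  # 299 W before any 7nm power savings
```

That lands just under 300 W, in the same ballpark as the ~305 W figure above, before any 7nm savings are applied.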

The other thing I'll say is, man, did I pick the wrong time to build a new rig last year. :confused_old:
You did indeed pick the wrong time. I picked up my Red Devil 56 for $300 after the mining bust. Its tank-like build quality and overclocking headroom make it a perfect match for the 32" 75 Hz 1440p FreeSync monitor I bought to go with it.

This new card holds no interest for me, given my gaming needs. That said, it is good to see gamers who are unhappy with Nvidia's shenanigans, and looking to vote with their wallets, finally have a strong 4K alternative.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
This is only true if you use half the number of stacks. But it is perfectly possible to use the same number of stacks as with 16 GB (and thus have the same bandwidth) while making each stack half the capacity. This can be done either by using regular 2-Hi stacks or by using half-density 4-Hi stacks.

Now whether or not anyone sells 2-Hi stacks or half density 4-Hi stacks at an attractive price for AMD is of course a separate question.
That's my concern: you're not going to save anything on the interposer, and you're not going to save much on the stacks, so shaving $200 off isn't a likely proposition.

They could have used 2-Hi stacks, but that would have required new validation and a new line of memory production. Not worth the effort, tbh.
exactly.
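The stack trade-off described in the quote can be sketched numerically (assuming, for illustration, that Radeon VII's ~1 TB/s comes from four HBM2 stacks at ~256 GB/s each; these are derived figures, not official specs):

```python
# Total bandwidth scales with the number of stacks, not with stack height,
# so capacity and bandwidth can be traded independently.
PER_STACK_GBPS = 256  # GB/s per stack (1024 GB/s spread over 4 stacks)

def bandwidth_gbps(stacks: int) -> int:
    return stacks * PER_STACK_GBPS

print(bandwidth_gbps(4))  # 1024 -- four 4-Hi stacks: the 16GB card
print(bandwidth_gbps(2))  # 512  -- two stacks for 8GB: half the bandwidth
# Four 2-Hi (or half-density 4-Hi) stacks would keep 1024 GB/s at 8GB.
```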
 

zinfamous

No Lifer
Jul 12, 2006
I love how he uses DLSS as a reason that 2080 is better. Because everybody wants to spend $700 on a GPU and then make their game look like crap by turning that "feature" on.
...and the ray tracing. "There's no ray tracing!" Yeah, so that's good... because I don't want to have to revert to 2006 resolutions to get ~passable FPS in the hopes of gaining some shinier surfaces, plus the generous $1k donation to the cause to achieve it.
 

Guru

Senior member
May 5, 2017
I'm not particularly sure what there is to be excited about. It seems like the same old Vega just with a clock bump due to the new process node. We already know that Vega doesn't perform well as a gaming card and the results that AMD showed don't give the impression that this will be any different.

It looks like they just announced the price for this as $700, which isn't particularly compelling. Maybe it squares up nicely against a 2080, but we already know how overpriced that is. This does absolutely nothing to improve the performance/price ratio that most consumers care about.
What are you talking about? Vega 64 beats the GTX 1080 by 1% overall. Watch Hardware Unboxed; they test a ton of games. Also, in all of the newer games Vega 64 is faster than the 1080, and much faster in DX12 and Vulkan.

This is a Vega 20 design, so mostly the same design but with some tweaks and improvements, obviously. This also seems like a cut-down version, and it's possible they will release a higher-end model sometime in the future; in fact, Lisa Su said as much in the presentation.

So I don't know what you are smoking, but it sure seems super strong; you should lay off it. Again, the only games where the GTX 1080 seems to win easily are much older games with older engine designs, like GTA 5, Metro Redux, etc.: games from 2015 and earlier.

In all of the newer games, like Far Cry, Battlefield, Call of Duty, F1 2018, Kingdom Come: Deliverance, etc., AMD has the lead.

In terms of this announcement, it's unexpected for sure. I thought AMD wouldn't be able to bring out new GPUs until at least April, but they've done it. I think they are likely pushing the Ryzen 3000 desktop CPUs a bit further down the line in order to churn out some GPUs so they stay in competition with Nvidia. They are in a strong position with the Ryzen 2000 series CPUs and are launching their 7nm Epyc server processors this month, so there's no need to rush the desktop versions, especially as they ramp up Epyc production and launch some new GPUs to compete with Nvidia.

I think a March or April Ryzen 3000 launch is what we are looking at, especially as Intel won't have anything new until late 2019.
 

Mopetar

Diamond Member
Jan 31, 2011
Now for $699 we’ll be able to get:
  • 2X the RAM: 8 GB to 16 GB
  • 2.5X the memory bandwidth: 409 to 1024 GB/s
  • 2X the ROPs: 64 to 128
  • 4 more CUs: 56 to 60
  • 210 more peak MHz: 1590 to 1800 MHz

That’s not a bad upgrade. The only thing we don’t know is power consumption.
This thing is essentially a Radeon Instinct MI50, only with slightly higher clocks (1800 MHz vs 1746 MHz). That card has a TDP of 300 W, so I would imagine something similar.

Frankly, this card only exists because Nvidia pushed their prices so high. Last year AMD had no plans to launch this as a consumer card: 7nm is still super early, yields aren't going to be great, and EUV isn't available yet (which, once it arrives, should push down wafer costs). If they couldn't launch it at $700, they wouldn't be able to launch it at all.
There were some rumors circulating that even though AMD might announce Navi, their plans had shifted and there still was a Vega 20 part in the pipeline.

I'm not sure how much stock to put in that (given the similarities to the MI50), but the rumors said the project for a consumer Vega 20 wasn't canceled soon enough, so there was a limited production run of chips. This may just be some half-aborted project that gets pushed out into the market to recoup costs.

If nothing else, it gives the AMD diehards something to fawn over, but I can't see this being a serious part of AMD's product portfolio. They just don't have anything until Navi and they can only refresh Polaris so many times.

These look the same as the official benchmarks from AMD. Here's a bar chart with the percentage increases. Fallout 76 gains the most, at a whopping 68% o_O

I wonder what on Earth it is about that game that could result in such a large performance increase on Vega 20? Must have been ROP bound :confused:
Who really knows? Bethesda makes terrible game engines, so it's hard to say for sure. ROPs seem like the most likely explanation, though.

Even though it's an impressive number, it's probably not worth talking about. I get the impression that more people would care about improvements in FPS for Tux Racer than for Fallout 76.
 

Thrashard

Member
Oct 6, 2016
Anyone who bashes this new AMD card is smoking crack. There is so much fake news and there are so many sock puppet accounts that you can't trust anyone.

Hello? There are people like myself who have a 5-year-old system and, regardless of what the news says, we are still living in a recession.

The Radeon VII is PERFECT for my 4960X Ivy Bridge-E with quad-channel 64GB memory on an Asus X79-Deluxe board and an 850W power supply. I couldn't care less about raw speed; it's all about the bandwidth and performance features giving me the speed I need.

I just got back into gaming and love the new Quake Champions, but I'm still using an old 1920x1200 HD display. I desperately need a faster 144Hz display, and I feel absolutely certain that upgrading to a 1440p display with a Radeon VII will give me the perfect balance of FPS performance for my 4960X setup.

I'm going all in and also getting the ROG Strix XG32VQR curved HDR gaming 32" monitor with FreeSync 2. That should be absolutely perfect for me to milk things out until 2025. 8K gaming is around the corner while 4K is not even perfected yet.

Ray tracing will be for 8K and is totally worthless now. If you want that fancy ray tracing, you will have to shell out $$$ for uber DDR4 memory that will be worthless in 2025 when DDR6, PCIe 5, and 8K gaming are out.
 

LTC8K6

Lifer
Mar 10, 2004
Anyone who bashes this new AMD card is smoking crack. There is so much fake news and there are so many sock puppet accounts that you can't trust anyone.

Hello? There are people like myself who have a 5-year-old system and, regardless of what the news says, we are still living in a recession.

The Radeon VII is PERFECT for my 4960X Ivy Bridge-E with quad-channel 64GB memory on an Asus X79-Deluxe board and an 850W power supply. I couldn't care less about raw speed; it's all about the bandwidth and performance features giving me the speed I need.

I just got back into gaming and love the new Quake Champions, but I'm still using an old 1920x1200 HD display. I desperately need a faster 144Hz display, and I feel absolutely certain that upgrading to a 1440p display with a Radeon VII will give me the perfect balance of FPS performance for my 4960X setup.

I'm going all in and also getting the ROG Strix XG32VQR curved HDR gaming 32" monitor with FreeSync 2. That should be absolutely perfect for me to milk things out until 2025. 8K gaming is around the corner while 4K is not even perfected yet.

Ray tracing will be for 8K and is totally worthless now. If you want that fancy ray tracing, you will have to shell out $$$ for uber DDR4 memory that will be worthless in 2025 when DDR6, PCIe 5, and 8K gaming are out.
Well, you were willing to pay ~$1,000 just for a CPU in 2013, so I'm not sure you're mainstream.

Also, I suspect that the Radeon VII is pretty good at ray tracing.
 
