News AMD Announces Radeon VII


Mopetar

Diamond Member
Jan 31, 2011
4,274
214
126
What are you talking about? Vega 64 beats the GTX 1080 by 1% overall.
If this were a $400 or even $500 card I might be much more interested, or at least positive. It doesn't matter that it beats the 1080 (did you mean 2080?) by 1%, because it's not worth what you have to pay ($700) to get that.

Sure you can say that it's cheaper than the 2080, but the 2080 is a terrible value for $800. If you compare this card to a Vega 64, you end up paying 40% more (maybe even worse since someone earlier pointed out that you can actually buy a Vega 64 for $400 now) for a 30% performance bump on average.
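The value comparison above is easy to sanity-check. A quick sketch using the thread's own figures (an assumed ~$500 Vega 64 street price, the $700 Radeon VII MSRP, and the ~30% average uplift - these are the post's numbers, not official ones):

```python
# Back-of-envelope perf/$ check: paying ~40% more for ~30% more performance.
# Prices ($500 Vega 64 street, $700 Radeon VII) and the 30% uplift are the
# thread's figures, treated here as assumptions.
def perf_per_dollar_ratio(new_price: float, old_price: float, perf_uplift: float) -> float:
    """Relative perf/$ of the new card vs the old (1.0 = equal value)."""
    price_ratio = new_price / old_price   # e.g. 700 / 500 = 1.4
    perf_ratio = 1.0 + perf_uplift        # e.g. 1.30
    return perf_ratio / price_ratio

ratio = perf_per_dollar_ratio(700, 500, 0.30)
print(f"Radeon VII delivers {ratio:.0%} of Vega 64's perf/$")  # ~93%
```

In other words, on these assumed numbers the new card is a slightly worse value per dollar than the old one, which is the point being made.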

This is a Vega 20 design, so mostly the same design, but with some tweaks and improvements obviously. This also seems like a cut-down version, and it's possible that they will release a higher-end model sometime in the future; in fact Lisa Su said as much in the presentation.
I doubt that we'll see such a card. Once Navi is ready, AMD is going to focus on that. We've already seen that Vega as an architecture just isn't that good at gaming, and using HBM2 ensures that AMD can't price the card as reasonably as it should.

I doubt that AMD expects to sell very many of these, and once Navi launches they'll have a mainstream card that consumers can actually afford.

Anyone who bashes this new AMD card is smoking crack. There is so much fake news and sock puppet accounts, you can't trust anyone.
Buy a Vega 64 and you'll have something that's just as effective while saving $300.

Or wait another half of a year for Navi cards to become available and you'll have something more than good enough at a significantly reduced cost.

I imagine that AMD will probably offer some bundles since Navi and Zen 2 will have a similar time frame for their launch. Then you can snag a new CPU as well, because unlike this Vega card, Zen 2 is absolutely something to be excited about.
 
Oct 6, 2016
43
11
41
I normally would never spend 1k on a CPU, especially an extreme brand, but I was upgrading from a DDR-800 system to DDR-1600.

I spent the 1k with a long-term investment in mind. Everyone builds systems at different times, but overall things look good for me.
 
Oct 6, 2016
43
11
41
Buy a Vega 64 and you'll have something that's just as effective while saving $300.

Or wait another half of a year for Navi cards to become available
The major problem for me is I desperately need something now this very second to enjoy Quake Champions. I was about to pull the trigger and get the RTX 2080 - it's a 320% performance boost over my current video card. Unfortunately I can't wait 6 months, and the Radeon VII will give me that 300+% boost as well.
 

Mopetar

Diamond Member
Jan 31, 2011
4,274
214
126
The major problem for me is I desperately need something now this very second to enjoy Quake Champions.
The Vega card isn't available until early February, but otherwise enjoy your card. At least the 16 GB of memory ensures that it shouldn't have any problems on that front anytime soon.
 

railven

Diamond Member
Mar 25, 2010
6,305
19
126
But now that AMD delivers the same perf/$, suddenly we get excuses that production costs matter.
Saw announcement on main page, with price. Quickly jumped to forums to see how it's spun. Not disappointed.

Not bad, but I was really hoping they'd put Nvidia's balls in a vise with a $500 price point, if only for the sake of market share.
That's the only thing I really care about. Not surprised at all that AMD didn't want to play hardball. I'm more curious now whether NV responds. Considering the costs of everything, I wonder if NV dropping the RTX 2080 to the $500-600 range, now that most of the GTX 1080 stock has dried up, would hurt AMD. I'm all for a price war! (Unfortunately my performance bracket won't see any of the action :( ).

Don't know what you're expecting. AMD/Nvidia fan(atic)s are both human and subject to the same character flaws. Just wait till Intel fans get thrown into the mix. We'll be able to forge the Triforce of hypocrisy.
Ah hell yeah! 2020! I'm ready! DO IT INTEL! Get me off this Nvidia branded life raft this rat stowed away on during the S.S. Radeon's sinking.
 

Headfoot

Diamond Member
Feb 28, 2008
4,379
27
126
The 1080 ti was $700 two years ago. So AMD is bringing us 1080 ti levels of performance two years later for... $700. "But it has more memory!" Meh.
Yeah, my 1080 Ti purchase near launch is starting to look like one of the better GPU purchases I've done
 
Oct 6, 2016
43
11
41
The Vega card isn't available until early February, but otherwise enjoy your card. At least the 16 GB of memory ensures that it shouldn't have any problems on that front anytime soon.
I can wait a month. The 32" display I have my eyes on isn't out yet either, and it looks like it's designed perfectly for this card. I'm also going to see a huge difference in Quake & UT going from a 60Hz to a 144Hz display. This is the only reason I'm doing this upgrade; otherwise I don't play games like I did 15 years ago.
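The 60Hz-to-144Hz jump mentioned above comes down to time between refreshes (1000 ms divided by the refresh rate). A minimal sketch of that arithmetic:

```python
# Frame time shrinks as refresh rate rises: time per refresh = 1000 ms / Hz.
# A rough sketch of the 60 Hz -> 144 Hz jump discussed above.
def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds between display refreshes at a given rate."""
    return 1000.0 / refresh_hz

for hz in (60, 144):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per refresh")
# 60 Hz leaves ~16.7 ms between refreshes; 144 Hz cuts that to ~6.9 ms,
# which is a big part of why fast shooters feel so much more responsive.
```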

ROG Strix XG32VQR Curved HDR Gaming Monitor – 32 inch WQHD (2560x1440), 144Hz, FreeSync™ 2 HDR, DisplayHDR™ 400, DCI-P3 94%, Shadow Boost


https://www.asus.com/Monitors/ROG-STRIX-XG32VQR/


 

Carfax83

Diamond Member
Nov 1, 2010
5,695
6
126
One of the reasons why I haven't upgraded to the RTX series yet is because I was kind of disappointed with the raw 4K performance of the GPUs. I think NVidia prematurely invested in ray tracing, and should have prioritized mastering 4K HDR performance instead. That's exactly what AMD seems to be doing right now. As I said before, despite the Radeon VII having the same fundamental architecture as the Vega series, the performance increase is very substantial, which is primarily due to the doubled ROP count and huge bandwidth increase. I could imagine what would happen if AMD produced a GPU with next generation architecture, plus the backend enhancements of the Radeon VII.

That would be a mighty 4K HDR solution indeed! :cool:
 
Oct 6, 2016
43
11
41
It seems like this is for me and people in my situation. I've been out of the loop and using 1920x1200 60hz display 10 years now.

I remember back in the day of boob-tube displays, when Viewsonic was the monitor to have for high refresh rates among graphic designers. But when I upgraded to an LCD with HDMI, I locked in right away on 1920x1200 and completely ignored anything about 1920x1080. There is no possible way I will downgrade my display and have the left and right chopped off, and I really need the 144Hz to play Quake.

Everyone upgrades and builds at different times, but this seems like the best upgrade for me until 4K is perfected and I do my next major upgrade. So excited for this!
 
Mar 11, 2004
17,566
131
126
Well, I was definitely wrong in my assumptions. I'm baffled. AMD outright said they kept Vega 64's rendering hardware the same in Vega 20, so where the doubled ROPs come from I don't know. I saw one of the AnandTech guys say that AMD had told him they doubled the ROPs, but that also makes little sense to me, since Vega 20 was supposedly enterprise focused. Then again, they're apparently using these in some game-streaming enterprise solutions, so I guess that makes sense (though it doesn't make sense that this was basically the only way they addressed that). To me it would make more sense to buy up Vega 64 cards that were made for mining.

Granted, a lot of my takes (Navi being ready now) were based on rumor (and I tried to make that clear), so I'm not really disappointed that they didn't pan out like I thought. But this card seems to directly contradict things AMD has said, so now even their official statements have to be taken with a grain of salt.

The silliest part is that it's not like this card is really upsetting Nvidia's market strategy, so there was little to no reason to keep it under wraps and say Vega 20 would not be for gamers (they didn't make this a Frontier Edition card, so either they're taking the opposite track from Nvidia and claiming that everything above this is prosumer level, or this is a gaming card). I don't see that they would've lost many Vega 64 sales (if anything, they should've announced this thing ASAP to steal some thunder from Nvidia's RTX launch). I wonder if the only reason we're seeing this is that the RTX lineup didn't go over well - largely because of the pricing - which made AMD feel they could offer this as a competitive product. That, and either yields on 7nm are already that good, or it's flopping hard in enterprise (in which case I'd think they'd just drop the price, or shift it to prosumer where they could probably sell it for double, or at least $1000; maybe offer special pricing packages for Threadripper workstations that use it).

I really hope this isn't an indication that Navi is that far off.

One of the reasons why I haven't upgraded to the RTX series yet is because I was kind of disappointed with the raw 4K performance of the GPUs. I think NVidia prematurely invested in ray tracing, and should have prioritized mastering 4K HDR performance instead. That's exactly what AMD seems to be doing right now. As I said before, despite the Radeon VII having the same fundamental architecture as the Vega series, the performance increase is very substantial, which is primarily due to the doubled ROP count and huge bandwidth increase. I could imagine what would happen if AMD produced a GPU with next generation architecture, plus the backend enhancements of the Radeon VII.

That would be a mighty 4K HDR solution indeed! :cool:
Did other performance figures get leaked? (Just saw that there are some being discussed.) The only numbers I've seen are AMD's own comparison to the 2080 (which is likely a best-case scenario). While it's definitely a step up over Vega 64, it's honestly not that impressive considering the price, and I have serious questions about whether AMD is making money selling the card at that price (given 7nm and the extra HBM channels and stacks), so I have to wonder why they're doing this. It's in line with what I was expecting, and to me that isn't good. I'm pleasantly surprised that they can apparently get it down to that price, as I was thinking it'd cost them even more, but even at that price I'm not impressed.

Not only that, but I'm not at all impressed with the efficiency. That tells me Vega's issues weren't really related to GF's process (and that Zen 2 is seeing such a big efficiency boost tells me 7nm is already living up to its potential). Same with the clock speeds. Vega seems like an inherently flawed architecture, one that prioritized compute capability. That would be fine, but its rendering/rasterizing capability suffered mightily for it, and I'm not sure the compute gains outstripped just doubling Polaris and adding support for further reduced-precision math. We can be fairly certain gaming performance didn't benefit versus that, and if they'd gone with 96 ROPs it'd probably be more balanced. On top of that, just doubling the memory channels would probably have been cheaper and delivered more bandwidth than the HBM2 solution had.

Guess we'll see. People have speculated a lot that Vega was bottlenecked by ROPs and/or memory bandwidth. But if things were that severe, I'd expect a bigger improvement from doubling both. It seems like it will still be unbalanced, just in the opposite direction or in some other manner (unless it's still ROP and/or memory bandwidth bottlenecked, in which case... WTH).

I really hope AMD was getting access to some of the data Microsoft used to decide on the One X setup, as that seems like a well-balanced and efficient design. That would seem to support the idea that AMD's dGPUs tend to be somewhat bandwidth limited, though I believe it was actually the TMUs that were boosted relative to Polaris, while the ROP count stayed the same. And I believe AMD has said in the past that they don't consider ROPs a limiting factor (so I wouldn't be surprised if most of Vega 20's performance increase comes simply from the memory).

It's a shame that GPU IP has been so staunchly locked up, as I feel this market could use a boost from companies trying similar ideas with different ratios or other tweaks. I also feel that AMD and Nvidia are both trying to figure out next steps, weighing specialization versus integration, while dealing with major engineering challenges and trying to design GPUs for such a wide breadth of markets that I can't help but feel we're getting a lot of products that aren't well suited to their markets.
 

mohit9206

Senior member
Jul 2, 2013
976
40
116
Yeah, my 1080 Ti purchase near launch is starting to look like one of the better GPU purchases I've done
Depends on how much you paid for your 1080Ti. Most good aftermarket cards were above $800.
 
Oct 6, 2016
43
11
41
Everything here seems screwy and wrong. It seems we are getting bits and pieces of Enterprise stuff that are hand-me-downs.

It's like going from Sandy Bridge-E to Ivy Bridge: Ivy has more enterprise bandwidth and power efficiency. Vega 64 to Radeon VII seems like the exact same pattern.

What really baffles me is why, after Ivy Bridge, Intel gimped the lanes from 40 down to these screwy 16-lane, 2-channel memory configurations... it's a mess.

On top of that, I'm still running Windows 7 Ultimate-64. Microsoft is forcing everyone onto Windows 8-10, while new games still in development like Quake and Unreal Tournament are running DirectX 11. Wasn't DX12 the whole point of forcing us off Windows 7? This all sucks.


When does AnandTech get these cards? I'm sure they probably have them now, under NDA etc. Will we see reviews before the February 7th release? I can see AMD handing cards out to certain people this week at CES - that gives three weeks to test the card and have a review ready by the time it's released.
 

krumme

Diamond Member
Oct 9, 2009
5,638
35
136
We know iPhone sales are lagging. Demand for high-end 7nm capacity is probably not as strong as expected.
AMD might get 7nm capacity cheaper now. TSMC is not blind. It's either "you lower the price so this stays profitable" or "we don't care."
Or they simply don't sell as many B2B cards as predicted.
Another point: I'm no longer so sure those HBM2 modules are as expensive as we think they are. Again, AMD can simply ask: do you want to sell this or not?
No way is AMD bringing this to market at a loss. They are not that profitable yet. They don't have the luxury of such moves.
 
Oct 6, 2016
43
11
41
Things are going to heat up. If 16GB is being dropped in now, we're going to see 32GB and a 1200W power supply to really enjoy 4K gaming in its full glory.

1200W is like a friggin' hair dryer constantly running, or growing some ganja on a 12- or 24-hour cycle.
 

amenx

Platinum Member
Dec 17, 2004
2,339
26
126
"Benches @ 4k. CPU differences will be negligible at that res."
Wrong. Look at Hitman 2; the lowest percentage is 7.5%.
Let me rephrase that. In general, 4k perf will have negligible differences between different CPUs. In rare instances, the odd game or so may crop up to show more appreciable differences, but is not representative of the bigger picture.

You do know there's a reason most CPU benches exclude 4K gaming performance, don't you?
 

PhonakV30

Senior member
Oct 26, 2009
931
11
136
"Benches @ 4k. CPU differences will be negligible at that res."

Let me rephrase that. In general, 4k perf will have negligible differences between different CPUs. In rare instances, the odd game or so may crop up to show more appreciable differences, but is not representative of the bigger picture.

You do know there's a reason most CPU benches exclude 4K gaming performance, don't you?
If I'm not mistaken, the test was done at 1440p, not 4K.
 
Feb 2, 2009
12,774
75
126
We have to give it to AMD: with Vega II at $699 they raised the value for money of the RTX 2080 (same price + DXR). It's good to see the competitor giving a helping hand.
 

Veradun

Senior member
Jul 29, 2016
243
8
86
We have to give it to AMD: with Vega II at $699 they raised the value for money of the RTX 2080 (same price + DXR). It's good to see the competitor giving a helping hand.
Same performance, no space invaders, double the ram. It makes the 2080 the most useless and laughable of the RTX lineup :>
 

Mopetar

Diamond Member
Jan 31, 2011
4,274
214
126
We have to give it to AMD: with Vega II at $699 they raised the value for money of the RTX 2080 (same price + DXR). It's good to see the competitor giving a helping hand.
That's a deeply flawed way of looking at things. If it actually worked this way, Ferrari would ask Ford to sell a Pinto for $225,000, since that would magically raise the value for money of the 488.

The reality is that both cards are still a terrible value. An ugly girl doesn't become pretty just because you put an even uglier girl next to her. If you think otherwise, you're only likely to be disappointed come the next morning.
 

nOOky

Golden Member
Aug 17, 2004
1,157
13
106
I can wait a month. The 32" display I have my eyes on is not out yet too and looks like it's designed perfectly with this card. I'm also going to see a huge difference in Quake & UT going from 60hz to 144hz display. This is the only reason why I'm doing this upgrade, otherwise I don't play games like I did 15 years ago.

ROG Strix XG32VQR Curved HDR Gaming Monitor – 32 inch WQHD (2560x1440), 144Hz, FreeSync™ 2 HDR, DisplayHDR™ 400, DCI-P3 94%, Shadow Boost


https://www.asus.com/Monitors/ROG-STRIX-XG32VQR/


It looks exactly like the AOC Agon and BenQ monitors already available. I'm interested, but I don't think that resolution at 32" would be very good for Windows and productivity either.
 
Oct 6, 2016
43
11
41
It looks exactly like the AOC Agon and BenQ monitors already available. I'm interested, but I don't think that resolution at 32" would be very good for Windows and productivity either.
Yea, that's one of my concerns based on tons of research, especially DPI versus size for 32" vs 27". The selling points for me are FreeSync 2, HDR 400, and the promise of "buttery smooth graphics"... Oh man, that sounds promising, but I'm still using Windows 7 and not sure what advantages I'll be missing; at some point I'll need to dual boot with Win 10.

Currently I have two 24" displays, both 1920x1200, on a WALI fully adjustable dual-LCD desk mount, and everything is just beautiful.

I may have to take a chance and plan on keeping one of the 24" on my desk. It's just going to be really weird having to move or shift monitors if I need the real quality or productivity for Photoshop or something.
 
Mar 11, 2004
17,566
131
126
Everything here seems screwy and wrong. It seems we are getting bits and pieces of Enterprise stuff that are hand-me-downs.

It's like going from Sandy Bridge-E to Ivy Bridge: Ivy has more enterprise bandwidth and power efficiency. Vega 64 to Radeon VII seems like the exact same pattern.

What really baffles me is why, after Ivy Bridge, Intel gimped the lanes from 40 down to these screwy 16-lane, 2-channel memory configurations... it's a mess.

On top of that, I'm still running Windows 7 Ultimate-64. Microsoft is forcing everyone onto Windows 8-10, while new games still in development like Quake and Unreal Tournament are running DirectX 11. Wasn't DX12 the whole point of forcing us off Windows 7? This all sucks.


When does Anandtech get these cards ? I'm sure they probably have it now and signed NDA's...etc Will we wait for reviews soon before February or when it's released on 7th ? I can see AMD giving these cards out to certain people during this week at CES - Then you have 3 weeks to play with the card and test it out and have a review ready the next week by time it's released
That's because that's where the money is. The consumer PC market has been slowing down due to the rise of mobile devices, and things are moving toward services and the cloud, so they'll continue in that direction.

I don't think Microsoft wants people on Windows 8 either. They want as many people as possible on 10, as it simplifies support and development for them. DX12 was not the reason (they could easily have had DX12 on Win7); they just used it as a way to push gamers to 10. DX11 persists because it has wide compatibility and companies haven't really built engines from the ground up for DX12/Vulkan. DX9 was still prevalent five years ago, and it came out in 2002; we'll see if DX11 has that longevity. I actually think the slow meaningful adoption of DX12 is why Microsoft decided to push ray tracing, to drum up more support for it.

Yeah the review cards are probably on their way or will be soon, and they'll get probably a couple of weeks to test them before the launch when they'll be able to release their findings.

We know iPhone sales is lacking. The demand for the high-end 7nm nodes is probably not as strong as expected.
Amd might get 7nm capacity cheaper now. Tsmc is not blind. It's either: you lower price for this to be profitable or we don't care.
Or they simply don't sell as many b2b cards as predicted.
Another point. I am not so sure those hbm2 modules is longer so expensive as we think they are. Again. Amd can simply ask: do you want to sell this or not.
No way is amd is bringing this to market at a loss. They are not that profitable yet. They don't have the luxury to take such moves.
They're not lagging because of anything to do with 7nm; it's more that Apple is trying to boost their margins, so they increased prices. Plus, Apple has likely already produced those chips, and few other 7nm chips are out, so declaring demand to be low already is premature, to say the least.

Doubt that, since TSMC knows they're AMD's only option for this now, and many of their deals were worked out well in advance of any of this.

No, they're still expensive, especially since GDDR6 is, I think, marginally more expensive than GDDR5 was. The interposer costs likely didn't go down compared to Vega 10 (likely the opposite, since it supports twice the channels). They're not so costly that AMD can't make products like this, but they'll eat into margins for sure. And it's not like they could've just slapped GDDR6 on there, as reworking the chip for that would be too costly.

I would hope not, but I also doubt there's much room to drop the price. I think this is an opportunistic product: yields and/or demand for Vega 20 in the enterprise space left AMD with extra chips, and Nvidia's pricing made it possible to throw gamers something to tide them over. It should still beat Navi, though I doubt by terribly much, and at nearly 3x Navi's expected price you'd be paying a high premium; and now that Nvidia supports non-G-Sync adaptive sync displays, you're not locked into an AMD GPU if you have a FreeSync display. I'd be really interested in how many sales they get from this.

I just hope that Navi is coming sooner than later and that its at least Vega 64 level of performance for less than $300. That's the market me and most of the people I know are in.

Same performance, no space invaders, double the ram. It makes the 2080 the most useless and laughable of the RTX lineup :>
Is it the same performance (isn't it averaging ~10% less)? I'm not sure how much of a concern that is; we don't know whether the Radeon VII might have issues of its own. I seem to recall some early Vega 64 cards getting memory corruption because the HBM stacks were different heights and the heatsink was flat, so it wasn't making good contact (I might be wrong that it was memory corruption, but there were some issues involving the memory and heatsinks).

The extra RAM and bandwidth are definitely nice, and alone I think would be worth it, and I don't value RTX at all right now myself. Still, I don't really agree. It's nice there's an option if you have that much to spend but are turned off by Nvidia's tactics; I'm just not sure it's a good message to say "eff you and your prices" and then spend the same amount on an equally mediocre perf/$ product from the competitor.

On top of that, Vega 20 might actually be worse in perf/W than RTX despite being on a superior process (though I have a hunch you'll be able to manually undervolt and improve things a good amount; that's versus stock, and I haven't checked whether RTX owners can do the same).

Unless Vega VII lets them enable their NGG fast path (not gonna happen; AMD has said they stopped developing that) and massively increases geometry throughput, it's an OK card in the current market, but nothing more. It doesn't meaningfully push any parameter (outright perf, perf/$, perf/W), and it's expensive. To me, it's a mediocre product. Certainly some people can make the case for buying it, but I can't, and wouldn't even if I could.
 

Timmah!

Senior member
Jul 24, 2010
726
4
91
Not an ouch.. a 2080 is $799, only has 8GB of memory (GDDR6) - half what this has - and half the memory bandwidth (448GB/sec vs 1TB/sec on AMD).

This highlights what a rip-off nvidia has put fwd.

So memory capacity and bandwidth are nowadays the most important metrics?

Not defending Nvidia's pricing, but even with "inferior" hardware tech, the 2080 is no worse than this Radeon. All that big, fast memory is indeed nice, but then you want to put it to use and run stuff like 3DMark Port Royal, Octane, Redshift, TensorFlow... and uh-oh, you can't. Surely that kind of balances things out.
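The bandwidth figures traded back and forth here follow directly from bus width times per-pin data rate. A minimal sketch, assuming the commonly cited specs (RTX 2080: 256-bit GDDR6 at 14 Gbps/pin; Radeon VII: 4096-bit HBM2 at 2.0 Gbps/pin) rather than anything from this thread:

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
# Assumed specs: RTX 2080 = 256-bit GDDR6 @ 14 Gbps/pin;
# Radeon VII = 4096-bit HBM2 @ 2.0 Gbps/pin. Treat as a sketch, not a datasheet.
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * gbps_per_pin / 8

print(bandwidth_gb_s(256, 14.0))   # RTX 2080   -> 448.0 GB/s
print(bandwidth_gb_s(4096, 2.0))   # Radeon VII -> 1024.0 GB/s (~1 TB/s)
```

The wide-but-slow HBM2 bus and the narrow-but-fast GDDR6 bus land at the 448 GB/s and ~1 TB/s figures quoted in the posts above.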

BTW, I wonder if the card retains the same FP64 capability as the MI50. Somehow I doubt it, but if so, the card would do double duty as both a decent gaming card and an awesome HPC product on the cheap.
 

