Everything here seems screwy and wrong. It seems we are getting bits and pieces of Enterprise stuff that are hand-me-downs.
From Sandy Bridge-E to Ivy Bridge: Ivy has more enterprise bandwidth and power efficiency. It seems Vega 64 to Radeon VII is the same exact pattern here.
What really baffles me is why, after Ivy Bridge, Intel gimped the lanes from 40 down to these screwy 16-lane configurations and 2-channel memory... it's a mess.
On top of that, I'm still running Windows 7 Ultimate 64-bit. Microsoft is forcing everyone to Windows 8-10, while new games still in development, like Quake and Unreal Tournament, are still running DirectX 11??? DX12 was supposedly the whole point of forcing us off Windows 7. This all sucks.
When does AnandTech get these cards? I'm sure they probably have them now and have signed NDAs, etc. Will we see reviews before February, or only when it's released on the 7th? I can see AMD giving these cards out to certain people during this week at CES - then you'd have three weeks to play with the card and test it out, and have a review ready by the time it's released.
That's because that's where the money is. The consumer PC market has been slowing down due to the rise of mobile devices, and the industry is moving towards services and the cloud, so things will continue in that direction.
I don't think Microsoft wants people on Windows 8 either. They want as many people on 10 as possible, since it simplifies support and development for them. DX12 was not the reason (they could have easily had DX12 on Win7); they just used it as a way of trying to get gamers to move to 10. It's because DX11 has wide compatibility and companies haven't really built engines from the ground up for DX12/Vulkan. DX9 was still prevalent like 5 years ago, and it came out in 2002. We'll see if DX11 has that longevity. I actually think the slow meaningful adoption of DX12 is why Microsoft decided to try pushing ray tracing, to drum up more support for it.
Yeah, the review cards are probably on their way or will be soon, and reviewers will probably get a couple of weeks to test them before the launch, when they'll be able to release their findings.
We know iPhone sales are lacking. The demand for the high-end 7nm node is probably not as strong as expected.
AMD might get 7nm capacity cheaper now. TSMC is not blind. It's either "you lower the price so this is profitable for us" or "we don't care."
Or they simply don't sell as many B2B cards as predicted.
Another point: I'm not so sure those HBM2 modules are still as expensive as we think they are. Again, AMD can simply ask: do you want to sell this or not?
No way is AMD bringing this to market at a loss. They're not that profitable yet; they don't have the luxury of making such moves.
They're not lacking because of anything to do with 7nm; it's more that Apple is trying to boost their margins, so they increased prices. Plus, Apple likely already produced those chips, and few other 7nm chips are out, so declaring demand to already be low is premature to say the least.
Doubt that, since TSMC knows they're AMD's only option for this now, and many of their deals were worked out well in advance of any of this.
No, they're still expensive, especially since GDDR6 is, I think, marginally more expensive than GDDR5 was. The interposer costs likely didn't go down compared to Vega 10 (likely the opposite, since it's supporting twice the channels). They're not so costly that AMD can't make products like this, but it'll eat into their margins for sure. And it's not like they could've just slapped GDDR6 on there; reworking the chip for that would've been too costly.
I would hope not, but I also doubt there's much room to drop the price either. I think this is just an opportunistic product: yields and/or demand of Vega 20 in the enterprise space left AMD with extra chips, and Nvidia's pricing made it so AMD could throw gamers something to tide them over. It should still beat Navi, though I doubt it'll be by terribly much, and at nearly 3x Navi's expected price you'd be paying a high premium. And now that Nvidia supports non-G-Sync adaptive sync displays, you're not locked into an AMD GPU if you have a FreeSync display. I'd be really interested in how many sales they actually get from this.
I just hope that Navi is coming sooner rather than later, and that it's at least Vega 64 level of performance for less than $300. That's the market that I, and most of the people I know, are in.
Same performance, no space invaders, double the RAM. It makes the 2080 the most useless and laughable card of the RTX lineup :>
Is it the same performance (isn't it averaging ~10% less)? I'm not sure how much of a concern that is; we don't know if Radeon VII might have issues of its own. I seem to recall some early Vega 64 cards getting memory corruption because the HBM stacks were different heights and the heatsink was flat, so it wasn't making good contact. I might be wrong that it was memory corruption, but there were some issues with regard to the memory and heatsinks.
The extra RAM and bandwidth are definitely nice, and alone I think would be worth it, but I don't value RTX at all right now myself. I don't really agree with that, though. It's nice there's an option if you have that much to spend but are turned off by Nvidia's tactics. The thing is, I'm not sure it's a good message to say "eff you and your prices" and then go spend that money on an equally mediocre perf/$ product from their competitor.
On top of that, Vega 20 might actually be worse in perf/W than RTX despite being on a superior process. I have a hunch you'll be able to manually adjust voltages down and improve things a good amount, but I'm speaking of stock behavior; I haven't seen whether RTX owners can do similar.
Unless Radeon VII lets them enable their NGG fast path (not gonna happen; AMD has said they stopped developing that stuff) and massively increases geometry throughput, it's an OK card in the current market, but nothing more. It doesn't meaningfully push any parameter (outright perf, perf/$, perf/W), and it's expensive. To me, it's a mediocre product. Certainly, some people can make the case for buying it, but I can't, and I wouldn't even if I could.