Well, if the gains are that small going from 210 W to 295 W, it really makes you wonder why on earth they're shipping a version consuming ~350 W...
1 test in 1 game. It requires a lot of thorough testing to determine overall power usage and performance characteristics. It's certainly intriguing, though.
You really need a lot of talent to do this: 73% more transistors than GP104 (GTX 1080) to get the same gaming performance, more than a year later and at almost double the TDP.
Really, it's mind-boggling how they managed this while having both 14nm FinFET and HBM2 at their disposal.
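The "73% more transistors" and "almost double the TDP" claims can be sanity-checked with a quick calculation. The spec figures below are publicly reported numbers, not from the comment itself, so treat them as assumptions:

```python
# Rough sanity check of the transistor/TDP comparison above.
# Assumed public specs (not stated in the comment):
#   GP104 (GTX 1080): ~7.2 billion transistors, 180 W TDP
#   Vega 10 (RX Vega 64): ~12.5 billion transistors, 295 W TDP
gp104_transistors = 7.2e9
vega10_transistors = 12.5e9
gp104_tdp_w = 180
vega64_tdp_w = 295

transistor_increase = vega10_transistors / gp104_transistors - 1
tdp_ratio = vega64_tdp_w / gp104_tdp_w

print(f"Transistor increase: {transistor_increase:.0%}")  # ~74%, in line with the "73% more" figure
print(f"TDP ratio: {tdp_ratio:.2f}x")                     # ~1.64x on paper TDPs
```

The 1.64x TDP ratio only becomes "almost double" if the higher-power Vega board configurations (~345 W) are compared against the GTX 1080's 180 W.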
Yay for stagnating game development APIs! Aren't they fantastic? It's actually amazing how poorly their strategy played out, and big props to Nvidia for countering it.
This is what I'm interested in... I've been holding off upgrading my launch GTX 970. I also want to replace my monitor at some point and want *sync, so it will likely be either Vega or Volta. If I don't like the Vega results enough, I may as well buy a G-Sync monitor now!
I want to see more 1070 vs Vega 56 benchmarks... if Vega 56 ends up a better performer than a 1070, then that's the real story here. 1070+ performance is enough GPU power for 99% of gamers.
It might make Nvidia push down the price of the 1070 and make room for a 1070 Ti (spec'd like the mobile version, plus higher clocks). Vega 56 is looking like a solid card for the price, power, and performance.
Remember how Fury X had minimal gain over the regular Fury?
Also, someone was saying Vega 56 is faster than the GTX 1070 and rather close to the GTX 1080. That may mean Vega 64 and higher versions don't offer much gain, just like with Fury.
You can't blame any one factor for it. They must have been scrambling; like when you're panicking, everything is a blur and nothing works out.
They must have seriously miscalculated at some point: project screw-ups, plus probably assuming the GTX 1080 would be Nvidia's top chip. If it were, they'd be better off. But then Nvidia got a Ti out, meaning they have room to cut 1080 prices if they want to, and even if AMD matches the 1080, they won't be shoulder to shoulder with the halo product.
Here's to hoping the bottom falls out of the cryptocurrency market soon.
Vega is the architecture that has to compete with the GTX 1080, GTX 1080 Ti, and GV100 in different markets.

I think AMD did what they did with Vega on purpose. They built the card for both the gaming and pro spaces, probably because they had to save money. So what we'll get is a card that games well and does some great things in the pro space. It won't unseat Nvidia's gaming GPUs; Vega is a multipurpose card. AMD is smartly putting a lot of emphasis on the workstation and server world with Vega. That's smart because AMD has never made a dime on gaming GPUs in the past, and now, with AI and the data center, they see an opportunity to actually make money, which is what it's all about. Vega will succeed thanks to attach rate: when AMD sells a CPU, over 50% of the time an AMD GPU goes in the build. The only issue was that, in the past, nobody was buying AMD CPUs.
Well, with a couple of tweaks to a Monero miner and a constant underclock of 1.3GHz, a Vega Frontier Edition card hits 1.16kH/s. That is 34% faster than a GeForce GTX 1080 Ti, and 43% faster than a single Radeon RX 580 8GB. If the gaming-focused version of Vega, Vega RX, hits those same numbers at stock frequencies and is priced to compete with the GeForce GTX 1080 (which only hits 0.48kH/s), then we are likely going to experience another shortage of high-end GPUs. Mining with it is almost twice as profitable as a GTX 1080 on Monero, and the card would pay for itself in about half the time as well if priced to sell at $499.
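The payback claim above can be sketched with simple ratio arithmetic. The hashrates are the comment's own Monero numbers; the $499 price is the comment's hypothetical "priced to sell" scenario, applied to both cards for comparison:

```python
# Sketch of the mining arithmetic in the comment above.
# Hashrates come from the comment; equal pricing is an assumption.
vega_khs = 1.16        # Vega Frontier Edition, tweaked miner, 1.3 GHz underclock
gtx1080_khs = 0.48     # GeForce GTX 1080, per the comment
gtx1080ti_khs = vega_khs / 1.34  # "34% faster than a 1080 Ti" implies ~0.87 kH/s
price_usd = 499.0      # hypothetical price applied to both cards

# Mining revenue scales with hashrate, so payback time scales with price / hashrate.
vega_payback = price_usd / vega_khs
gtx1080_payback = price_usd / gtx1080_khs

print(f"Vega hashrate vs GTX 1080: {vega_khs / gtx1080_khs:.2f}x")
print(f"Vega payback time vs GTX 1080: {vega_payback / gtx1080_payback:.0%}")
```

Ignoring power draw, this comes out nearer 40% of the 1080's payback time than 50%; factoring in electricity costs (Vega draws considerably more) pushes it toward the "about half" in the comment.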
Can't be... AMD marketing only does minimum framerates these days. 😉

I don't think it is fabrication, but those are pretty much the most favorable, cherry-picked benchmarks for AMD. These are the ones AMD chooses for marketing because GCN cards fly on them, so it is almost certainly an AMD "leak" for marketing purposes.