PhonakV30
Senior member
- Oct 26, 2009
Power, shmower. Give me better than NVIDIA performance for less money: SOLD. I need to upgrade my GTX 970.
lol, you're dreaming. Then wait for 2019!
Power, shmower. Give me better than NVIDIA performance for less money: SOLD. I need to upgrade my GTX 970.
Power consumption, it matters.
I agree. I mean, imagine custom Vega 56 models - those will probably be some huge cards, with 3x 90mm fans, occupying 2.5 or even 3 slots, with a 250W TDP, and the price and performance of some custom GTX 1070 model. No one will buy such a card when nVidia has a much better offer, like the Zotac gtx-1070-amp-extreme, with the same performance, same board size, same TDP and same price. Oh
The power consumption is horrendous, especially for the reputed performance.
Their process is not optimal, but considering how well the GTX 1050 Ti, for example, clocks (and how efficient it is), it's definitely not the main problem.
The problem is twofold - architecture and process. AMD have a vastly inferior architecture for gaming, and the GF 14LPP process cannot compete with TSMC 16FF+. The most disturbing aspect is regressing over Fiji in perf/watt, perf/sq mm and perf/flop with respect to gaming. Vega is an unmitigated disaster 10 years after the HD2900XT.
Sent from my SM-G935V using Tapatalk
Anyone know what percentage of the market owns a Freesync display? I'd bet it's less than 1% overall, and probably about 10% in the market they are aiming for (high-end gaming). I think it's really weird to only show demos with Freesync when only a small percentage of the market owns a Freesync display (unless I'm way off and Freesync is way more popular than I think).
When I'm shopping for a card, I want one that can hit 60FPS minimum at the resolution I want to play at. Right now, that's 2560x1440. I don't care that Vega will look smoother than a 1080 with a Freesync display, I don't own a Freesync display and won't in the near future, just like most of the market. It's just useless information for most people trying to decide if they should wait for Vega or just go buy something from Nvidia. Show me numbers in a wide variety of games (not just Doom) and give me a reason to keep waiting.
Well, the official Zotac spec says its power consumption is 250W. Also, some other tests point out it consumes 80W more than the FE (which is ~150W as far as I know, and the i7-6850K used in the test consumes ~80W during gaming according to this one - though they put a 6800K in the chart, it is a 6850K).
You mean the exact same factory-OCed card that drew 277W total system power (~= Fury Nano) in a test system with a 5960X? Yeah, why don't I believe aftermarket Vega 56s will match that...
I guess I get why they are comparing to the Fury X, but... wasn't that a super niche card? Does anyone actually own one? Wasn't it rather hot, expensive (at first), and equipped with only 4GB of memory?
I feel like the comparisons for AMD buyers should be to the 390/480/580. The Fury X seems like a collector's item to me - a unique card to own, but it couldn't have sold all that much.
Explain, then, why Vega is performing as AMD touts it. Use logical reasons, and break the architecture down to the low level.
Can we now get past the "it's the drivers" excuse?
Here is an internal benchmark pitting Vega against Fury X
Yeah, not sure why that's something to tout when you're only getting a 20-30% increase over the Fury X. No details there, of course, and we'd need to wait for full reviews, but I assume this testing was with an air-cooled card that was downclocking a lot. At least I'd hope so.
Yeah, Vega seems like a misfire. AMD really ought to stay out of the ultra-high end and just focus on doing good "sweet spot" price cards.
What do you mean by DOA? If someone needs a card in the Vega 10 or GP104 range, should they buy neither? What are the other options?
Here is an internal benchmark pitting Vega against Fury X
The next display I buy will 100% have a sync feature - it's one of the very few technologies that has come out recently that actually feels like a leap forward and isn't related to VR. People who are spending $400+ on gaming GPUs should also be looking at other important ways to improve their gameplay experience. FreeSync/G-Sync is one of them. It would be foolish to ignore it.
I bet a lot of people have upgraded to these monitors; otherwise, manufacturers wouldn't be constantly pushing them and releasing new models. Both technologies are nearly universally lauded as excellent and a must-have for gamers these days.
