RussianSensation
So? You can easily do the math (which I did in my post) and see that the ChipHell chart does not line up with actual reviews. ComputerBase.de gives the Titan X a 1% lead in 18 games at 4K.
That's why we look at multiple sites. It doesn't sound like you spent the time reading various reviews, because you wouldn't be discounting the Sweclockers review if you had.
Sweclockers shows a 7% delta at 4K.
http://www.sweclockers.com/test/20523-nvidia-geforce-gtx-980-ti/19#content
So does TechSpot.
"The GeForce GTX 980 Ti certainly delivers on the performance front. From the 20 games tested, we found it to be a mere 7% slower than the Titan X at 2560x1600 and 3840x2160."
http://www.techspot.com/review/1011-nvidia-geforce-gtx-980-ti/page10.html
The idea that TPU and AT are some infallible, hailed-from-God websites that are always right while no other site should count doesn't hold up. Not to mention that neither AT nor TPU uses the most intense settings at 4K; TechSpot and Sweclockers do. If anything, the TechSpot and Sweclockers reviews are more relevant for comparing the most GPU-limited scenarios since they run everything maxed out, including 4xMSAA at 4K. AT and TPU skip 4xMSAA in a lot of their 4K benchmarks, which puts less stress on the GPU and doesn't allow the Titan X's shaders to be fully utilized.
Granted, the extra 7% on the Titan X doesn't make it that much more playable at 4K, but you completely discounted the Sweclockers charts just because.
Sorry, but that is disappointing, whether the card is 8" or 11.5". These top-of-the-line cards are just for bragging rights; AFAIK they account for less than 5% of total cards sold. Unless AMD releases a mainstream card which beats the 980 every which way but loose, IMHO it's too little, too late.
As mentioned by others, there should be air-cooled Fiji PRO and Fiji XT cards at some point. AMD will not field a line-up consisting of a $399 Hawaii rebrand and an $850 Fiji Fury card. Secondly, AMD cannot afford to throw out all the Fiji chips that don't yield perfectly. Think about it for a second: when has AMD ever released a line-up with such a major price disparity between its mid-range and flagship cards, without a cut-down flagship in between? That has never happened.
If they decide to just release a minor clock bump of the existing Hawaii chip instead, then it'll be a long 18 months while Nvidia continues to eat into their market share and AMD becomes a joke in the enthusiast community.
So hold on: if we add 10% performance to the R9 290 and R9 290X each, add HDMI 2.0, and double the VRAM to 8GB, then at $299 and $399 these cards are a gigantic fail against a $330 3.5GB 970 and a $499 GTX 980, according to you? So you are saying you'd personally pay $100 more for a 980, or buy the slower 3.5GB 970 instead?
Sorry, but if the R9 390X has 8GB of VRAM and comes in at 97-99% on this chart, the 980 at $499 is irrelevant.
The 980 uses 165W of power in the lowest possible case (its paper TDP).
Let's say the R9 390X would use 275W of power, or 110W more.
20 hours of gaming a week x 52 weeks in a year x 15 cents per kWh x 110W power delta = $17.16 per year
30 hours of gaming a week gives us $25.74 a year.
At 20 hours of gaming per week, years to break even = $100 / $17.16 = 5.8 years.
At 30 hours of gaming per week, it's 3.9 years to break even!
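Here's that break-even math as a quick Python sketch, using only the assumptions above (110W paper delta, 15 cents/kWh, $100 price gap), not measured figures:

```python
# Break-even math from the post above. The 110 W delta, 15 cents/kWh and
# $100 price gap are the same assumptions as in the text, not measurements.

def yearly_extra_cost(hours_per_week, delta_watts, cents_per_kwh=15):
    """Extra electricity cost per year for a given power delta."""
    kwh_per_year = hours_per_week * 52 * delta_watts / 1000.0
    return kwh_per_year * cents_per_kwh / 100.0

def breakeven_years(price_gap, hours_per_week, delta_watts, cents_per_kwh=15):
    """Years of gaming before the extra electricity eats up the price gap."""
    return price_gap / yearly_extra_cost(hours_per_week, delta_watts, cents_per_kwh)

for hours in (20, 30):
    cost = yearly_extra_cost(hours, 110)        # 110 W = 275 W - 165 W on paper
    years = breakeven_years(100, hours, 110)    # $100 = $499 980 vs. $399 390X
    print(f"{hours} h/week: ${cost:.2f}/year extra, {years:.1f} years to break even")
# 20 h/week: $17.16/year extra, 5.8 years to break even
# 30 h/week: $25.74/year extra, 3.9 years to break even
```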
However, that analysis just compared TDPs on paper, and we know for a fact that's NOT how it works in the real world. When you compare the total power usage of a gaming rig in games, paper TDPs are irrelevant.
With a Core i7 5960X system, R9 290X vs. 980, we get:
36W difference in Bioshock
42W in Metro
33W in Tomb Raider
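Plugging a real-world delta of roughly 40W (my assumed midpoint of the 33-42W measurements above) into the same math shows where the 10+ years figure comes from:

```python
# Same break-even math, but with a ~40 W real-world delta
# (assumed midpoint of the 33-42 W in-game measurements above).
hours_per_week = 20
delta_watts = 40
cents_per_kwh = 15
price_gap = 100           # $499 980 vs. $399 card

yearly = hours_per_week * 52 * delta_watts / 1000.0 * cents_per_kwh / 100.0
print(f"~${yearly:.2f} extra per year, {price_gap / yearly:.0f} years to break even")
# ~$6.24 extra per year, 16 years to break even
```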
Yeah, so it's going to take 10+ years just to break even on the electricity costs between a $499 980 and a $399 card with power usage similar to an R9 290X.
It looks like you need a much stronger argument for why a higher-clocked, 8GB GDDR5 $299 R9 390 and $399 390X would be the laughing stock of the gaming community... Also, why would anyone buy the slower 3.5GB 970 if there is a faster 8GB R9 390?
