Yeah, price/perf. A metric always used by everyone except the market leader.
If NV released a Titan II with 20% more performance than Titan I/R9 290X, would you pay 3000 EUR for it? Please tell us more.
I bet 97% of the GPU market considers this metric. It doesn't mean they ignore other aspects such as performance/watt, game bundles, features, etc. If they didn't consider price/perf, the VGA forum would be a pretty empty place. We would all go out and buy 4 Titans and come back to the forums in 2 years when 4 new $1000 flagships arrive.
So, give me a reason why people should buy a new card now?
They were too busy with other things over the summer, like house renovations and travel, and didn't have time for games. Now that winter is approaching (gaming is probably a more popular activity for many people during the cold months in North America) and BF4 is here, people will consider upgrading their systems.
Another possibility is that they lacked the funds due to other personal commitments. Others may have planned a new Haswell system with a new GPU and held off until now because of Intel's USB 3.0 bug fix.
Further, not all gamers follow the GPU market closely. Some may be in school and are only now putting together a new system after receiving their student loans. Then there is holiday-season buying and gifting; I presume people buy new GPUs for family members too.
Maybe someone is a huge COH2 gamer?
There are many other reasons; for example, if the R9 290X beats the Titan in BF4.
They can wait another 8 months and get this kind of performance for half the price.
Source? Where is the guarantee that we'll have a 20nm chip with GTX 780/R9 290X performance for $299-325 in 8 months? I'd love to hear more.
I am skipping the R9 290X and 780 since the price/performance isn't sufficient for me to upgrade, but given the recent pace of GPU progress and the 20nm delays, it'll be a while before we can get this level of performance for $299. Not everyone already has a 7970/680 in their rig.
Of course, none of this has anything to do with Mantle. Care to explain why, when NV worked with developers for 10 years to optimize performance for its games via TWIMTBP, it was considered "excellent developer relationships", but when AMD takes it one step further and gives developers access to a lower-level API, it's considered "evil" and "unfair"? Really now?
Working more closely with developers, bundling more games, raising prices. That sounds so familiar. What other GPU company perfected this strategy before? :hmm:
If Mantle takes off, maybe NV will price its 20nm 550mm2 Maxwell at $499-549 instead of $1,000. I can see a lot of positives coming out of this if AMD's cards gain 20-30% from Mantle. Mantle will mean more competition, not less, since it will force NV to try that much harder.
Everyone is looking at GCN in both consoles and in PC gaming desktops, with the Mantle API available for all three platforms: PS4, XB One, and PC.
MS already stated they are committed to a 10-year life cycle for the XB1. If developers could squeeze even 20-30% more performance out of the XB1/PS4 via Mantle, you can bet some will try it. Since AAA games strive for higher production values, graphics, and performance, I bet the top studios and 1st-party developers on XB1/PS4 will definitely want to learn more about Mantle. Once they learn how to use Mantle to tap into the GCN hardware of the XB1/PS4, assuming it brings better performance, they will be better off: they can end up with a game hitting that magical 30 fps console gamers are OK with instead of a sluggish 23 fps, while delivering the level of graphics and number of NPCs on screen they desire.
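For what it's worth, here is my own back-of-the-envelope check (using the 20-30% range from above, not any official AMD figure): at 23 fps each frame takes ~43.5 ms, and a 30% speedup cuts that to ~33.5 ms, i.e. 23 x 1.3 ≈ 30 fps. So the gain Mantle would need to turn "sluggish" into a locked 30 sits exactly in that range.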