I'm resigned to the fact that both Nvidia and AMD are going to charge significantly more per mm^2 for these chips; I just hope that what you're saying happens. I mean, it has to, right?
Who knows, given this new rumor that GTX700/HD8000 may be delayed to Q4 2013. Could this mean Titan is just a limited collector's-edition enthusiast card and NV/AMD will keep selling the HD7970/GTX680 for at least 7-8 more months through the summer?
Personally, if AMD released an HD8970 or even HD8990 with substantially more SPs, I would upgrade for bitcoin mining. The longer they delay the HD8000 series, though, the less likely I am to buy one, because by then ASICs will likely have driven difficulty so high that GPU mining is no longer worth it (see the rough sketch below). If I then have to lay out $800-900 of my own money for next-gen GPUs, I'll judge the value of upgrading by the number of demanding games and their quality in 2013. I don't have a 2560x1440/1600 monitor, and 120 fps is not something I care about.
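For anyone curious why rising difficulty kills GPU mining, here's a minimal back-of-the-envelope sketch in Python. The hashrate, difficulty, and block reward values below are placeholder assumptions for illustration, not actual figures:

# Rough expected-earnings estimate for GPU bitcoin mining.
# All numbers below are placeholder assumptions, not real figures.

BLOCK_REWARD_BTC = 25.0   # assumed block subsidy (early-2013 era)
SECONDS_PER_DAY = 86_400

def btc_per_day(hashrate_mhs: float, difficulty: float) -> float:
    """Expected BTC mined per day at a given hashrate (MH/s) and difficulty."""
    hashes_per_day = hashrate_mhs * 1e6 * SECONDS_PER_DAY
    # Expected number of hashes needed to find one block = difficulty * 2^32
    blocks_per_day = hashes_per_day / (difficulty * 2**32)
    return blocks_per_day * BLOCK_REWARD_BTC

# Example: one high-end GPU (assumed ~650 MH/s) at two hypothetical
# difficulty levels, the second 10x the first
for diff in (3e6, 3e7):
    print(f"difficulty {diff:.0e}: {btc_per_day(650, diff):.4f} BTC/day")

The point being, expected earnings fall linearly as difficulty rises, so a 10x difficulty jump from ASICs cuts a GPU's daily take by 10x while its power bill stays exactly the same.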
So far Crysis 3's preliminary graphics aren't impressing me personally. I can also probably live with VHQ and no MSAA for that game rather than dropping $900 on the Titan just for 4xMSAA/SMAA. I can't see how games like Bioshock Infinite, Dead Space 3, Aliens: Colonial Marines, Starcraft II: HotS, Tomb Raider, or Company of Heroes 2 will give my card problems. Metro: Last Light? Not sure when that's launching given THQ's bankruptcy. Rockstar just said GTA V is delayed to Sept 17, with no news on the PC version. Star Wars 1313 and Watch Dogs seem to be aimed at next-gen consoles, or Q4 2013 at the earliest.
The way things are looking, I'm most likely skipping this generation and upgrading when 20nm GPUs arrive, as I honestly expect a $499 Maxwell to offer performance similar to the GTX690/Titan. Maxwell's architecture also sounds very impressive from a compute point of view. Once bitcoin mining dies, I'll shift my GPU back to distributed computing projects, in which case Maxwell's astonishing projected GFLOPs/watt gains over the Fermi/Kepler generations are something I'm interested in. Maybe I'm getting older, but with CPUs becoming less and less interesting, I'd like to use the GPU for things outside of gaming. Granted, I'm assuming 20nm parts don't get delayed to 2015.