Ya, that makes NV cards even worse value, since they depreciate like crazy on the used market if you don't sell them at the right time. The GTX280 and 480 both lost an outrageous amount of resale value within 2 years. There are people in Japan who picked up a GTX580 3GB for less than $60 USD this month.
http://img831.imageshack.us/img831/7454/gtx5803gb.jpg
http://www.xe.com/ucc/convert/?Amount=5300&From=JPY&To=USD
That means in his country your GTX580 3GB x 2 would have lost $880 USD over 2 years of ownership.
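For what it's worth, here's the back-of-envelope math behind those numbers. The ~88 JPY/USD rate and the ~$500 paid per card at launch are my own assumptions; they're just what lines up with the figures above:

[code]
# Rough depreciation math -- a sketch, not exact figures.
jpy_price = 5300          # used GTX580 3GB price from the listing above
jpy_per_usd = 88          # assumed early-2013 exchange rate
launch_price_usd = 500    # assumed price paid per card ~2 years ago

resale_usd = jpy_price / jpy_per_usd            # ~$60
loss_per_card = launch_price_usd - resale_usd   # ~$440
loss_for_pair = 2 * loss_per_card               # ~$880 over 2 years

print(f"resale ${resale_usd:.0f}, loss per card ${loss_per_card:.0f}, pair ${loss_for_pair:.0f}")
[/code]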
So spending $1,800 on a pair of 6GB GPUs for a potential benefit in games in 2014-2015? Ya, I'd rather pick up 20nm Volcanic Islands or Maxwell (or their refreshes) once next-gen games actually launch and start using more than 3GB of VRAM, and save roughly $800 in the process while playing console ports for most of 2013. A pair of $499 GPUs in 2014-2015 will mop the floor with $1,800 worth of Titans. You'd be buying $1,800 worth of GPUs to max out 2013 console ports like DMC, Dead Space 3, Bioshock Infinite and Tomb Raider? Crysis 3 and Metro LL may be the only 2 games in all of 2013 that could push the boundaries, but that's just 2 games. After what happened with the 7800GTX 512MB, 8800GTX / Ultra, GTX280 and 480, I would never recommend anyone spend $1,800 to "future-proof". I can understand going from a $250 GPU to a $400 one, but $1,800 to future-proof with the 20nm gen out next year? To each his own I suppose.
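And the savings math, using the round numbers from the post (the $499 next-gen price is obviously a guess):

[code]
# Sketch of the cost comparison -- prices are the round numbers from the post.
titan_pair = 1800            # two 6GB Titans today
next_gen_pair = 2 * 499      # two assumed $499 20nm cards in 2014-2015
savings = titan_pair - next_gen_pair
print(f"next-gen pair ${next_gen_pair}, saved by waiting ~${savings}")   # ~$800
[/code]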
Which games?
The HD7970 3GB beats the GTX680 4GB in every single game at 7680x1600 in this review. If 3GB isn't a bottleneck even at 7680x1600, where would it be? (There's a rough VRAM estimate after the benchmark links below.) I don't even think 2 Titans would be fast enough for 3 x 2560x1600 monitors with 8AA.
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/BF3_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/AW_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Crysis_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Deus_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Dirt_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/MOHW_02.png
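To put rough numbers on why VRAM gets tight up there, here's a very crude render-target estimate for 7680x1600 with 8x MSAA. It only counts the main color/depth buffers and a resolve target, so real usage with textures, shadow maps, extra render targets and driver overhead is much higher; treat it purely as a lower bound:

[code]
# Very rough render-target VRAM estimate at 7680x1600 with 8x MSAA.
width, height = 7680, 1600
bpp = 4                                   # 32-bit color
msaa = 8

color   = width * height * bpp * msaa     # multisampled color buffer
depth   = width * height * 4 * msaa       # 32-bit depth/stencil, multisampled
resolve = width * height * bpp            # resolved back buffer

total_mb = (color + depth + resolve) / 1024**2
print(f"~{total_mb:.0f} MB just for the main render targets")   # roughly 800 MB before textures
[/code]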
"That said, we have to wonder why anyone would bother with the GeForce GTX 680
4GB for extreme resolutions when the Radeon HD 7970 GHz Edition was constantly faster at both 5040x1050 and
7680x1600. In fact at 7680x1600 the 7970 GHz Edition was on average 20% faster than the GeForce GTX 680 4GB in the half dozen games that we tested with."