:thumbsup: Only a fraction of PC gamers are even gaming at 2560x1440 or above. There's no point at all in even talking about 7680x4320 for another 5-10 years.
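For some quick perspective on just how far off 7680x4320 is, here's some back-of-the-envelope pixel math (a rough sketch in Python; the naive assumption that rendering cost scales with pixel count is mine, not a benchmark):

```python
# Quick pixel-count math (pure arithmetic, no benchmark data), naively
# assuming rendering cost scales roughly with the number of pixels pushed.
resolutions = {
    "1920x1080":      (1920, 1080),
    "2560x1440":      (2560, 1440),
    "3840x2160 (4K)": (3840, 2160),
    "7680x4320 (8K)": (7680, 4320),
}

base = 2560 * 1440  # compare everything against 2560x1440
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:4.1f} MP, {px / base:.2f}x the pixels of 2560x1440")
```

8K works out to roughly 33 megapixels, about 9x the pixels of 2560x1440. A GPU that handles 1440p comfortably today would be hopeless at it.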
Right now a 32-inch 4K monitor costs as much as a car.
http://www.engadget.com/2012/06/05/viewsonic-vp3280-led-4k-monitor-hands-on/
I will be upgrading to a 37-inch 4K LCD/LED/OLED/Plasma for gaming when it hits $1,000-1,500. I can't see that happening for at least 5-6 years, maybe more.
Also, people are forgetting that right now we are at the most distorted point in PC gaming graphics. GPUs are 15-20x faster than the RSX/R500 in the PS3/360, but graphics have hardly improved by a factor of 2x, maybe 3x with mods, texture packs, etc. Once the next generation of consoles launches by the end of 2013/2014, I expect DX11 to really take off and graphics to take a huge leap. That will hammer video cards, and once again we'll be scrambling to upgrade. That's just my personal opinion, and it may be entirely wrong if the next-generation consoles are weak.
Either way, as PC graphics become more complex, our GPUs will start choking even at 2560x1600. The industry goes in cycles. Since right now we are at the end of a console generation that launched in 2005-2006, it appears that we have ample GPU power in reserve. But the next-generation lighting and shadow effects in games such as Dirt Showdown, Sniper Elite and Sleeping Dogs tell me that all those new graphical features, which improve image quality only a little, are going to start costing us A TON of performance.
It makes sense if you think about it. Going from textured hair to 50,000 individual hair strands all reacting to wind requires something like a 10x increase in GPU horsepower.
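To put some rough numbers on that intuition (every per-strand figure below is a made-up assumption for illustration, not a measurement from any real engine):

```python
# Made-up illustration of why per-strand hair simulation is expensive.
# Every number here is an assumption for the sake of the example.
card_verts = 2_000             # a handful of textured "hair card" quads (assumed)
strands = 50_000               # individually simulated strands
segments_per_strand = 16       # chain links per strand (assumed)

sim_verts = strands * segments_per_strand
print(f"simulated hair vertices per frame: {sim_verts:,}")                 # 800,000
print(f"geometry to animate vs. hair cards: ~{sim_verts // card_verts}x")  # ~400x
```

Even before wind physics, collisions and self-shadowing are factored in, the raw geometry count explodes, so a 10x hit to GPU horsepower is, if anything, optimistic.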
Imagine GTA VI with a world and streets 10x more populated than GTA V's. It'll bring a GTX 680 down to 5 fps in no time at all. We'll get there eventually.