I think your GTX 680 still has a lot of life left, but the same could be said for your CPU. Your CPU is three generations behind, while your GPU is only last gen. I would be extremely tempted to go for a 5 GHz Ivy Bridge. Not just because it sounds badass, but also because it will last you at least five years at the current pace.
I don't think we will see high-end Maxwell cards until the end of the year (Q4 2014), and I don't think you have a reason to upgrade. A GTX 680 will easily last you at least another year or two, even if you have high standards.
I hear that, too. Some threads I started over the last two months chronicled a sudden desire to grab an IB-E 4930K (a ~$500+ processor) and an ASUS X79-Deluxe (a new board with a mature chipset) and build a new box. No delidding needed -- the "E" processors use indium solder under the heat spreader, like we're used to.
I was about to lay down a preliminary $1,800 in buckets-of-ducats just to scratch the itch of building a new system. Other folks here brought me to my senses: the X79 BOARD may be new with a mature BIOS, but the X79 chipset itself is old. The board supports the IB-E processor, but the chipset was originally designed around the SB-E core.
So, faced with the paradox that newer processors on smaller lithography are harder to overclock but bring better stock performance and native features, I decided to wait for Haswell-E.
Since the money seems to be burning a hole in my pocket, I'll just take about a fourth of it and get a replacement for my GTX 570: either a GTX 770 or even a 780 Ti. Stocks and flows: more money will pile up while I wait for the X99 chipset and for the motherboards and their BIOSes to get "debugged."
I just posted my latest observation in Vid Cards and Graphics about Intel HD 3000 (or later, I suppose), Lucid Virtu, and dGPU mode. I believe my system was slowed down by running both the HDTV and the desktop monitor off the GTX 570 in "dGPU mode" with the HD 3000 plus Lucid enabled. I noticed no difference from simply disabling the HD 3000 and turning Lucid "off." So I decided to switch to iGPU mode, where the displays hang off the motherboard outputs and Virtu passes the discrete card's frames through the integrated graphics. Wow! A real performance boost. And I'm wondering whether dGPU mode contributed to my occasional instability problem -- which, so far, seems to have disappeared.