My attention is elsewhere now. I have bought something like $2000+ of high-end kitchenware in the last few months and hardly game anymore. (My gf is a foodie. And kitchenware holds its value a hell of a lot better than tech, that's for sure, so I am willing to spend more on quality parts.)
That said, yes, I am still running 3 x 22" tri-monitor, but almost all of the time in Extended Desktop mode. Three displays kicks ass and is much more useful than a single larger display for web-browsing, as the minimizing/maximizing is easy. But a single bigger display kicks ass for other applications such as photo-editing. Either setup is probably better than most dual-monitor setups, because it's nice to have either a single huge monitor with easy-to-use split screen (thank you Windows 7) or a centered monitor and two wing monitors, rather than whatever the heck people do with dual-monitor setups (split it down the middle? have one monitor centered and the other growing off the side of it like a tumor?).
Obviously the solution is to get 3 x 30" displays to get the best of both worlds, but I am not willing to spend that kind of money right now.
Anyway, to address the questions: yes, many games (TF2, Oblivion, the Left 4 Dead series, Fallout 3/NV, LOTRO, WoW, etc.) run great at 5040x1050 on even a 6850 1GB. A non-overclocked 6850 can run Fallout: New Vegas at that resolution, for instance, and overclocking gives even smoother framerates. A 6950 1GB would do even better.
5760x1080 is potentially a problem: that's 17.6% more pixels to push. But it's doable, and if you run into VRAM problems, just turn down AA a bit and leave textures on high (you could instead drop textures to medium and boost AA, but I think that usually looks worse). AA eats VRAM like candy. Dropping to 2x MSAA or, gasp, no MSAA isn't the end of the world, especially when you trade it for deeper immersion and the expanded peripheral vision that can translate into an edge in multiplayer games. Worst comes to worst, turn off the wing monitors and game on the center monitor at great framerates for the games you can't run in Eyefinity. I can run Crysis like a dream on my center monitor.
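For what it's worth, here's a quick back-of-the-envelope check on that pixel-count figure (just my own arithmetic, assuming a 3 x 1680x1050 group vs. a 3 x 1920x1080 group):

```python
# Sanity check on the "17.6% more pixels" claim: compare the two Eyefinity groups.
old_pixels = 3 * 1680 * 1050   # 5040x1050 -> 5,292,000 pixels
new_pixels = 3 * 1920 * 1080   # 5760x1080 -> 6,220,800 pixels

increase = (new_pixels / old_pixels - 1) * 100
print(f"{increase:.1f}% more pixels")  # ~17.6%
```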
IMHO, the whole 2GB futureproofing thing is ridiculous for anything less than 3 x 1920 x 1200 so long as you don't insist on having high AA. Futureproofing does not work well with quickly-depreciating assets such as video cards. Buy only what you need for the next 12 months. Heck, I don't even game much anymore, so I am considering downgrading to a 5770 until the 22nm GPUs come out (gaming on the center monitor, tri-monitor for websurfing). An oc'd 5770 still kicks ass at 1680x1050.
I probably sound like a shill for Eyefinity/Surround, but seriously, it's awesome. I had a 24" 1920x1200 Dell Ultrasharp and could have spent $$$ on a GTX 480 or 5870 or something to try to max out Metro 2033 or Crysis. Instead, I sold the Ultrasharp, got 3 supposedly crappy 22" Acers and a mid-high-end video card, and have not regretted it since. I massively boosted my computing experience outside of games with tri-monitor and had a blast with my most-played games (TF2 back then, Fallout NV and L4D2 now). I will sell the monitors later this year and upgrade, probably to 3 x 1080p IPS LED monitors to save energy and for Photoshop work, but for me there is no pressing need to get the highest-end GPUs until the new consoles come out and push PC graphics requirements up again.