I'd estimate that such GPUs will be here in about two years, give or take. The real problem is always resolution, though. It's possible to hit 60+ FPS in DX10 with 4xAA and 16xAF, but you have to play below 1280x1024, which many enthusiasts either refuse to do or simply can't, since their LCD's native resolution is something along the lines of... well, of too much for Crysis to run at 60 FPS.
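Just to put rough numbers on that (my own back-of-the-envelope sketch, assuming frame rate scales inversely with pixel count, which is only approximately true since games are never purely fill-rate-bound):

    # Back-of-envelope: how much more work common native LCD resolutions
    # demand compared to 1280x1024 (illustrative numbers, not benchmarks).
    base_w, base_h = 1280, 1024
    base_pixels = base_w * base_h

    for w, h in [(1280, 1024), (1680, 1050), (1920, 1200), (2560, 1600)]:
        pixels = w * h
        # Assume ~60 FPS at the base resolution and purely pixel-bound rendering.
        est_fps = 60 * base_pixels / pixels
        print(f"{w}x{h}: {pixels / base_pixels:.2f}x the pixels, ~{est_fps:.0f} FPS")

So even the step up to a common 1680x1050 panel drops a borderline 60 FPS into the mid-40s on paper, and a 30-inch screen at 2560x1600 is pushing over three times the pixels.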
The only way I can think of to pull that off at high resolutions on today's LCDs is Tri-SLI, Quad-SLI, or CrossFire/Quad-CrossFire using the X2 ATi variants (two 4870X2s, or even four). But such systems are well beyond even the most dedicated enthusiasts' financial means. Those who have such setups usually get their cards free from sponsors or some such group, or they just happen to have a money-growing tree in their backyard.
Basically, Crysis just happens to be a demonstration (perhaps not on purpose, who knows) of what's going to be the standard two or three years from now, perhaps even on the next generation of consoles too. And I do remember reading that Crytek plans on increasing the graphics quality dramatically over the next Crysis titles (it's a trilogy, let's not forget, and Warhead doesn't count, since what I read was referring to the big releases, the true sequels to the original); they talked about "cinematic quality levels."
I can hardly imagine anything better-looking than Crysis, but then I said the same about Perfect Dark back in the day, and again about Halo, then Half-Life 2, then Far Cry... it keeps happening over the years; we always end up reaching new levels of "realism" in graphics quality and even perception. The $1000 question is: where will it stop?