That is not true. Games are written to run on the majority of video cards on the market, so they budget for a certain amount of memory. Unlike system RAM, VRAM has no page file to spill into, so a game must not go overboard with it. Having said that, newer games will require more memory, so more memory means more future-proofing.
Now, if your resolution is 2560×1600, then you more or less need 2 GB of VRAM. That resolution may come from a multi-display setup or a single high-resolution display. Neither is far from the norm, assuming you will keep your card for more than a year.
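As a rough back-of-the-envelope check (my own sketch, assuming 32-bit color and a handful of render buffers; real games use far more VRAM for textures and geometry), the raw buffer cost scales with pixel count:

```python
# Rough framebuffer math: bytes = width * height * bytes_per_pixel.
# These figures cover only the render/display buffers, not textures,
# so treat them as a floor, not an estimate of total VRAM use.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate MB for 'buffers' render targets (e.g. front,
    back, and depth buffers) at 32-bit color depth."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for w, h in [(1680, 1050), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB")
```

Note how 2560×1600 carries more than twice the pixels of 1680×1050, and every per-pixel buffer and screen-resolution texture scales with it. That is why the big displays push you toward the bigger VRAM tier.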
A lot of eye candy requires data to be fed into VRAM at the loading screen, meaning the more VRAM you have, the more room there is for eye candy. Current games typically use around 768 MB at 1680×1050 at max settings, but future games (2010) will probably use around 1.5 GB at 1680×1050 at max settings.
The real problem with using more memory isn't the hardware, but the software. Many people are still on 32-bit Windows, and a 32-bit program has a soft cap at 2 GB (lots of PC setups already fail at that usage) and a hard cap at 4 GB (which won't run on a 32-bit setup at all). 64-bit apps aren't far off, and by then 16 GB of RAM won't seem like too much anymore. If engineers can figure out a way to split the work across multiple 32-bit processes, a 32-bit app may effectively use more than 2 GB, allowing richer graphics and better use of multiple cores. We really should have said bye-bye to 32-bit OSes and hello to 64-bit OSes two years ago.
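The caps above come straight from pointer width. A quick sketch of the arithmetic (the 2 GB figure is the default 32-bit Windows user/kernel address-space split; that default is the assumption here):

```python
# A 32-bit pointer can address 2**32 bytes = 4 GiB total.
# That is the hard cap: no 32-bit process can even name an
# address beyond it.
total_32bit = 2 ** 32
print(total_32bit // 2 ** 30, "GiB addressable by a 32-bit pointer")

# Default 32-bit Windows splits that space in half: 2 GiB for the
# application, 2 GiB reserved for the kernel -- the 2 GB soft cap.
user_space = total_32bit // 2
print(user_space // 2 ** 30, "GiB usable by a 32-bit app by default")

# A 64-bit pointer addresses 2**64 bytes, so by comparison
# 16 GB of RAM barely registers.
print(2 ** 64 // 2 ** 30, "GiB addressable by a 64-bit pointer")
```

This is also why spreading work across several 32-bit processes can sidestep the 2 GB cap: each process gets its own private address space, even though no single one can grow past the limit.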
However, if you really have to buy a new card today and you don't have an Nvidia 3D Vision setup, then you should consider ATI, as it is more future-proof than any Nvidia product at the moment. Buying a 2 GB HD 4850 makes more sense than a 2 GB GTX 285 because of tessellation and DirectCompute; those are the things that eat VRAM. Yes, we don't use them now, but we probably will in a year or so.