The type of computational physics you are doing must not be very taxing if it only takes up a gig or two. Don't things like LARGE N-body gravitational models suck up tons of RAM? There are LOTS of applications out there, both commercial and academic, that will suck up as much RAM as you give them. One that comes to mind is an old hobby of mine from years back: raytracing. Other examples would be running VMs (as noted above), certain types of distributed computing, academic research, etc.
They do, if you do them inefficiently. What you do for large N-body calculations is a fast multipole algorithm: you collectively model the gravitational pull of a cluster of bodies that are close to one another as a single effective body. We use the same method for our electromagnetic codes. So I can solve a problem that has millions of unknowns but still only requires N log N CPU time and memory. If I did it the traditional way, it would be N^3, and my limit would be around 10,000 unknowns. Really though, memory usually isn't the limiting factor for me; it's CPU time.
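If it helps to picture the "single effective body" idea, here's a minimal sketch in Python of a Barnes-Hut-style tree code, a simpler cousin of the fast multipole method (it keeps only the monopole term, i.e. a cluster's total mass at its center of mass). The names, the 2-D setup, and the parameter values are illustrative choices of mine, not from the post above:

```python
# Illustrative Barnes-Hut-style tree code: distant clusters of bodies
# are replaced by a single effective body at their center of mass,
# cutting the pairwise O(N^2) force sum down to roughly O(N log N).
# All names and parameters here are illustrative, not from the post.
import random
import math

THETA = 0.5  # opening angle: smaller = more accurate, more work
EPS = 1e-3   # softening to avoid singularities at tiny separations

class Node:
    def __init__(self, cx, cy, half):
        self.cx, self.cy, self.half = cx, cy, half  # square cell
        self.mass = 0.0
        self.mx = self.my = 0.0       # mass-weighted position sums
        self.body = None              # (x, y, m) if leaf holding one body
        self.children = None          # four sub-cells once subdivided

    def insert(self, x, y, m):
        if self.body is None and self.children is None and self.mass == 0.0:
            self.body = (x, y, m)     # empty leaf: just store the body
        else:
            if self.children is None:
                self._subdivide()     # push the existing body down a level
                bx, by, bm = self.body
                self.body = None
                self._child_for(bx, by).insert(bx, by, bm)
            self._child_for(x, y).insert(x, y, m)
        self.mass += m
        self.mx += m * x
        self.my += m * y

    def _subdivide(self):
        h = self.half / 2.0
        self.children = [Node(self.cx - h, self.cy - h, h),
                         Node(self.cx + h, self.cy - h, h),
                         Node(self.cx - h, self.cy + h, h),
                         Node(self.cx + h, self.cy + h, h)]

    def _child_for(self, x, y):
        i = (1 if x > self.cx else 0) + (2 if y > self.cy else 0)
        return self.children[i]

    def force_on(self, x, y):
        """Approximate gravitational acceleration at (x, y), with G = 1."""
        if self.mass == 0.0:
            return 0.0, 0.0
        comx, comy = self.mx / self.mass, self.my / self.mass
        dx, dy = comx - x, comy - y
        dist = math.sqrt(dx * dx + dy * dy) + EPS
        # If the cell is far enough away (small opening angle), treat its
        # whole contents as one effective body at the center of mass.
        if self.children is None or (2 * self.half) / dist < THETA:
            f = self.mass / (dist ** 3)
            return f * dx, f * dy
        fx = fy = 0.0
        for c in self.children:
            cfx, cfy = c.force_on(x, y)
            fx += cfx
            fy += cfy
        return fx, fy

if __name__ == "__main__":
    random.seed(0)
    bodies = [(random.uniform(-1, 1), random.uniform(-1, 1), 1.0)
              for _ in range(2000)]
    root = Node(0.0, 0.0, 1.0)
    for x, y, m in bodies:
        root.insert(x, y, m)
    # One approximate force evaluation per body: roughly N log N work
    # overall, versus N^2 for the brute-force pairwise sum.
    accs = [root.force_on(x, y) for x, y, _ in bodies]
    print(accs[0])
```

The opening-angle test is what buys the N log N behavior: far-away cells are accepted wholesale instead of being walked body by body, so both the work and the memory footprint stay manageable as N grows. The real fast multipole method goes further by keeping higher-order multipole terms, which is why it holds up for the millions-of-unknowns problems mentioned above.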
It rarely comes down to something like, "Oh, I have 4 GB but I need 5 GB!" The problems are usually orders-of-magnitude issues: "Oh, I have 4 GB but I need 50 TB!"