There are two clocks on a video card: the GPU (core) runs at one rated speed, and the memory runs at another.
For example... a Ti4200's GPU runs at 250 MHz. Its RAM runs at 512 MHz, but since that's double data rate (DDR), the true clock speed is half that: 256 MHz.
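If it helps, here's a quick back-of-the-envelope way to see the DDR thing in Python (the 256/512 numbers are just the ones from my example above, not gospel for every card):

    # DDR transfers data on both the rising and the falling edge of the clock,
    # so the "effective" speed is double the actual clock speed.
    actual_clock_mhz = 256                    # true memory clock from my Ti4200 example
    effective_mhz = actual_clock_mhz * 2      # what the marketing number reports
    print(f"{actual_clock_mhz} MHz DDR -> {effective_mhz} MHz effective")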
Without getting too technical, the 128-bit memory interface vs. the 256-bit one you're asking about is basically the difference between a 2-lane and a 4-lane highway: twice as wide, so twice as much data can move per clock.
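To put a rough number on the highway analogy: memory bandwidth is basically effective clock times bus width. A quick sketch (bandwidth_gb_s is just a helper name I made up, and I'm reusing my example clock; real cards vary):

    def bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
        # bytes/sec = transfers per second * bytes moved per transfer
        bytes_per_transfer = bus_width_bits / 8
        return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

    for bus in (128, 256):
        print(f"{bus}-bit bus @ 512 MHz effective: {bandwidth_gb_s(512, bus):.1f} GB/s")
    # 128-bit -> ~8.2 GB/s, 256-bit -> ~16.4 GB/s.
    # Double the lanes, double the traffic, same clock speed.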
I'm not really sure if that answers your question...