Aha! I knew my high performance supercomputing class would be good for something!
Distributed Computing is a very general term. It just means a problem is distributed over multiple CPUs.
High performance supercomputing (HPSC) has traditionally been done on a shared-memory machine, like a Cray or a Silicon Graphics supercomputer.
But recently, HPSC is being done more and more in a cluster computing environment. This means the CPUs have separate memories and are generally separate systems, but they're connected by a high-speed link. A Beowulf cluster is the classic example of this.
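To make the cluster idea concrete, here's a toy sketch in Python (real clusters would use something like MPI, not this). Each process plays the role of a node with its own private memory; data only moves over an explicit link (a pipe standing in for the network). The problem is scattered across the "nodes," and partial results are gathered back. All names here are made up for illustration.

```python
from multiprocessing import Process, Pipe

def node(conn):
    # A "node" with its own memory: data arrives only over the link.
    chunk = conn.recv()                    # receive a slice of the problem
    conn.send(sum(x * x for x in chunk))   # compute locally, send result back

def parallel_sum_squares(data, n_nodes=4):
    links, procs = [], []
    for rank in range(n_nodes):
        parent, child = Pipe()
        p = Process(target=node, args=(child,))
        p.start()
        parent.send(data[rank::n_nodes])   # scatter: strided slice per node
        links.append(parent)
        procs.append(p)
    total = sum(conn.recv() for conn in links)  # gather partial sums
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(parallel_sum_squares(list(range(100))))
```

The key point is that no node ever sees the whole dataset; cooperation happens through messages, which is exactly why the speed of the interconnect matters so much for cluster performance.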
SETI@Home is a type of grid computing. The computers are widely scattered and don't have a dedicated link to cooperate in real time. Computers are assigned work units that they can take a while to complete on their own, and then return. In contrast, cluster computing usually involves all the systems working together and sharing data in real time.
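The grid model can be sketched too. The essential property is that each work unit is independent, so volunteers can finish in any order, at any time, and the coordinator just merges whatever comes back. This is a hypothetical illustration, not SETI@Home's actual protocol; the function names are invented.

```python
import random

def make_work_units(data, unit_size):
    """Split a big dataset into independent chunks ("work units")."""
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

def analyze(unit):
    # Stand-in for the real per-unit analysis (e.g., a signal search).
    return max(unit)

def grid_run(data, unit_size=8):
    units = make_work_units(data, unit_size)
    pending = list(enumerate(units))
    random.shuffle(pending)          # volunteers return results in arbitrary order
    results = {}
    for uid, unit in pending:
        results[uid] = analyze(unit) # each unit is computed independently
    # Merge once all units are back; completion order never mattered.
    return max(results.values())
```

Notice there's no real-time communication between workers at all, which is why grid computing tolerates slow, intermittent internet links that would cripple a cluster.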
But it's true, the lines are blurring on this. As more people and companies get high-speed internet, grid computing is getting closer to cluster computing. This new CERN grid, for instance, looks like it's very close to a giant cluster.