They're extremely useful for very advanced simulations (e.g. weather, nuclear, IC engine analysis) and computationally intensive research in genetics, science, economics, statistics, and medicine.
I also think their role in genetics is going to be absolutely groundbreaking (remember Jurassic Park? Hypothetically speaking, that was never out of the question if you had enough computational power backing you up), and we're going to see some very interesting (or controversial, imo) ramifications of their computing power.
I don't know if this counts as "proof" of usefulness, but we basically have an IBM supercomputer at my work (theoretically capable of handling 30,000 simultaneous users), and we use it for very advanced real-time computation. As it is, we're reaching the limits of that machine's computing power running applications that save the company millions of dollars a year.