All I've seen in this thread to support the mainframe argument are instances of corporations saving money... which is dubious when they're replacing all of their current hardware with >$500k mainframes and either retraining all their support staff or hiring new employees. And if you hire new people who know mainframes as well as your old client-server people knew their stuff, you pay big $$, because there aren't that many of them around. Maybe total operating expenses go down, but total IT expenses for at least the next year would be much higher than in previous years.
The thing is, mainframe computing didn't go the way of the dinosaur just because it became cheaper to give everyone their own box. There's huge momentum against that kind of change... just think about what a CS education was like back in mainframe computing's heyday. You learned how to program mainframes, IT people learned how to maintain mainframes... and there was no "Internet" as it exists today. An IT person's main sources of knowledge were magazines, books, and other people. You couldn't say "hey, this client-server thing looks promising, what are the details?", go to yahoo.com, and search for client-server to read all the pros and cons. When all your senior people are trained in "the old way", it takes a lot of time and effort for the new people to convince them to move to a new architecture, especially one with limited support and limited software. And despite all these obstacles, client-server computing replaced mainframes in many businesses for many tasks.
It's also important to note that through the whole shift from mainframe/dumb-terminal to client-server, mainframes have always been more reliable, robust, and manageable than even the biggest, most refined servers. It's not as if mainframes now offer something they haven't always offered... mainframe companies (IBM) didn't stop making mainframes 10 years ago and start again last year. They've been there all along, "lower total cost" and all.
You can't take your dumb terminal home like you can a laptop.
And my final point: these mainframes wouldn't run Windows. So to run all the applications "you really need", like Office, you would have to emulate Windows. Windows can hardly run stably outside of an emulator, let alone inside one.
I'm not saying mainframe computing is bad, or that it doesn't make sense for anyone, or that it's somehow wholly inferior... I definitely think there's a place for it, and it will coexist with the client-server architecture. I'm just saying it won't come back en masse and replace the client-server architecture that's in wide use today.