So Pixar isn't using a CPU-based render farm? I can't find a source for that. Maybe you can find one for me?
As for Mantle in the viewport, I know how it has been done so far. It just depends on whether AMD wants to change its business model and try offering a lower-cost solution using Mantle. It's not like there isn't a market there if they decide to exploit it.
RenderMan has had GPU acceleration support for a long time. That doesn't mean the production pipeline uses no CPU instructions at all right now, but that's not the point. My knowledge is mostly from being an animation geek who attended presentations and watched the DVD commentaries and behind-the-scenes features for almost every animated film of the recent golden age of CGI. But even if your currency is a Google link, here's one only three keywords away.
Pixar and Disney have had GPU render farms as far back as 2008, during the production of Bolt, which is where I first heard of it. At first it was obviously used to get partially rendered previews of crucial portions of scenes, which let artists waste less time dialing in the look they were after, and that's only what they were showing journalists at the time. How many shaders are GPU-render compatible now, who knows.
What I do know is that they gave a presentation at SIGGRAPH where they rendered a scene in real time at production quality on what they described as CUDA code. Make of that what you will; they won't disclose their technical secrets just to settle internet disputes, but what was shown was previously impossible with CPU-based rendering.
As for what's possible at their professional card price point, the pro market is almost completely price inelastic. They could have offered a cheaper option anytime without sinking money into a Mantle implementation, but why earn millions when they can earn billions? I highly doubt they would start now.