AMD Aims To Give OpenGL A Big Boost, “API Won’t Be The Bottleneck”


StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
When has ATI/AMD "shitty" OpenGL ever actually mattered? That crap game called Doom 3 ten years ago?
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
When has ATI/AMD "shitty" OpenGL ever actually mattered? That crap game called Doom 3 ten years ago?

It matters now because Apple and mobile have significant marketshare, and those use OpenGL. Both for developers targeting those platforms, and for future products using AMD hardware, it's important that AMD have good OpenGL support.

For example, AMD chips in tablets. Intel has a hell of a lead on AMD there... their drivers are actually good. :whistle:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Define pros.

Pixar has been using GPU-based tessellation for hair and CUDA rendering for so long that it's more or less an accepted standard. It's no secret that DreamWorks has been partnered with Intel for a long time, but even they have started using GPU-based OpenGL rendering in house.

That aside, accelerated viewports for editing aren't really much of a discussion on the tech relevance side because most of the features are driver locked to sell expensive professional cards.

So Pixar isn't using a CPU based render farm? I can't find a source for that. Maybe you can for me?

As far as Mantle for the viewport goes, I know how it has been done. I guess it just depends on whether or not AMD wants to change the business model and try offering a lower-cost solution using Mantle. It's not like there isn't a market there if they decide to exploit it.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
So Pixar isn't using a CPU based render farm? I can't find a source for that. Maybe you can for me?

As far as I know they're still using CPUs for their render farms, but GPUs are pushing ahead and will eventually replace them for the whole pipeline, including the final render.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
As far as I know they're still using CPUs for their render farms but GPUs are pushing ahead and eventually replace them for the whole pipeline and final render.

I'm by no means an expert, but I know a little bit about 3D rendering from making game models. From what I understand, not all of the calculations for rendering are massively parallel. Seems like APUs would be the ideal hardware, sometime in the future.
 

taserbro

Senior member
Jun 3, 2010
216
0
76
So Pixar isn't using a CPU based render farm? I can't find a source for that. Maybe you can for me?

As far as Mantle for the viewport goes, I know how it has been done. I guess it just depends on whether or not AMD wants to change the business model and try offering a lower-cost solution using Mantle. It's not like there isn't a market there if they decide to exploit it.

RenderMan has had GPU acceleration support for a long time, but that doesn't mean they don't use any CPU instructions in the production pipeline right now, nor is that relevant. My knowledge is mostly from being an animation geek who attended presentations and watched the DVD commentaries and behind-the-scenes features for almost all the animated films during the recent golden age of CGI. But even if your currency is a Google link, here's one only three keywords away.

Pixar and Disney have had GPU render farms as far back as 2008, during the production of Bolt, which is where I first heard of it. Obviously at first it was to allow partial rendered quality on crucial portions of scenes, which let artists waste less time getting the look they were after, and that's only what they were showing to journalists at the time. How many shaders are GPU-render compatible now, who knows.

What I do know is that they've given a presentation at SIGGRAPH where they were rendering a scene in real time at production quality with what they described as CUDA code. Make of that what you will; they won't be disclosing their technical secrets just to settle internet disputes, but what was shown was previously impossible to do with CPU-based rendering.

As far as what's possible at their professional-card price point, the pro market is almost completely inelastic. They could have done that at any time without sinking money into a Mantle implementation, and earned millions where they can earn billions. I highly doubt they would start now.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Shouldn't be hard to give a "big boost" considering how they currently perform in OpenGL.

And how do they perform? 7970GE beats 680/770 in many OpenGL games despite our forum perpetuating the myth that AMD has issues with OpenGL titles.


[Benchmark chart: Amnesia: A Machine for Pigs, 2560×1600]

http://gamegpu.ru/action-/-fps-/-tps/amnesia-a-machine-for-pigs.html

Wolfenstein - http://alienbabeltech.com/main/680-pt-two-overclocking/4/
[Benchmark chart: Wolfenstein]


One OpenGL title that ran poorly on AMD's cards was RAGE, but that had less to do with OpenGL and more to do with that game's texture prefetching, which screwed everything up on AMD cards.

I guess for those people who still play games from the 90s and want to mod them, this will come in handy.

When has ATI/AMD "shitty" OpenGL ever actually mattered? That crap game called Doom 3 ten years ago?

That's not the shocking part. The shocking part is AMD has been faster in OpenGL games since 5870 vs. 480. Wolfenstein and Amnesia I already linked. HD5870 > 480 in Quake 4 & in Prey too.

[Benchmark chart: Quake 4, 2560×1600]


[Benchmark chart: Prey, 2560×1600]


HD6970 creams the GTX580 in Wolfenstein (OGL title):

[Benchmark chart: Wolfenstein, HD 6970 vs. GTX 580]


It's impossible to conclude that AMD has "awful" OGL performance when their cards are winning in many OpenGL titles against the competition. AMD has issues in one OpenGL title: RAGE.

AMD wins in Quake 4, Prey, Amnesia, and Wolfenstein vs. RAGE for NV. Yup, AMD has awful OGL performance... :sneaky:

Either way someone first needs to make a good OGL modern game for anyone to care.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
OpenGL may not matter much for Windows, but SteamOS will be another story. The question on OpenGL has more to do with how well AMD's drivers work in Linux. I have never used Linux, so someone else will have to enlighten us.
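For anyone who does want to check this on Linux: `glxinfo` (from the mesa-utils package) reports the OpenGL vendor, renderer, and version strings the driver exposes, which tells you whether you're on AMD's proprietary driver or the open-source Mesa stack. A minimal sketch of pulling those fields out of its output; the sample text is made up for illustration, not taken from real hardware:

```python
import re

def parse_glxinfo(output):
    """Extract the OpenGL vendor/renderer/version strings from glxinfo output."""
    fields = {}
    for key in ("vendor", "renderer", "version"):
        m = re.search(r"OpenGL %s string: (.+)" % key, output)
        if m:
            fields[key] = m.group(1).strip()
    return fields

# Illustrative sample; on a real system run: glxinfo | grep "OpenGL"
sample = """\
OpenGL vendor string: X.Org
OpenGL renderer string: AMD Radeon HD 7970
OpenGL version string: 4.3 (Core Profile) Mesa
"""

print(parse_glxinfo(sample))
```

The version string is the quickest tell: a low GL version on new hardware usually means the open-source driver hasn't caught up yet.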