Importance of GPGPU/Compute in future games

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
As the discussion got off-track in this thread, I thought I'd try and continue the discussion here.

GCN has been shown to be quite strong in compute/GPGPU performance - even beating GK110 in quite a few applications and synthetics. So far, this has only translated to games in titles like Dirt: Showdown and Grid 2, where GCN is quite far ahead of Nvidia (though possibly due to driver tweaks or GCN-specific advantages rather than raw compute performance?), as well as Civ V, where the benchmarked compute performance is much stronger than Kepler's yet the in-game performance is identical.

So, given that current games aren't really using compute much, yet GCN may be much stronger in compute performance, will this be more of an advantage in the future than it is now? I think so, given that the next-gen consoles are both GCN-based. This means optimizations like the ones Dirt and Grid use could give devs performance advantages on PS4, Xbox, and for 1/3-1/2 of PC gamers (just ballparking a guess). The PS4 is even reported to use 4 CUs specifically for increased GPGPU/compute performance. This would give further reason to rely on GPGPU in game engines.

Now, this doesn't just mean "GCN is the best compute ever and AMD will always be superior". If Nvidia comes out with a superior compute architecture, they could easily take the lead in the future (maybe the GTX 400/500 series will even gain headway versus the newer architectures? Doubtful, because drivers make a big difference, but hypothetically it could happen!). Maxwell could easily make this happen. Now, that can't overcome GCN-specific enhancements, but it could improve performance in all of the open-standard APIs that AMD might suggest developers use.

Do you guys think that GCN may gain performance relative to Kepler as time goes on? Do you think future games will use GPGPU computing/it will be an integral part of the game engine in the future? Do you think Maxwell will take the compute crown back for Nvidia?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
No, because developers who bring games to PC know that there are millions of Nvidia cards out there. Some reports put it at 60% Nvidia discrete GPU marketshare.

Plus, looking at games like Dirt Showdown, the graphic effects (lighting) that use the compute performance look identical to Dirt 3 (i.e. not better looking) while running 50% slower on a GTX 670. If this continues to happen, people will avoid buying the games at full price because the developers knowingly cripple performance for no purpose.

That's just my view of it. It's nice for consoles but any optimizations that are had will translate right over easily to DirectX 11 hardware as a whole. This doesn't even take into account that most users are on hardware too old to take advantage of it to begin with.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
No, because developers who bring games to PC know that there are millions of Nvidia cards out there. Some reports put it at 60% Nvidia discrete GPU marketshare.

Nvidia definitely has the PC marketshare, but if designing for the PS4 means you "need" to use compute in order to get close to the full horsepower out of the console, and you can transition those changes over easily to PC and XB1, why not? It's not going to tank Nvidia performance or anything, it is just beneficial to AMD. Plus, I never remember hearing "Damn, even though that game is playable on my card, I'm not going to buy it because AMD gets more FPS!" about any recent game.

Plus, looking at games like Dirt Showdown, the graphic effects (lighting) that use the compute performance look identical to Dirt 3 (i.e. not better looking) while running 50% slower on a GTX 670. If this continues to happen, people will avoid buying the games at full price because the developers knowingly cripple performance for no purpose.

But using compute, even on a "slow" architecture like Kepler, should be giving performance gains. Is there an article that explains how Showdown is 50% slower? I can only see how GPGPU can add performance.

That's just my view of it. It's nice for consoles but any optimizations that are had will translate right over easily to DirectX 11 hardware as a whole. This doesn't even take into account that most users are on hardware too old to take advantage of it to begin with.

Yes, the fact that consoles are all x86 and DX11 definitely is a stronger benefit than compute performance. Well, most users are on old hardware, but they're also playing old games. Crysis 4 isn't made with Pentium 4 users in mind!
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Perhaps if Havok used it to add hardware physics that worked for both AMD and Nvidia, or Nvidia decided to work with AMD to support PhysX cross-GPU.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Nvidia definitely has the PC marketshare, but if designing for the PS4 means you "need" to use compute in order to get close to the full horsepower out of the console, and you can transition those changes over easily to PC and XB1, why not? It's not going to tank Nvidia performance or anything, it is just beneficial to AMD. Plus, I never remember hearing "Damn, even though that game is playable on my card, I'm not going to buy it because AMD gets more FPS!" about any recent game.

No but I have seen people refuse to buy games with artificially crippled performance until it's on sale.

But using compute, even on a "slow" architecture like Kepler, should be giving performance gains. Is there an article that explains how Showdown is 50% slower? I can only see how GPGPU can add performance.

You can see the comparisons here. Notice how the GTX 680 at 1080p gets 126fps with ultra quality, but when you turn on advanced lighting it's nearly a 50% performance drop?

http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/6

Then observe the differences in the screenshots here. Personally I feel it looks worse with the advanced lighting option enabled. It's overbrightening the image and actually degrades the quality of the picture for my tastes. This is rather subjective, but I would not classify this as a graphics upgrade.

http://www.rage3d.com/articles/gaming/codemaster_dirt_showdown_tech_review/index.php?p=2


Yes, the fact that consoles are all x86 and DX11 definitely is a stronger benefit than compute performance. Well, most users are on old hardware, but they're also playing old games. Crysis 4 isn't made with Pentium 4 users in mind!

I meant people running GTX 400/500 series cards at 1080p, and HD 6000 series cards also at 1080p, who have no itch to upgrade since they can play Crysis 3 etc. They may not have enough power to utilize advanced compute algorithms in game engines. I don't think it's been sufficiently tested, though.

I guess in reality it remains to be seen if this plays a factor in the future at all. We shall see.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
No but I have seen people refuse to buy games with artificially crippled performance until it's on sale.

You can see the comparisons here. Notice how the GTX 680 at 1080p gets 126fps with ultra quality, but when you turn on advanced lighting it's nearly a 50% performance drop?

http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/6

Then observe the differences in the screenshots here. Personally I feel it looks worse with the advanced lighting option enabled. It's overbrightening the image and actually degrades the quality of the picture for my tastes. This is rather subjective, but I would not classify this as a graphics upgrade.

http://www.rage3d.com/articles/gaming/codemaster_dirt_showdown_tech_review/index.php?p=2

Nothing is artificially crippled with compute; performance is enhanced on both vendors, compute-strong cards just get enhanced more.

Oh, you mean the performance hit! Well, that screenshot comparison really doesn't seem quite right (I'm certain the entire game doesn't look +50000 brightness like that...) but the performance hit is probably because it is "better"/harder lighting, not because it is using compute instead.


I meant people running GTX 400/500 series cards at 1080p, and HD 6000 series cards also at 1080p, who have no itch to upgrade since they can play Crysis 3 etc. They may not have enough power to utilize advanced compute algorithms in game engines. I don't think it's been sufficiently tested, though.

I guess in reality it remains to be seen if this plays a factor in the future at all. We shall see.

But users with GTX400/500 do have more serious compute capability (GTX580 is superior to GTX680 in many aspects). Using compute wouldn't be a detriment to those architectures.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Nothing is artificially crippled with compute; performance is enhanced on both vendors, compute-strong cards just get enhanced more.

Oh, you mean the performance hit! Well, that screenshot comparison really doesn't seem quite right (I'm certain the entire game doesn't look +50000 brightness like that...) but the performance hit is probably because it is "better"/harder lighting, not because it is using compute instead.


But users with GTX400/500 do have more serious compute capability (GTX580 is superior to GTX680 in many aspects). Using compute wouldn't be a detriment to those architectures.

It is using direct compute for the lighting. Even the GTX 580 takes a big hit on performance and is slower than the 680 both times anyway. Since the 680 has higher raw performance, it stays ahead. This is the most glaring example out there to my knowledge.

From the article I linked: "one of the first games to implement a DirectCompute based forward-rendering compatible lighting system"
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Plus, looking at games like Dirt Showdown, the graphic effects (lighting) that use the compute performance look identical to Dirt 3 (i.e. not better looking) while running 50% slower on a GTX 670. If this continues to happen, people will avoid buying the games at full price because the developers knowingly cripple performance for no purpose.

This is the type of response I was referring to when I posted this.

Faster overall compute performance? = Doesn't matter. It's nothing you will ever use and it's only used in games by AMD to make nVidia look bad. You don't need to worry about that.

They are just unfair cheating Bast****. Don't fall for their tricks.

People still bought Crysis 2 with its overdone tessellation. In a way, though, it was a good thing. It forced AMD to improve their game and release better-rounded components to compete. If not, nVidia could have just kept driving home their advantage. nVidia will have to do the same thing. They won't be able to continue to cripple certain performance aspects to charge extra for them on another product. That's called progress and a win for us consumers.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The 680 doesn't have good compute, but the Titan has monstrous compute performance, as does the 780. The 580 also has significant compute performance. There are Nvidia cards to cater to those with compute needs and those without. AMD is pushing compute because it's one way to push past its competitor, but having worked with the compute APIs for the last few years, I can say they're pretty painful to use. It's not easy to get at the performance on offer, it requires a complete rewrite of the code to put it onto the GPU, and it regularly doesn't perform anywhere near as well as I would like. In practice it's like the PS3's Cell: a lovely idea theoretically, but practically very hard to use well.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
This is the type of response I was referring to when I posted this.



People still bought Crysis 2 with its overdone tessellation. In a way, though, it was a good thing. It forced AMD to improve their game and release better-rounded components to compete. If not, nVidia could have just kept driving home their advantage. nVidia will have to do the same thing. They won't be able to continue to cripple certain performance aspects to charge extra for them on another product. That's called progress and a win for us consumers.

Maybe you have something relevant to add to the conversation instead of troll bait?

I'm providing links to screenshots showing the feature doesn't make the game look better - it actually looks worse, with a 50% drop in performance. That's not welcome by any means, and if what BrightCandle says is true, it means a lot of developers won't push it since it would be both a PITA and slower than desired.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
It is using direct compute for the lighting. Even the GTX 580 takes a big hit on performance and is slower than the 680 both times anyway. Since the 680 has higher raw performance, it stays ahead. This is the most glaring example out there to my knowledge.

from the article I linked

Oh, there's no way the 680 is going to somehow fall behind; it just has to do with relative performance (i.e. the 680 loses 50%, the 580 loses only 40%). And based on the article, the standard setting (advanced lighting off) uses DirectCompute just as much as the advanced lighting setting (both use Leo principles?).

The 680 doesn't have good compute, but the Titan has monstrous compute performance, as does the 780. The 580 also has significant compute performance. There are Nvidia cards to cater to those with compute needs and those without. AMD is pushing compute because it's one way to push past its competitor, but having worked with the compute APIs for the last few years, I can say they're pretty painful to use. It's not easy to get at the performance on offer, it requires a complete rewrite of the code to put it onto the GPU, and it regularly doesn't perform anywhere near as well as I would like. In practice it's like the PS3's Cell: a lovely idea theoretically, but practically very hard to use well.

The Titan actually loses to Tahiti in several compute tests/benchmarks. Nvidia isn't inherently "better" just because it's the same chip as a Tesla card - maybe Cray would buy Tahiti cards if AMD had their own Tesla line. The idea that Titan is better in compute performance at everything is a bit of a myth.

I definitely agree about coding for GPGPU. It's much different, and the APIs aren't at the point where they are super easy to use.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Oh, there's no way the 680 is going to somehow fall behind; it just has to do with relative performance (i.e. the 680 loses 50%, the 580 loses only 40%). And based on the article, the standard setting (advanced lighting off) uses DirectCompute just as much as the advanced lighting setting (both use Leo principles?).

I've tried searching for that but nobody seems to say how much direct compute power is used or is able to put a number on it. Both do use direct compute, but advanced lighting is definitely using more compute power. This is evidenced by the performance results. I've read that other small things use some compute power, but it doesn't affect performance adversely.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Maybe you have something relevant to add to the conversation instead of troll bait?

I'm providing links to screenshots showing the feature doesn't make the game look better - it actually looks worse, with a 50% drop in performance. That's not welcome by any means, and if what BrightCandle says is true, it means a lot of developers won't push it since it would be both a PITA and slower than desired.

It's slower because it's overdone.

It looks worse because it's overdone.

That doesn't mean that if they did the same lighting effects via compute as without it, it wouldn't run better. We really don't know. Using compute to do lighting doesn't automatically make it obnoxiously overdone, either.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
I've tried searching for that but nobody seems to say how much direct compute power is used or is able to put a number on it. Both do use direct compute, but advanced lighting is definitely using more compute power. This is evidenced by the performance results. I've read that other small things use some compute power, but it doesn't affect performance adversely.

The PR blurb definitely said there were GCN-specific enhancements, but it's not clear if that means there are engine tweaks, driver tweaks, or game design choices for GCN. It could be like Bitcoin mining, where some important functions are much faster on GCN, or it could just be very well-done driver tweaks that give them big performance gains.

Using compute should never be inherently worse. By standard definition, it's taking work that would have been done (slowly) by the CPU and doing it with a massively parallel GPU. If they are using it for something that the GPU normally does with the standard render engine, then yes they could lose performance for sure... but as you say, details are super sparse about how they are doing it.
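Just to make that concrete, here's a toy CUDA-style sketch (CUDA only because it's easy to read; on the console/DX side this would be DirectCompute or OpenCL, and none of the names come from any real engine) of the same per-particle update written as a plain CPU loop and as a GPU kernel:

    #include <cuda_runtime.h>
    #include <vector>

    struct Particle { float x, y, z, vx, vy, vz; };

    // CPU version: one core walks every particle, one at a time.
    void step_cpu(Particle* p, int n, float dt) {
        for (int i = 0; i < n; ++i) {
            p[i].x += p[i].vx * dt;
            p[i].y += p[i].vy * dt;
            p[i].z += p[i].vz * dt;
        }
    }

    // GPU version: one thread per particle, tens of thousands in flight at once.
    __global__ void step_gpu(Particle* p, int n, float dt) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) {
            p[i].x += p[i].vx * dt;
            p[i].y += p[i].vy * dt;
            p[i].z += p[i].vz * dt;
        }
    }

    int main() {
        const int n = 1 << 16;
        std::vector<Particle> host(n, Particle{0, 0, 0, 1.f, 1.f, 1.f});

        Particle* dev = nullptr;
        cudaMalloc(&dev, n * sizeof(Particle));
        cudaMemcpy(dev, host.data(), n * sizeof(Particle), cudaMemcpyHostToDevice);

        step_gpu<<<(n + 255) / 256, 256>>>(dev, n, 0.016f);  // 256 threads per block

        cudaMemcpy(host.data(), dev, n * sizeof(Particle), cudaMemcpyDeviceToHost);
        cudaFree(dev);
        return 0;
    }

The catch (and I think this is BrightCandle's point) is that real game code is rarely a clean loop like this - the data has to be restructured so thousands of threads can work independently, and that's where the "complete rewrite" pain comes from.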
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
It's slower because it's overdone.

It looks worse because it's overdone.

That doesn't mean that if they had the same lighting effects done via compute as not, it wouldn't run better. We really don't know. Using compute to do lighting doesn't automatically make it obnoxiously overdone, either.

No, but this is the best example of where I feel it's used to the detriment of the game. I'm unaware of any other severe issues. As I said, if it gives no visual upgrade and tanks performance, I'm not sold. I don't think it's a bad idea...just that nobody has proven it to be the future yet. So I can't say I'm exactly thrilled with it at this point in time.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Maybe you have something relevant to add to the conversation instead of troll bait?

I'm providing links to screenshots showing the feature doesn't make the game look better - it actually looks worse, with a 50% drop in performance. That's not welcome by any means, and if what BrightCandle says is true, it means a lot of developers won't push it since it would be both a PITA and slower than desired.

I did respond to what you posted. Sorry if you proved my point about how some people combat AMD's strengths by simply dismissing them. As far as your screenshots go, they don't prove anything, because they don't address the tech. The advanced lighting used in Showdown allows for "real" dynamic lighting with multiple sources. Without it we are relegated to light maps (glow maps) without any realistic lighting from a large number of sources. That's what it adds. You can like or dislike Codemasters' artistic approach, but that doesn't really matter. It's a definite advancement in real-time lighting.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Nope...didn't dismiss anything. I do know what it does, and no it doesn't look better.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's slower because it's overdone.

It looks worse because it's overdone.

That doesn't mean that if they did the same lighting effects via compute as without it, it wouldn't run better. We really don't know. Using compute to do lighting doesn't automatically make it obnoxiously overdone, either.

It allows for a large number of light sources (hundreds, in theory) to be rendered in real time. Without it we are limited to just a few (most times one) dynamic light source before the performance hit makes it too slow.
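For anyone wondering what that looks like under the hood: the usual approach (tiled/"forward+" lighting, which is how Showdown's DirectCompute lighting is generally described) runs a compute pass that bins lights per screen tile, so each pixel only shades against the handful of lights that can actually reach it. Here's a rough sketch of the idea - written CUDA-style just for readability, names made up, nothing to do with Codemasters' actual code, and a real engine would cull against per-tile depth-bounded frusta in view space rather than this flat 2D test:

    #include <cuda_runtime.h>

    #define TILE_PX 16
    #define MAX_LIGHTS_PER_TILE 64

    // Point lights already projected to screen space (position + radius in pixels).
    struct ScreenLight { float sx, sy, radius; };

    // One thread block per 16x16 screen tile; the threads cooperatively test every
    // light against the tile and build a short per-tile light list.
    __global__ void cull_lights(const ScreenLight* lights, int num_lights,
                                int* tile_counts, int* tile_lists, int tiles_x) {
        int tile_id = blockIdx.y * tiles_x + blockIdx.x;
        float min_x = blockIdx.x * TILE_PX, min_y = blockIdx.y * TILE_PX;
        float max_x = min_x + TILE_PX,      max_y = min_y + TILE_PX;

        __shared__ int count;
        if (threadIdx.x == 0) count = 0;
        __syncthreads();

        // Each thread strides over the global light list.
        for (int i = threadIdx.x; i < num_lights; i += blockDim.x) {
            // Crude circle-vs-rectangle overlap test in screen space.
            float cx = fminf(fmaxf(lights[i].sx, min_x), max_x);
            float cy = fminf(fmaxf(lights[i].sy, min_y), max_y);
            float dx = lights[i].sx - cx, dy = lights[i].sy - cy;
            if (dx * dx + dy * dy <= lights[i].radius * lights[i].radius) {
                int slot = atomicAdd(&count, 1);
                if (slot < MAX_LIGHTS_PER_TILE)
                    tile_lists[tile_id * MAX_LIGHTS_PER_TILE + slot] = i;
            }
        }
        __syncthreads();

        if (threadIdx.x == 0)
            tile_counts[tile_id] = min(count, MAX_LIGHTS_PER_TILE);
    }
    // Launch: cull_lights<<<dim3(tiles_x, tiles_y), 256>>>(...);

The shading pass then loops over only its tile's short list instead of every light in the scene, which is what lets hundreds of dynamic lights stay affordable instead of the cost scaling with (every pixel) x (every light).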
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Nope...didn't dismiss anything. I do know what it does, and no it doesn't look better.

That's your subjective opinion. You are entitled to it, but it doesn't address anything that is being done.

Of course it makes it look better by making it look more realistic. A lightmap can't be compared to a real dynamic light source. Trying to say it's not an improvement is just denial. All you are addressing is the artistic implementation and using that to dismiss the tech.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Look at what UE4 can do; it uses direct compute for some lighting, physics, and particles. It doesn't take huge hits from using it either. Thus far their tech demos were running on a GTX 680. So it can be used well, but it does require a good developer putting it in the right areas.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Look at what UE4 can do; it uses direct compute for some lighting, physics, and particles. It doesn't take huge hits from using it either. Thus far their tech demos were running on a GTX 680. So it can be used well, but it does require a good developer putting it in the right areas.

Showdown's advanced lighting runs on nVidia as well. AMD (so far) doesn't play those games with feature lockouts.

So now it's a poor developer? Codemasters is a top dev as well. They are generally considered the best out there at racing games.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
Look at what UE4 can do; it uses direct compute for some lighting, physics, and particles. It doesn't take huge hits from using it either. Thus far their tech demos were running on a GTX 680. So it can be used well, but it does require a good developer putting it in the right areas.

Well that's the nature of a true dynamic engine vs. a static implementation - of course, the dynamic model doesn't really get the best showing in a racing game... tracks don't really move a lot ^_^

Showdown's advanced lighting runs on nVidia as well. AMD (so far) doesn't play those games with feature lockouts.

So now it's a poor developer? Codemasters is a top dev as well. They are generally considered the best out there at racing games.

If AMD ever starts the feature-lockout game, I'll be one unhappy consumer (lockouts on both sides... lame).
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Showdown's advanced lighting runs on nVidia as well. AMD (so far) doesn't play those games with feature lockouts.

So now it's a poor developer? Codemasters is a top dev as well. They are generally considered the best out there at racing games.

Again with the troll baiting? Come on, man... feature lockout discussion again? Is that all you think about?

I'd argue that there are better developers when it comes to racing games, but whatever. The point is their usage of direct compute was unrefined. I think Epic has a bit more respect than Codemasters when it comes to building a graphics engine as well.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
The point is their usage of direct compute was unrefined. I think Epic has a bit more respect than Codemasters when it comes to building a graphics engine as well.

It's not clear at all whether the performance hit is because their engine is coded poorly, or if it is because their engine is doing a ton more work than "lazy" lighting. It's like the difference between rendering a 3d model and displaying a .jpg of one...
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Either way, the performance sucks and that's just not good. Not so subtle, but it's the way I see it.

You can render a million things at once and do all this fancy lighting, but if it hinders performance to the point it's unplayable (in some cases this did), then it seems like a waste. At this time, with current hardware in the market today... I don't see this becoming the norm at all. Maybe we will learn more about how UE4 utilizes direct compute and get a better idea of how we might begin to see it used going forward.
 