[Tom's] DICE: CPU & GPU Integration = More Clever Techniques

NTMBK

Lifer
Nov 14, 2011
10,237
5,020
136
Paraphrased a bit for the thread title, because it was long enough as it is :) Full quote:

Johan: [...] We want to enable the GPU to execute in a little bit more of a heterogeneous fashion (I mentioned a bit about this during my talk last week about Mantle): being able to run multiple compute shaders in parallel with your graphics work, and ideally having more collaboration between the CPU and GPU. We can do things like that on the console because they're integrated machines, so the CPU and GPU are on the same die. On the PC you are seeing it more and more with the APUs and Intel's Ultrabooks that also have integrated graphics.

I want to see more of this type of collaboration between CPU and GPU to drive many more advanced rendering techniques. For example, once we've rendered the Z-buffer for a scene we know the depth of every single pixel in our frustum, and based on that information we can do things like shadow maps that are adapted to cover only the area they actually need to. Typically you don't really have that knowledge: on the CPU you prepare the data that the GPU will render a few frames later, so you have to brute-force a lot of things. You have to send out a lot of work and you can't really be reactive. With many of the things that we can do with Mantle, and I think going forward also with closer CPU and GPU interaction in general, we can do a lot more clever techniques and fewer brute-force techniques as well. That's a pretty frequent topic when we talk with architects.

Taken from http://www.tomshardware.co.uk/johan-andersson-battlefield-4-interview,review-32839.html
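To make Johan's shadow-map example concrete, here's a minimal sketch of the idea in CUDA (hypothetical code, nothing to do with Frostbite; every name in it is made up). It's the same basic idea as Intel's Sample Distribution Shadow Maps work: reduce the Z-buffer to a min/max depth range, then fit the shadow projection to that range instead of brute-forcing the whole frustum.

Code:

// Hypothetical sketch of depth-driven shadow fitting. NOT Frostbite code.
#include <cstdio>
#include <cstring>
#include <cuda_runtime.h>

// Pass 1: reduce the Z-buffer to a [minZ, maxZ] interval. Depths lie in
// [0,1], so their raw bit patterns order the same way as the floats do,
// and plain unsigned-integer atomics work for the reduction.
__global__ void depthMinMax(const float* zbuf, int n,
                            unsigned int* minBits, unsigned int* maxBits)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    unsigned int bits = __float_as_uint(zbuf[i]);
    atomicMin(minBits, bits);
    atomicMax(maxBits, bits);
}

int main()
{
    const int n = 1920 * 1080;
    float* zbuf;
    unsigned int* bounds;                  // bounds[0] = min, bounds[1] = max
    cudaMallocManaged(&zbuf, n * sizeof(float));
    cudaMallocManaged(&bounds, 2 * sizeof(unsigned int));

    // Fake Z-buffer: the visible scene only occupies depths 0.3..0.7.
    for (int i = 0; i < n; ++i)
        zbuf[i] = 0.3f + 0.4f * (i % 1000) / 1000.0f;
    bounds[0] = 0x3F800000u;               // bit pattern of 1.0f
    bounds[1] = 0x00000000u;               // bit pattern of 0.0f

    depthMinMax<<<(n + 255) / 256, 256>>>(zbuf, n, bounds, bounds + 1);
    cudaDeviceSynchronize();

    float minZ, maxZ;
    memcpy(&minZ, &bounds[0], sizeof(float));
    memcpy(&maxZ, &bounds[1], sizeof(float));

    // Pass 2 (not shown): build the shadow map's light-space projection
    // from [minZ, maxZ] so every texel covers geometry that can actually
    // receive a shadow, instead of covering the whole frustum.
    printf("fit shadow cascades to depth range [%.3f, %.3f]\n", minZ, maxZ);
    cudaFree(zbuf);
    cudaFree(bounds);
    return 0;
}

On an integrated chip the CPU (or the next GPU pass) can consume that 8-byte result immediately; on a discrete card the readback has to cross PCIe, which is exactly the round trip Johan wants to avoid.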
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
This sort of thing will become more widespread as the new consoles age and developers start developing specifically for them, rather than developing for current-gen consoles and extending from there.

It's certainly going to change a lot of things under the hood, but we were already moving that way with the PC anyway, so it just means it will actually get used now, rather than remaining a potential feature that never gets much done with it.

Probably going to benefit AMD first though, since it's their heterogeneous architecture in the consoles, so it will probably be easier to port to PC.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
Maybe the discrete GPU will die because of this?

I mean, years ago, discrete audio was huge. Now 1% of gamers have discrete sound cards.
 

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
This is a Mantle issue, because Mantle is the only current solution on the PC side that can do what Johan wishes. And he knows it. It's not doable with DX.

The Mantle thread describes how Mantle does it.

(edit: it's only with Mantle's execution model that the developer can decide for certain where the feature should go)
 

Noctifer616

Senior member
Nov 5, 2013
380
0
76
Maybe the discrete GPU will die because of this?

I mean, years ago, discrete audio was huge. Now 1% of gamers have discrete sound cards.

As long as graphics keep improving, and as long as there are gamers who care about a lot of eye candy, we won't see the death of discrete graphics.

4K gaming hasn't become a standard for most gamers yet, but one day it probably will. Gaming at 60 FPS @ 4K takes a ton of power for games like BF4; now imagine when gamers demand 120 FPS @ 4K. And we might not even stop at 4K: DisplayPort is working on 8K resolution support.

Also, modern games could push graphics even further. What if in 10 years a game like BF has maps that span 100 square kilometers, with thousands of players on a single map?

As long as game developers push the scale of worlds and graphical detail in their games, there will be a need for high-performance discrete graphics.
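For a rough sense of scale, the raw pixel rates work out like this (just back-of-the-envelope arithmetic in plain code; per-pixel shading cost also rises over time, so the real GPU-power gap is even larger than these ratios suggest):

Code:

// Raw pixels per second at the resolutions and frame rates above.
// 8K @ 120 pushes exactly 32x the pixels of 1080p @ 60.
#include <cstdio>

int main()
{
    struct Mode { const char* name; double w, h, fps; };
    const Mode modes[] = {
        {"1080p @  60", 1920, 1080,  60},
        {"4K    @  60", 3840, 2160,  60},
        {"4K    @ 120", 3840, 2160, 120},
        {"8K    @ 120", 7680, 4320, 120},
    };
    for (const Mode& m : modes)
        printf("%s : %5.2f Gpix/s\n", m.name, m.w * m.h * m.fps / 1e9);
    return 0;
}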
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Please focus on the OP and do not derail this into a separate parallel Mantle discussion -- keep Mantle in the Mantle thread.
-- stahlhart
 

NTMBK

Lifer
Nov 14, 2011
10,237
5,020
136
As long as graphics keep improving, and as long as there are gamers who care about a lot of eye candy, we won't see the death of discrete graphics.

4K gaming hasn't become a standard for most gamers yet, but one day it probably will. Gaming at 60 FPS @ 4K takes a ton of power for games like BF4; now imagine when gamers demand 120 FPS @ 4K. And we might not even stop at 4K: DisplayPort is working on 8K resolution support.

Also, modern games could push graphics even further. What if in 10 years a game like BF has maps that span 100 square kilometers, with thousands of players on a single map?

As long as game developers push the scale of worlds and graphical detail in their games, there will be a need for high-performance discrete graphics.

We won't see the end of high-performance graphics, I agree. But why does that need to be discrete? Putting the graphics processor at the other end of a PCIe bus just adds one more bottleneck and round of latency. It kills the possibilities of the kind of fine-grained parallelism that Johan is talking about in the interview. There are more efficient (i.e. better-performing) algorithms which you cannot feasibly use in a discrete CPU + GPU system. There was an AMD talk from about a year ago covering the same sort of things, detailing some specific algorithms which only work in an APU setup; I'll see if I can dig it up. But it's good to hear a real-world developer talking about it too. I'm excited to see what tricks can be squeezed out of the new consoles.
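As a toy illustration of that round-trip cost, here's a hypothetical CUDA microbenchmark (made-up names, not from any real codebase): the CPU must read back a tiny GPU result before it can decide the next step, which is exactly the fine-grained pattern a discrete card punishes.

Code:

// Illustrative CPU<->GPU ping-pong: each iteration the CPU waits on a
// 4-byte result before issuing the next unit of work.
#include <chrono>
#include <cstdio>
#include <cuda_runtime.h>

__global__ void step(int* x) { ++*x; }   // stands in for one GPU decision

int main()
{
    int* dx;
    cudaMalloc(&dx, sizeof(int));
    cudaMemset(dx, 0, sizeof(int));
    int hx = 0;
    const int iters = 1000;

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i) {
        step<<<1, 1>>>(dx);
        // On a discrete card this tiny read crosses PCIe every time.
        cudaMemcpy(&hx, dx, sizeof(int), cudaMemcpyDeviceToHost);
    }
    auto t1 = std::chrono::steady_clock::now();
    double us = std::chrono::duration<double, std::micro>(t1 - t0).count();
    printf("%d round trips, %.1f us each on average\n", hx, us / iters);
    cudaFree(dx);
    return 0;
}

Each iteration pays kernel-launch plus PCIe round-trip latency, easily tens of microseconds, for four bytes of actual data; on an integrated part with shared memory the same handshake can be orders of magnitude cheaper.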
 

NIGELG

Senior member
Nov 4, 2009
851
31
91
Maybe the discrete GPU will die because of this?

I mean, years ago, discrete audio was huge. Now 1% of gamers have discrete sound cards.
Integration can be good.

I am glad I am in the so-called 1% of gamers that have a discrete audio card. It's like night and day in terms of sound quality.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
We won't see the end of high-performance graphics, I agree. But why does that need to be discrete? Putting the graphics processor at the other end of a PCIe bus just adds one more bottleneck and round of latency. It kills the possibilities of the kind of fine-grained parallelism that Johan is talking about in the interview. There are more efficient (i.e. better-performing) algorithms which you cannot feasibly use in a discrete CPU + GPU system. There was an AMD talk from about a year ago covering the same sort of things, detailing some specific algorithms which only work in an APU setup; I'll see if I can dig it up. But it's good to hear a real-world developer talking about it too. I'm excited to see what tricks can be squeezed out of the new consoles.

Exactly. AMD and Intel are both putting faster and faster graphics on their CPUs. In a few years we will most likely have mid-range graphics power built onto the CPU; only higher-end GPUs will be discrete.
 

DownTheSky

Senior member
Apr 7, 2013
787
156
106
Exactly. AMD and Intel are both putting faster and faster graphics on their CPUs. In a few years we will most likely have mid-range graphics power built onto the CPU; only higher-end GPUs will be discrete.

That's only feasible if they're willing to go past 400 mm² on their APUs.
 

NTMBK

Lifer
Nov 14, 2011
10,237
5,020
136
Shouldn't this thread be merged with the existing Mantle thread?

This has nothing to do with Mantle. Clever algorithmic tricks can be performed using any API: DirectX, OpenGL, OpenCL, Mantle, the Xbox One API... anything running on an APU, whether that is an AMD or an Intel system. (Notice that Johan specifically calls out Intel's Ultrabooks!)

I dug out that presentation I was thinking of, by the way: http://www.slideshare.net/zlatan4177/gpgpu-algorithms-in-games It has quite a few examples of tasks which would not be feasible on a GPU without unified memory, and which can now be accelerated thanks to a unified architecture. Until the next-gen consoles came along this was all mostly speculative, as APU market share was too small for developers to target; but if they are developing these new approaches for the consoles, you can bet that PC ports will see similar improvements on APUs.
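In the same spirit as those slides, here's a hedged sketch (CUDA managed memory standing in for the consoles' genuinely shared memory; all names made up): the CPU builds an ordinary pointer-chasing structure and the GPU walks it in place, with no flatten-and-copy step.

Code:

// CPU builds a linked list; the GPU traverses it directly. Managed
// memory gives both processors one address space for this demo.
#include <cstdio>
#include <cuda_runtime.h>

struct Node { int value; Node* next; };

// A single GPU thread follows the CPU-written pointers as-is.
__global__ void sumList(const Node* head, int* out)
{
    int total = 0;
    for (const Node* n = head; n != nullptr; n = n->next)
        total += n->value;
    *out = total;
}

int main()
{
    const int count = 16;
    Node* nodes;
    int* out;
    cudaMallocManaged(&nodes, count * sizeof(Node));
    cudaMallocManaged(&out, sizeof(int));

    // CPU builds the list with ordinary pointers...
    for (int i = 0; i < count; ++i) {
        nodes[i].value = i;
        nodes[i].next = (i + 1 < count) ? &nodes[i + 1] : nullptr;
    }

    // ...and the GPU consumes it in place. Without a unified address
    // space the structure would have to be flattened and copied first.
    sumList<<<1, 1>>>(nodes, out);
    cudaDeviceSynchronize();
    printf("sum = %d\n", *out);          // 0+1+...+15 = 120
    cudaFree(nodes);
    cudaFree(out);
    return 0;
}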
 
Aug 11, 2008
10,451
642
126
I don't see discrete graphics going away anytime soon. There are just too many hurdles for an APU: bandwidth, thermal issues and die size, to mention a few. Not to mention that if you want to upgrade an APU, you have to change out the CPU as well.

APUs may become more suitable for low/mid-range gaming, but for upper-mid-range and higher, I just don't see how an APU can do it.
 

NTMBK

Lifer
Nov 14, 2011
10,237
5,020
136
I don't see discrete graphics going away anytime soon. There are just too many hurdles for an APU: bandwidth, thermal issues and die size, to mention a few. Not to mention that if you want to upgrade an APU, you have to change out the CPU as well.

APUs may become more suitable for low/mid-range gaming, but for upper-mid-range and higher, I just don't see how an APU can do it.

Bandwidth is solved as a long-term issue: stacked memory dies, HBM, HMC, on-package RAM... it's not going to be a problem. Obviously it will still be one for some time to come, but I can't see it remaining an issue in ~5 years.

Die size? I really don't see that being an issue either. The consoles have demonstrated that you can make a pretty damn big APU just fine; their dies are almost as large as Tahiti.

Cooling is still the big issue though, yes. It's a shame that BTX never caught on: that layout, with a single massive front-mounted 120mm fan blowing air directly through the CPU heatsink, would have been ideal for cooling a really big APU. But history is history, I guess, and ATX just isn't that well designed for keeping very big CPUs cool.

It's not impossible, though. Don't forget that AMD have released a 220W CPU which works on ATX motherboards using a standard, out-of-the-box AIO water loop. And desktop-sized workstations do just fine with hot CPUs: the Dell T7600 will happily keep a pair of 130W processors running cool, and relatively quietly, certainly much quieter than your average graphics card cooler. It is still the single biggest issue though, I agree. A new motherboard layout similar to BTX would be my preference, but I guess that would probably have as much success as the original BTX did. :)

But the next-gen consoles don't have especially exotic cooling systems. The PS4 has a single mid-sized blower fan, similar to Intel's Thin Mini-ITX platform:

[Image: PS4 internals, showing its single blower fan]


And the Xbox One has an even more conventional cooler: a single big, top-down fan.

[Image: Xbox One's large top-down cooler]


Big, but not a million miles from e.g. a Scythe Big Shuriken 2. You could still fit something of a similar size into a mini-ITX build just fine, never mind an ATX-sized case, and there would still be plenty of room to strap an even larger cooler in there if you needed to cool an even more powerful APU.

I just feel that the technical benefits of going to a combined system, as outlined by Johan in the interview, could well outweigh the technical challenges of putting the whole thing in one socket.
 
Aug 11, 2008
10,451
642
126
You could be right; I won't say it is impossible. OTOH, this article in TH was basically just a rehash of the AMD PR from APU13. It all sounds good in theory, but there are no hard numbers proving what the benefits will be.
 

NTMBK

Lifer
Nov 14, 2011
10,237
5,020
136
You could be right; I won't say it is impossible. OTOH, this article in TH was basically just a rehash of the AMD PR from APU13. It all sounds good in theory, but there are no hard numbers proving what the benefits will be.

Yeah, we need to wait and see what the benefits will turn out to be. I'm certainly not saying it definitely will happen, either! I'm just excited to see what cool new tricks can be pulled with this setup. :)