Linus Torvalds: Discrete GPUs are going away


naukkis

Golden Member
Jun 5, 2002
1,004
849
136
Linus' point is that there's so much to gain from a unified CPU-GPU address space that eventually there's no other way than to put them on a single chip. And as GPUs keep growing and CPUs keep shrinking, combining the two is simply inevitable.

Monster GPUs aren't there for gaming but for HPC work. Intel's Knights Landing is the first able to run an OS without a master CPU; Nvidia's Maxwell will be the second. AMD's solution is still unknown, but there's no other possibility left than putting a CPU into the HPC GPU.
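
To make the gain concrete, here's a rough sketch of the two models (CUDA-style and purely illustrative; the kernel and function names are mine, not anything Intel, Nvidia or AMD ship):

Code:
#include <cuda_runtime.h>

// Trivial kernel: scale every element of an array.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

// Discrete model: two address spaces, explicit copies in both directions.
void scale_discrete(float *host, int n) {
    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
}

// Unified model: one pointer visible to both CPU and GPU, no copies to manage.
void scale_unified(int n) {
    float *data;
    cudaMallocManaged(&data, n * sizeof(float));
    for (int i = 0; i < n; ++i) data[i] = (float)i;  // CPU writes it...
    scale<<<(n + 255) / 256, 256>>>(data, 2.0f, n);  // ...GPU works on the same pointer
    cudaDeviceSynchronize();                         // wait before the CPU reads results
    cudaFree(data);
}

Kill the copies and the bookkeeping around them and you can pass real pointer-based data structures back and forth, which is the kind of win Linus is pointing at.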
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
But it's still stupid to say that. We are more likely to see GPUs that have a small CPU attached to them, rather than CPUs with attached GPUs.
An irrelevant distinction. You might as well call most processors SRAMs with attached CPUs.
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
Intel's Knights Landing is the first able to run an OS without a master CPU; Nvidia's Maxwell will be the second. AMD's solution is still unknown, but there's no other possibility left than putting a CPU into the HPC GPU.

Actually, it sounds like NVidia is pushing it back to Pascal.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
There will eventually come a time when IGPs can provide photorealistic graphics at high performance. On that day, discrete GPUs die.
 

Mand

Senior member
Jan 13, 2014
664
0
0
There will eventually come a time when IGPs can provide photorealistic graphics at high performance. On that day, discrete GPUs die.

Unless they're doing other things.

Photorealism isn't some hard cutoff beyond which we don't need anything else.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
There will eventually come a time when IGPs can provide photorealistic graphics at high performance. On that day, discrete GPUs die.

Not if you hit limits in what the manufacturing technology can be extended to do first. Those limits may then be overcome with something else but who knows what type of timescale we're talking about here.

I expect the dGPU market to shrink, maybe a lot, possibly exiting entirely from laptops where there are bigger incentives to use IGPs. But I don't agree that it'll reach a point where it makes no sense financially any time soon, so long as the same basic GPU designs are being used in their IGP and HPC offerings.

The thing with IGPs is that the highest end needs exotic memory solutions like extra eDRAM or special memory channels. So we're already talking about a special product catering to a niche, not something you want to provide with every CPU you sell, because it's a big cost for something most people don't want. Even using the biggest GPUs possible probably doesn't make sense financially, and there's no indication that Intel is moving in this direction. You would have to reach a point where offering a truly great GPU only costs pennies, and I don't think we're anywhere close to that, definitely not within the next 10 years and not something attainable before node scaling as we know it dries up. So as long as the "good" IGPs are still a niche selection that you pay a premium for, there's a lot less incentive to use them over dGPUs on the desktop.

The biggest threat would be if both AMD and nVidia fell apart completely due to being unable to keep up with their competitors. If that happened I don't think Intel would be willing to fill the dGPU void, even though there'd be a demand for it.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It is inevitable that every component of the PC that lives separately from the CPU will eventually make its way into it, given more shrinkage in transistors. Today's supercomputer is tomorrow's desktop CPU, if they can keep the silicon process improvements coming.

Given enough time, the CPU side could be small enough that putting 90% of the performance of a GPU right next to the cores becomes feasible. It's going to be a while before it's practical, but it's inevitable that it will become possible and then a reality. All discrete parts will eventually disappear as the CPU puts more and more of its transistors toward activities other than CPU computation. A modern CPU has very little in the way of actual execution units for CPU tasks; it's mostly a big cache and a GPU nowadays. It's just not all that soon that this will happen; several silicon jumps are necessary before it even begins to be possible.
 

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
About the 4K discussion: I don't think it'll come that soon. We still have 1440p and 1600p to go up to from 1080p, and 4K costs a lot and needs a ton of hardware power.

I remember back in 2005 I had a 1400x900 monitor and the rich boys had the 1080p monitors, and my 720p 36" TV cost me $2K CDN. Lol, now I can get a 1080p 240Hz (of course we know that isn't real) 50" for $600.

Personally I'm into lower power consuming hardware these days, the GTX750Ti is a great start.
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
It just makes sense for the next generation of HPC servers to be big, fat, integrated processors. Look at Xeon Phi: a standalone parallel processor with directly integrated network fabric. It's all about keeping the processor fed efficiently. Nvidia wants the same thing (with ARM cores integrated into Tesla), and AMD can offer the same thing with their APU tech, as proven by the next-gen consoles. And HPC designs dictate the shape of our high-end GPUs. It's a shame that NVidia can't build x86 cores, though.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Maybe it will go the other way. Perhaps we will all be running dGPUs, with little CPUs inside. Isn't NVidia going that way in the future?

I guess I just don't see any way around the laws of physics. Given a power supply of 300W, and sufficient cooling, and a given process technology, I just don't see IGPs catching up to the performance of dGPUs. Ever.

But perhaps Linus knows some clever hacks around the laws of physics. I'm sure that he's smarter than I am...

Intel has an ever-increasing manufacturing lead. Density is very important for GPUs. TSMC's 16nm won't offer any improvement in that area. It won't be until ~2019 that dGPUs will have another real node after 20nm in 2015. In the meantime, Intel is rapidly increasing both density and power/performance. I'm not sure how a 7nm IGP with germanium in 2018 will have any difficulties competing against a 20nm chip with FF+.

There is also price. Nvidia is a fabless company, so TSMC gets the foundry tax. There doesn't seem to be any improvement in price/transistor at 20nm, 16nm, and probably also 10nm. So the price of GPUs will keep increasing.
Meanwhile, Intel, which doesn't have to pay a foundry tax, plans to scale density aggressively in the coming nodes. At 7nm, they'll get another boost in price/transistor from the 450mm wafers.

The cost/transistor deficit of Nvidia and AMD could be as much as 4-10x for the lower density (16FF+ is 6x less dense than 7nm, but if ARM's slide is true, price/transistor would be less than at 28nm, which is 13x less dense; divide by 1.5 to compensate for ~1.5x higher wafer costs than 20nm), 2x for the foundry tax, and 1.5x for the lack of 450mm wafers, for a grand total of at least a 12x higher price per transistor (16FF+ vs. 7nm in 2018).
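
Spelling the multiplication out (the 12x total takes the low end of the 4-10x density range; the 30x figure is just my own extension of the same arithmetic to the high end):

$$4 \times 2 \times 1.5 = 12 \qquad\qquad 10 \times 2 \times 1.5 = 30$$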

Sure, Intel won't release a competitor for a GTX Titan, but that isn't necessary to heavily reduce the dGPU market.
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
Makes sense to eventually integrate CPU and GPU, and who knows, along the way it might prove fruitful to add other types of execution units to the mix, along with a very sophisticated instruction unit to make efficient use of the various resources. You could have a couple of very fast, deep cores, a couple of slightly slower, shallower ones, and so on, out to the massively parallel GPGPU section.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Except that the performance of the equivalent mid-range dGPU always increases.

5 years from now, APUs and Intel CPUs will probably match GTX 760 or R280-class performance. Would that suddenly be powerful enough to obsolete dGPUs? Nope. 5 years from now, the mid-range will be much, much more powerful and 4K will be the norm, at which point a 760 or R280-class APU still won't be able to handle it.
5 years from now? It looks more like 1-2 years from now, with Skylake GT4 with 144 Gen9 EUs. Gen10 GT5, if Intel decides to make one, could be a serious competitor for dGPUs in 2016.

What about further out, 10 years from now? The same thing applies. 8K may start to come on board, or even VR. Your APU with its Titan Z-class performance wouldn't be enough for real PC gamers.

20 years from now... what's the difference exactly? You don't get that gamers demand more: game devs will keep on making more and more demanding games, display tech keeps pushing more pixels, etc. We'll get to a point where there are entire holodecks like on Star Trek. Guess what? A discrete GPU would still be faster.

The whole idea of the iGPU replacing the low end is true, sure, but the new dGPU low end is still faster than the iGPU. When the next-gen iGPU comes and matches the current low end, a new low end is created. Nothing has changed, only the price point for superior performance.

Which comes back to my point: as long as there are people who are willing to pay top money for the best performance, discrete GPUs will always hold a huge advantage and won't be obsolete.
Wrong, wrong, wrong. You can't just create a new low end when iGPUs start eating those GPUs' sales. The trend is that Intel is expanding their manufacturing lead, making it harder and harder for dGPUs to have higher performance than Intel's IGPs, because Intel's better, cheaper, and smaller transistors allow for more performance per mm². If you do the math, you'll see that it doesn't take a lot of die shrinks for a 300-550mm² GPU to be small enough to fit in a sub-300mm² processor.
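
A rough version of that math, under the idealized assumption that one full node shrink halves die area (real nodes don't always deliver that):

$$550\,\text{mm}^2 \times 0.5 \approx 275\,\text{mm}^2, \qquad 550\,\text{mm}^2 \times 0.5^2 \approx 138\,\text{mm}^2$$

So one ideal shrink already brings the biggest current dies under the 300mm² mark, and two shrinks leave room for CPU cores next to them.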

So if Nvidia wants to create a new low-end, they'll have to increase the die size (and thermals beyond 300W)... causing your new low-end to have mid-end prices. There ain't no such thing as a free lunch.

I wouldn't be too fast to make predictions 20 years into the future.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
I guess it'll be just like the sound card industry, where today it's impossible to find a discrete sound card.

Wait, it's not impossible to find discrete sound cards at all...
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I guess it'll be just like the sound card industry, where today it's impossible to find a discrete sound card.

Wait, it's not impossible to find discrete sound cards at all...

And they still perform better and provide an advantage in gaming due to their surround sound algorithms, but most people don't buy sound cards; they stick with the embedded Realtek chipset.

But to be fair to the sound card, it wasn't anything to do with cheap chips becoming available or good enough; it was because Microsoft cut off all support for hardware acceleration with the latest version of DirectX, leaving all the surround processing to its software and cutting Creative out. Which is why we don't now have bounced sound waves like we did in 2001, and we see no change. It's not the same cause.
 

mv2devnull

Golden Member
Apr 13, 2010
1,526
160
106
Linus says that the unified model is much nicer for the programmer than the discrete one.

How many times in history have superior products been beaten by something less nice?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
And they still perform better and provide an advantage in gaming due to their surround sound algorithms, but most people don't buy sound cards; they stick with the embedded Realtek chipset.

But to be fair to the sound card, it wasn't anything to do with cheap chips becoming available or good enough; it was because Microsoft cut off all support for hardware acceleration with the latest version of DirectX, leaving all the surround processing to its software and cutting Creative out. Which is why we don't now have bounced sound waves like we did in 2001, and we see no change. It's not the same cause.

I personally stopped getting discrete sound cards because the onboard audio was good enough. The last one I got didn't impress me compared to the onboard one I had, and I just never bothered with it again.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The current programming model for GPGPU sucks horribly. It's extremely problematic to code in a different language than the one you are writing your program in, and the variant of C that runs on the GPU isn't exactly fantastic. It's also a model of concurrency that isn't very easy to get at: you really need a problem that is suited to it, with high computation density on vectors and low branching, to make it effective. The combination of the two remains the reason why software for these computation units is slow coming out.

Until the way in which you program these things changes, we won't see a big shift to programming for them. Programmers will deal with the concurrency model and use it appropriately, but the language and API barrier that exists today needs to disappear. It's partly about having the same memory space, but more about compatibility at the instruction level. I firmly suspect that the Intel Phi will sweep this market eventually, because it's possible to run the same code on it as on the main CPU, and once it has unified memory it will likely win out. I can't see the current GPGPU programming style being very popular; I personally hate coding on it, as it's like going back 20 years in languages.
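
As a rough illustration of the "low branching" constraint (CUDA-style again, purely my own example): GPU threads run in lockstep groups, so a data-dependent branch makes the hardware walk both paths with lanes masked off, while the same work written as straight-line math plus a select keeps every lane busy.

Code:
#include <cuda_runtime.h>

// Data-dependent branch: lanes within a warp can take different paths,
// so both paths end up executing serially with some lanes masked off.
__global__ void branchy(float *x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (x[i] > 0.0f)
        x[i] = sqrtf(x[i]);
    else
        x[i] = -x[i] * x[i];
}

// The same result as straight-line math plus a select:
// every lane executes the same instructions and stays in lockstep.
__global__ void branchless(float *x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float v   = x[i];
    float pos = sqrtf(fmaxf(v, 0.0f));
    float neg = -v * v;
    x[i] = (v > 0.0f) ? pos : neg;
}

Problems that can be written like the second kernel fly; problems full of the first kind limp along, which is part of why so few workloads have made the jump.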
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
How much R&D spending does a soundcard require? I bet it's a heck of a lot less than a GPU... An industry with lower R&D costs can survive on a much smaller market.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
And they still perform better and provide an advantage in gaming due to their surround sound algorithms, but most people don't buy sound cards; they stick with the embedded Realtek chipset.

Yes, most people don't buy them. But some people do. Linus and others sound more like they're saying that dGPUs are going the way of external FPUs, to be wiped off the market completely.

And there's more of a fair chance for GPUs because

But to be fair to the sound card, it wasn't anything to do with cheap chips becoming available or good enough; it was because Microsoft cut off all support for hardware acceleration with the latest version of DirectX, leaving all the surround processing to its software and cutting Creative out. Which is why we don't now have bounced sound waves like we did in 2001, and we see no change. It's not the same cause.

I'm just trying to make the point that you're not going to meet everyone's needs with what's integrated into the CPU or motherboard chipset, and there is a real market for add-on cards that do other things, or do them better. The demand for dGPUs would have to drop to really close to nothing for them to be pushed out of the market entirely, so long as they offer an appreciable advantage at all.

People are comparing console chips or Xeon Phi to CPU IGPs, but it's really not the same scenario, because those have totally different footprints, power budgets, single-threaded performance, and so on versus mainstream CPUs.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I personally stopped getting discrete sound cards because the onboard audio was good enough. The last one I got didn't impress me compared to the onboard one I had, and I just never bothered with it again.

But a big part of that is that it is not really about sound quality, it's about features. If I had a sound card that didn't just do 5.1 but did full environmental ray-cast sound reflections, it would sound awesome and much more natural than today's cards. If the sound were absorbed and reflected differently based on materials, again it would enhance the realism. If I could map directly to headphones from the true positions of sounds, attenuate them correctly, or provide a different HRTF dependent on your ears and headphones, all of these things would contribute to wanting to get a sound card. But because all those options are cut off by the current API, it's just about the quality of the sound produced (more or less; SBX and CMSS still have value, just not as much as they could have), and there we have reached the limit of the human ear.

That doesn't mean we are done with sound innovation, though; I hear Dolby is working on 21+ speaker surround mapping to HRTFs for headphones, along with new middleware to utilise it. But DirectX sound needs to be circumvented to really get the innovation started again.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
How much R&D spending does a soundcard require? I bet it's a heck of a lot less than a GPU... An industry with lower R&D costs can survive on a much smaller market.

How much R&D does making a high-end GPU cost if you're already making a lower-end one to integrate with CPUs, and higher-end ones for HPC? That's the question that actually matters.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
But a big part of that is that it is not really about sound quality, it's about features. If I had a sound card that didn't just do 5.1 but did full environmental ray-cast sound reflections, it would sound awesome and much more natural than today's cards. If the sound were absorbed and reflected differently based on materials, again it would enhance the realism. If I could map directly to headphones from the true positions of sounds, attenuate them correctly, or provide a different HRTF dependent on your ears and headphones, all of these things would contribute to wanting to get a sound card. But because all those options are cut off by the current API, it's just about the quality of the sound produced (more or less; SBX and CMSS still have value, just not as much as they could have), and there we have reached the limit of the human ear.

That doesn't mean we are done with sound innovation, though; I hear Dolby is working on 21+ speaker surround mapping to HRTFs for headphones, along with new middleware to utilise it. But DirectX sound needs to be circumvented to really get the innovation started again.

So what you are saying is that when the sound got good enough, they stopped innovating to improve sound. That is what I'm afraid of, and expect, happening with GPUs. When we reach a point where the IGP is good enough, devs may stop innovating to push the dGPU. And of course, Linus is also talking about how a lot of innovation can happen as a result of the IGP's advantages, which we will likely be seeing a lot of with the new consoles.
 

infoiltrator

Senior member
Feb 9, 2011
704
0
0
OK, AMD is known for their APUs, and their desktop product line, FX, IS NOT.
OK, Intel is improving iGPUs, but trails AMD in performance.
GPUs and games are calling for more fast memory.

The negative: strong GPUs are pushing thermal, noise, and size limits.
I see this as a seesaw between APU and GPU.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
So what you are saying is that when the sound got good enough, they stopped innovating to improve sound. That is what I'm afraid of, and expect, happening with GPUs. When we reach a point where the IGP is good enough, devs may stop innovating to push the dGPU. And of course, Linus is also talking about how a lot of innovation can happen as a result of the IGP's advantages, which we will likely be seeing a lot of with the new consoles.

On the contrary, I am saying the sound industry was cut off artificially: Microsoft was taking a lot of complaints about operating system crashes, sound card drivers were a big part of them, and so they abused their position in the market to cut the innovation off. There are lots of things that can be improved in sound today, but it's been made enormously harder by Microsoft.