Linus Torvalds: Discrete GPUs are going away

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
Do you still believe that discrete GPU's have a future?

What do you base that ludicrous belief on? Drugs?

Because everything says that IGPs are getting to be "good enough" for a big enough swath of the market (and that very much includes most gamers - look at the game consoles, for chrissake! You are aware that modern game consoles are IGPs, right?) that the discrete GPU model isn't financially viable in the long run.

So your argument is exactly the wrong way around. It's not that IGPs can't have an adequate market size; it's the discrete GPUs that have market-size problems.

And the IGPs are very much moving in the direction of the GPU being more of a general accelerator (AMD calls the combination "APUs", obviously). And one of the big advantages of integration (apart from just the traditional advantages of fewer chips, etc.) is that it makes it much easier to share cache hierarchies and be much more tightly coupled at a software level too. Sharing the virtual address space between GPU and CPU threads means less need for copying, and cache coherency makes a lot of things easier and more likely to work well.

We've seen this before, outside of graphics. Sure, you can use MPI on a cluster and get great performance for some very specific loads. But ask yourself why everybody ends up wanting SMP in the end anyway. The cluster people were simply wrong when they tried to convince people that hardware cache coherency is too expensive. It's just too complicated to come up with efficient programming in a cluster environment.

The exact same thing is true of GPUs. People have put tons of effort into working around the cluster problems, and lots of the graphics libraries and interfaces (think OpenGL) are basically the equivalent of MPI. But look at the direction the industry is actually going: thanks to integration, it actually starts making sense to look at tighter couplings not just on a hardware level but on a software level too. Which is why you see all the vendors starting to bring out their "close to metal" models - when you can do memory allocations that "just work" for both the CPU and the GPU, and can pass pointers around, the whole model changes.

And it changes for the better. It's more efficient.

Discrete GPUs are a historical artifact. They're going away. They are inferior technology, and there isn't a big enough market to support them.

Linus

http://www.realworldtech.com/forum/?threadid=141700&curpostid=141714
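To make the "pass pointers around" point concrete, here is a minimal sketch of the model Linus describes, using CUDA's managed (unified) memory as a stand-in (he doesn't name an API; AMD's hUMA is the equivalent idea on the APU side). One allocation is visible to both CPU and GPU, with no explicit staging copies:

```cpp
// Minimal sketch: one allocation, one pointer, shared by CPU and GPU.
// Uses CUDA managed memory as one illustration of the unified model.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;            // GPU writes through the shared pointer
}

int main() {
    const int n = 1 << 20;
    float *data;
    cudaMallocManaged(&data, n * sizeof(float)); // visible to both CPU and GPU
    for (int i = 0; i < n; ++i) data[i] = 1.0f;  // CPU initializes in place

    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);
    cudaDeviceSynchronize();                     // wait before the CPU reads

    printf("data[0] = %.1f\n", data[0]);         // CPU reads the GPU's result
    cudaFree(data);
    return 0;
}
```

The pre-integration version of this is separate host and device buffers with explicit cudaMemcpy staging over PCIe, which is exactly the MPI-style copying the post argues integration makes unnecessary.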
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
whatever linus, there hasn't been a single real integrated design yet.

especially calling the consoles IGPs is weird, they're more like a dedicated GPU with a little CPU leeching off the memory.

I guess it'll be integrated some day, but I'm not holding my breath.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Yes, yes, dGPUs are dying, x86 is dying, desktops are dying, laptops are dying, keyboards and mice are dying, and in 5-10 years consumer computers will be a thing of the past, and then servers will switch 100% to ARM SoCs soon after. We've heard it all before.
 

PCunicorn

Member
Oct 18, 2013
63
0
61
Linus seems like a bit of a dick to me. I do believe dGPUs have a future, maybe not a terribly long one as iGPUs are certainly getting better, but I'm thinking low- and mid-range GPUs have at least a decade of decent life left. I mean, look: Iris Pro graphics are the best iGPU on the market right now, and they are completely destroyed by high-end GPUs, the 770 and up.

I think high-end dGPUs will be on the market for the next 20 years. It would be impossible to have some insane CFX/SLI setup without them. And every time you want a GPU upgrade you would have to upgrade your CPU, which I guarantee will cost a lot (think: if iGPUs become beefy, so will the price, as that's basically a GPU and CPU in one), plus you might even have to upgrade your motherboard.

But, I guess I must just be on drugs.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Linus seems like a bit of a dick to me. I do believe dGPUs have a future, maybe not a terribly long one as iGPUs are certainly getting better, but I'm thinking low- and mid-range GPUs have at least a decade of decent life left. I mean, look: Iris Pro graphics are the best iGPU on the market right now, and they are completely destroyed by high-end GPUs, the 770 and up.

I think high-end dGPUs will be on the market for the next 20 years. It would be impossible to have some insane CFX/SLI setup without them. And every time you want a GPU upgrade you would have to upgrade your CPU, which I guarantee will cost a lot (think: if iGPUs become beefy, so will the price, as that's basically a GPU and CPU in one), plus you might even have to upgrade your motherboard.

But, I guess I must just be on drugs.

The problem is, if we use your example, nVidia can't live on selling the GTX 770 and up only. The market is simply too small.

Also consider the size. Remember that Intel, for example, has a massive node lead. The number of companies that can afford 20nm is shrinking too; Broadcom, for example, already gave up and is staying with 28nm.

And if you start to compare something like a 28nm dGPU with a 14nm IGP, or 20nm with 10nm, then it gets ugly fast.

My GTX 680, for example, on an optimized 14nm process would be roughly 50W and a mere 75mm².
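For reference, the rough arithmetic behind that estimate, assuming ideal area scaling over two full node steps (GK104 in the GTX 680 is 294mm² at 28nm; real shrinks rarely deliver the full ideal factor):

```latex
A_{14\,\mathrm{nm}} \approx A_{28\,\mathrm{nm}} \times \left(\frac{14}{28}\right)^{2}
                  = 294\,\mathrm{mm^2} \times 0.25 \approx 74\,\mathrm{mm^2}
```

Power is harder to pin down, since it depends on voltage and frequency choices on the new node, so the ~50W figure is more of a guess than the area one.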

But the question is what happens when the volume of dGPUs is so small that there is no money in developing new ones. Then we could get 5 or more rebrands of the best high-end cards.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Without saying when it's going to happen, there is nothing new here...

For now, consoles have less than half of what a single-GPU high-end card can offer...

There are a lot of problems to solve before it can happen.
 

Carson Dyle

Diamond Member
Jul 2, 2012
8,173
524
126
But the question is what happens when the volume of dGPUs is so small that there is no money in developing new ones. Then we could get 5 or more rebrands of the best high-end cards.

When the market shrinks, some manufacturers will get out and the remaining ones will be forced to raise prices as they chase limited sales. Doesn't seem like a terribly complicated market dynamic.

Discrete GPUs will not be going away any time soon, but they've become increasingly niche market components.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
whatever linus, there hasn't been a single real integrated design yet.

especially calling the consoles IGPs is weird, they're more like a dedicated GPU with a little CPU leeching off the memory.

I guess it'll be integrated some day, but I'm not holding my breath.

Maybe it will go the other way. Perhaps we will all be running dGPUs, with little CPUs inside. Isn't NVidia going that way in the future?

I guess I just don't see any way around the laws of physics. Given a power supply of 300W, and sufficient cooling, and a given process technology, I just don't see IGPs catching up to the performance of dGPUs. Ever.

But perhaps Linus knows some clever hacks around the laws of physics. I'm sure that he's smarter than I am...
 
Feb 19, 2009
10,457
10
76
As long as there are people who are willing to pay to have the best, there will be discrete GPUs.

All the more true now and in the future as 4K becomes the norm.
 
Aug 11, 2008
10,451
642
126
When the market shrinks, some manufacturers will get out and the remaining ones will be forced to raise prices as they chase limited sales. Doesn't seem like a terribly complicated market dynamic.

Discrete GPUs will not be going away any time soon, but they've become increasingly niche market components.

This seems like the most logical scenario to me. I just don't see how it is going to become financially unfeasible to sell professional-level graphics cards for a thousand dollars on up. And with every new generation of iGPU we hear that it will be the killer chip that does away with discrete, but there is not yet an iGPU from either AMD or Intel that matches a lowly, what, 3-year-old-now HD 7750.

People can project great performance all they want, but I will believe it when I see it. Not to mention that games are becoming more demanding, and discrete GPUs should see a big boost when they finally go to 20nm.
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
Some of what he's saying makes sense, but I doubt dGPUs are going away anytime soon. They'll still be around a decade from now, most likely. 30 years from now? Not so sure.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It's certainly getting closer. The consoles launched with something around midrange performance, and the current Intel CPUs are dramatically better than those before them. But right now what seems to be happening is that as the iGPU doubles in performance, so does the dGPU, because they are both moving forward at about the same pace. There is a reason for that: they both depend on the same underlying silicon process improvements to give them more transistors to make their designs more parallel.

So the current gap from iGPU to dGPU is basically staying the same. Even if the CPU threw all its new transistors into its GPU it wouldn't be able to catch up, and it needs to throw some of its budget at the CPU side as well, so the distance will roughly be maintained. That will remain the case unless something fundamentally shifts: maybe stacked memory will make a big difference, maybe leakage will decrease dramatically at some point so that CPUs aren't so thermally limited, or maybe the low sales of the cheaper, entry-level cards will take their toll on the existing manufacturers.

But discrete GPUs are selling well, and looking only at the progress iGPUs make ignores the exact same improvements, driven by the same process, in discrete cards. Just because CPUs stopped improving in performance does not mean it's happening in GPUs (it doesn't appear to be yet). Eventually they will die, perhaps killed more by the move to mobile devices than anything else, but iGPUs will never catch dGPUs; the distance will be maintained more or less as we see it today.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
What is more likely to happen is that developers find use for a second GPU before the IGP gets good enough to kill the dedicated GPU.

Can't use that iGP to render games if it's busy running physics and other junk...
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
dGPUs aren't going anywhere for a long time. Even the most powerful iGPUs don't measure up to the lowest, cheapest tier of dGPUs. So long as people want to play PC games, a multibillion-dollar industry that grows every year, dGPUs will still be around.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
It's always pretty funny when software guys are put on a pedestal as some sort of authority on where hardware is going. We see the same type of hearsay and conjecture from the likes of Carmack, and even though these guys haven't really done anything truly significant in their respective fields since the '90s, people take their word as some sort of law; it doesn't matter if they get it only partially right or even completely wrong.

At any rate, saying the dGPU is going away is like saying silicon-based computing is going away.
 

dn7309

Senior member
Dec 5, 2012
469
0
76
I don't mind seeing discrete GPUs going away, provided the integrated ones can outperform them.
 

TrulyUncouth

Senior member
Jul 16, 2013
213
0
76
That's not gonna happen anytime soon.

I think it's Linus that should be going away.

Agreed. If anything, I would predict a resurgence in graphics cards as VR and 4K take off. You need a lot of GPU muscle to render a game at >90fps in 2560x1600, or at 4K, in the near future. VR will be a huge driver for graphics card sales.

And Mr. Torvalds is known for shooting off at the mouth half-cocked.
 

xpea

Senior member
Feb 14, 2014
458
156
116
dGPUs are here for at least 10 years. As everybody has said, the problem with iGPUs is performance. The very best (think X1 and PS4) are barely good for 720p/900p, and not potent enough for 1080p. If we talk about the big-volume mainstream models, like the HD 4000, they fail miserably at even the lowest settings at 720p!
But unfortunately for these iGPUs, the bar keeps rising. Now the standard is 1080p (which they can't touch, remember), but the industry is moving fast towards 4K. Japan is even experimenting with 8K now! iGPUs will need at least 6 years to play 4K at medium settings, and then 8K will be knocking at the door... an endless story...
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
If I read this correctly, I see it as software vs. hardware, since iGPUs are built from the same transistors. It's hard to see a 30x30mm GPU being the equal of a 550mm² GPU, and there won't ever be CPUs at 550mm² with 8 billion+ transistors.
Does anyone know the true cost ratio of programming a bare-metal console game versus a PC port that targets a dGPU?
Pick any game, say BF4, in dollars and cents.