Linus Torvalds: Discrete GPUs are going away


Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
As noted, it's not exactly a win if low-end GPUs only stay ahead of the iGPU because the relevant chunk of the dGPU market gets deleted ;)
(However small or large that chunk is.)

The next target is something sensibly gameable at 1920x1080, so 750 Ti/270-class cards. Not sure when that'll happen. Broadwell K, even?!
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
To me it all depends on what you will call a dGPU in the future. The reason we have dGPUs now is that the cost of replacing an entire system is far higher than the cost of replacing the dGPU. The price level hardware sits at comes from what consumers want relative to the market's constraints. The reason everyone doesn't buy the cheapest prebuilt systems is that they don't do enough. Every year hardware gets faster, and it never seems to be fast enough, because every year we want more; if you look at the trends, we expect more from our hardware every year. Where most people get confused is when they look only at the PC market.

When the iPhone came onto the market, feature phones became less popular. If you looked only at the feature phone market, the hardware got cheaper, because demand pushed it there. But if you look at smartphones and feature phones as one market, you see devices getting faster and faster each year. Consumers are demanding more power, and eventually you will likely see the same thing happen with smartphones that happened with PCs. PCs started out being built by hand, then quickly became all-in-one systems with no assembly required. It was not until the market grew enough that niche demand allowed producers to make systems you could upgrade. When a market grows and gets established, the ability to upgrade appears. Cell phones did this with cases, memory cards, external batteries, etc.

If the dGPU is going to die, it's in the PC market, and only because there is not enough demand. But when it goes away in home computing, it's likely to appear in other electronics such as cell phones or other handheld computing devices.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Would be kind of ironic if the dGPU is the thing that finally gets Linux some mainstream success with SteamOS/Steambox
 

Scarpozzi

Lifer
Jun 13, 2000
26,391
1,780
126
You can't follow the industry numbers, though. Laptop/desktop sales have been in steady decline since tablets and mobile devices replaced them for the average user.

Power gamers and graphics designers (not on Macs) will always want the upgradability... even though the cost may be greater. I can see the need to move heat off the motherboard and put additional fans on daughterboards. At the same time, the question becomes: "Is it cheaper to upgrade the video card or simply replace the whole PC?" Years ago, I assumed PCs would go the way of the VCR and become more disposable to drive the industry... much the same way cars are disposable to people in first-world countries. It's a very wasteful way to use technology and money.

Outside of graphics, I don't see a need for GPUs.
 

Gunbuster

Diamond Member
Oct 9, 1999
6,852
23
81
Surrrrreee, the IGP that has been increasingly optimized for "good enough" mobile performance and low power is suddenly also going to match what a dGPU accomplishes with a 300+ watt power and cooling budget.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
I mean look, Iris Pro graphics are the best iGPU on the market right now, and they are completely destroyed by high end GPUs, 770 and up.

Someone else, going off your comment without doing any research, took that to mean it takes a 770 to beat it. I would like to point out that the fastest Iris Pro cannot even break 20 fps in Crysis 3 on medium at 1080p.

A 640 destroys the thing.

I won't pretend to know the future, but anyone who says that dGPU is dead in 10 years must know something I don't.

4k resolutions are going to push for bigger and better cards. There really is nothing that an iGPU can do better than a dGPU at this point.

Comparisons to consoles are largely moot, as the reality of the iGPU in PCs is quite different. Consoles have a specialized board, they are not modular, and the cost for the hardware you receive is not beneficial to the consumer. Even with these proprietary parts, the systems don't match most gaming PCs, and I am not talking about the machines most enthusiasts on this board have. I am talking about 660s and 7850s. Maybe even lower.

Here is the latest and greatest Iris Pro! 18fps Medium settings.
http://us.hardware.info/reviews/518...ics-benchmarks-igpu-crysis-3-1920x1080-medium

If iGPUs get to the point where they can run Crysis at 4K on high settings, I am in. I don't see that happening for a long while. dGPUs have a huge head start. Process nodes can only shrink so much; we are going to hit a point where improvements have to come from other places, and with the space limits inherent to an iGPU, that is going to be their biggest roadblock.

To be fair, I am not the standard consumer. I currently am running two 670s in SLI at 5760x1200. I will be buying two new cards from the upcoming generation.

That being said... I HOPE iGPUs get to a point where they are really good for most people. Even though that would drive up the cost of the high-end cards I enjoy, it would mean more attention for PC gaming: more games specifically targeted at PCs and better ports. I can live with a price increase on my cards if it means a better overall experience in the long run.
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
Linus Torvalds also believes TRIM for SSDs is an unnecessary hack. I'll trust Anand over Linus regarding PC hardware all day.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I read through the first 2 pages, but not the last 2. Hopefully I didn't miss too much.

If the market share of dGPUs shrinks enough, and it is already shrinking, software developers will stop catering to the dGPU and focus on making their games run on IGPs/APUs. It does not matter how much faster the dGPU is if software isn't developed around needing that speed.

This isn't about whether dGPUs will always be that much faster. It is about market size. Eventually the dGPU market will be too small for most developers to worry about, and games will be developed for lower-end machines that people can easily play on an IGP.

And if you haven't noticed, the more advanced games get, the less impressive each improvement looks. I'm not sure how many more of these small advancements we are going to care about.

No one is saying that dGPUs are dead now, or even in the near future. All that is being said is that this is the direction we are headed.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
dGPUs will continue to coexist with CPU+GPU solutions the way CPU-only parts like Intel's Extreme series and AMD's FX coexist with them today, only the segment will keep shrinking until it becomes a truly niche part of the market (we can't call today's dGPUs a niche yet). They will probably end up as halo products only, like today's Titan Black/290X (TX?), and those will be scaled up from the mobile solution, not the other way around.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Outside of graphics, I don't see a need for GPUs.
Outside of graphics, I don't see a need for graphics processing units.
 

cytg111

Lifer
Mar 17, 2008
25,661
15,160
136
Outside of graphics, I don't see a need for graphics processing units.

- Heh, I see what you did there.
I see it like this: 640K is never enough. If there is more, it will get used, and the fact is that you can burn off more calories/watts/instructions if you spread your compute units some space apart (CPU, GPU, FPGA, whatever). That's why graphics is never going 100% IGP.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Linus should go back to waiting for the Great Pumpkin.

Linux is a great OS but that's what he knows. He's no more an expert on the gaming, CAD, and visualization markets than any other random Joe or Snoopy.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
Computational engineer here - nope, dGPUs aren't going anywhere. In fact, if you're a new engineer in computational work and want to find a job quickly, learn GPU compute. My C++ CFD and diffusion codes run over 10X faster on my Titan than on my 3770K using OpenMP.
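
To give a sense of the kind of loop I mean (a toy sketch, not my actual solver): one explicit step of 1D diffusion, OpenMP-parallel on the CPU. Every cell update is independent, which is exactly why the same stencil maps so well to one CUDA thread per cell on the Titan.

Code:
#include <cstddef>
#include <vector>

// One explicit finite-difference step of 1D diffusion:
//   u_new[i] = u[i] + alpha*dt/dx^2 * (u[i-1] - 2*u[i] + u[i+1])
// u and u_new must be the same size. Build with -fopenmp.
void diffusion_step(const std::vector<double>& u, std::vector<double>& u_new,
                    double alpha, double dt, double dx) {
    const std::ptrdiff_t n = static_cast<std::ptrdiff_t>(u.size());
    if (n < 2) return;                      // nothing to update
    const double c = alpha * dt / (dx * dx);

    // Each interior cell is independent: trivially parallel on CPU cores,
    // and on a GPU this loop body becomes the kernel, one thread per i.
    #pragma omp parallel for
    for (std::ptrdiff_t i = 1; i < n - 1; ++i) {
        u_new[i] = u[i] + c * (u[i - 1] - 2.0 * u[i] + u[i + 1]);
    }

    u_new[0] = u[0];                        // hold simple fixed boundary values
    u_new[n - 1] = u[n - 1];
}

The 10X-plus speedups come from exactly this shape of code: lots of independent arithmetic per byte of data, which is what a dGPU is built for.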

That, and we still have mining right? :p
(Actually, mining is the perfect example of GPU compute capability and why dGPUs are not going anywhere)

Now, whether we will still have the option to buy a $1000 GTX Titan that can function as a compute/gaming card, versus a $5000 K6000, is a different story. But at the rate things are going, I'm not going to have to worry about upgrading my Titans, already a year and a few months old, for another 3 years.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
The way I see it...

The PS4 and Xbox One (and, regrettably, iOS/Android) are now the primary game development platforms, so dGPUs will likely start to go away once PC APUs reach performance parity with the fastest console (the PS4).

Once parity of speed is reached, the gaming experience will be generally matched, and we'll have gone as far as we need to for the general consumer (who makes up the bulk of PC sales).

Remember, integrated graphics are much closer to the capabilities of the PS4 now than integrated graphics in 2005 were to the Xbox 360 (which had the fastest GPU in a console at the time).

So if parity can be reached within a few years (maybe sooner, thanks to bare-metal coding, stacked memory, smaller processes, etc.), it's game over for dedicated GPUs. And that's before counting the benefits of an APU over a dGPU that Linus mentions (cache coherency, etc.), which will help with non-gaming applications.

Yes, there will be a niche of gamers who demand native 4K at 120 FPS or enough power for VR, but I doubt those segments will be large enough to drive continual sales of expensive dGPUs.

There will always be a market for dGPUs (just like dedicated sound cards or NICs), but it'll be a fraction of what sells today, and I say good riddance.

I look over at my shelf of outdated computer parts and have to chuckle. I have tons of add-in cards (sound, USB, SATA, low-end video, etc.) that made sense when we had hulking ATX tower PCs with multiple slots, but that's an outdated model which is going away, and low- to mid-range video cards are the next casualties. They are the next components that will end up in my graveyard of computer parts, put out to pasture along with my collection of Aureal 3D and Sound Blaster cards.

There are a few upsides to all of this mass integration of hardware components: it's allowing us to use smaller form factors, which are generally more energy efficient.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Well, the PS4 is built around an AMD APU, of course. They could presumably do something faster if they ever shipped a desktop one with the same memory, or with a cache like the 360's/Iris Pro's.

I'm slightly surprised they haven't tried that, actually, just because it would make their APUs make so much more sense. I suppose they just don't have enough of the higher-end PC market to justify the added cost. So we'll be waiting for stacked memory, etc.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Except, the equivalent mid-range dGPU always increases.
$200-250 before they tail has been pretty constant, ever since my Voodoo2, with the occasional $300 in the mid 00s when RAM prices were insane.

5 years from now, APUs & Intel CPU will probably match a GTX760 or R280 class performance. Would it suddenly be powerful enough to obsolete dGPU? Nope. 5 years from now, mid-range would be much much more powerful and 4K will be the norm, at which point a 760 or R280 class APU still won't be able to handle it.
That is known, and accepted. That is also why I mentioned IGP killing a price point, not a static performance metric.

The whole idea of iGPU replacing low end is true, sure, but the new dGPU low end is still faster than iGPU
No, it's not.
http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/11

That's faster than a GT 620, around as fast as (maybe a bit faster than) a GT 630, definitely faster than an HD 6450, maybe even in the ballpark of an HD 6570. It takes an $80+ video card to solidly beat the IGP today, and a $100-120 card to really be worth buying over the IGP. Five years ago, a $40 card could sit and laugh at IGPs, even AMD's, which were pretty good for the time.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Linus Torvalds also believes TRIM for SSDs is an unnecessary hack. I'll trust Anand over Linus regarding PC hardware all day.
You also didn't read why. He's right there, too. TRIM should have been carefully thought out and implemented right, or they should have just done something like define zeroes as clearable. It could have and should have been done better, and because it wasn't, all the software guys are stuck supporting the original implementations for decades, even if every SSD out by, say, 2016 has deterministic queued TRIM. There's also a GC argument to it, but luckily that one has basically been solved.
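
For anyone wondering what the software side of TRIM actually looks like, here is a minimal sketch (assumes Linux, typically needs root; it mirrors what the fstrim(8) utility does via the FITRIM ioctl, nothing SSD-vendor specific):

Code:
#include <cstdio>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/fs.h>   // FITRIM and struct fstrim_range

int main(int argc, char** argv) {
    // Open any path on the mounted filesystem we want to trim (usually its mountpoint).
    const char* mountpoint = (argc > 1) ? argv[1] : "/";
    int fd = open(mountpoint, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    fstrim_range range{};       // start = 0
    range.len = ~0ULL;          // cover the whole filesystem
    range.minlen = 0;           // no minimum extent size (kernel may round up)

    if (ioctl(fd, FITRIM, &range) < 0) {
        perror("FITRIM");       // fails if the fs or device doesn't support discard
        close(fd);
        return 1;
    }
    // On success the kernel writes back how many bytes were actually discarded.
    std::printf("Trimmed %llu bytes\n", (unsigned long long)range.len);
    close(fd);
    return 0;
}

The filesystem walks its free space and sends discard commands for it; everything below that is exactly the per-drive implementation behavior the software guys are stuck accommodating.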

wasnt there something about linus bashing this dude for making the kernel more user friendly some time back (we know how linus bashes(meeeh), it would run up infractions here)
https://www.google.dk/search?q=Why+...rome..69i57&sourceid=chrome&es_sm=93&ie=UTF-8

Clearly not a visionary "year of the linux desktop" kind of guy. (and then who needs discrete??)
That guy is just as much of an egotistical ass as Linus Torvalds is. He's also a very good developer, and he came back, but this time without trying to get into mainline at all (though he seems to be in my phone! :D). On anything less than a modern 2C4T CPU, I make a point of getting a kernel with the CK patches. I personally had to deal with quite a few non-fatal scheduler bugs (pauses, stuttering, a process pegging a CPU for seconds at a time) prior to BFS. Not only is BFS awesome, but by not exhibiting those bugs it got the mainline guys to find and fix them, so the default scheduler became better. Being Free doesn't make competition any less good to have.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
Yes, there will be a niche of gamers who demand native 4K at 120 FPS or enough power for VR, but I doubt those segments will be large enough to drive continual sales of expensive dGPUs.

I debated long and hard with myself entering this discussion, because we've already had it several times... :hmm:

The thing is that dGPUs have been a niche market ever since Intel introduced Intel "Extreme" Graphics with the 810(E) chipset 15 years ago. I simply don't see that changing any time soon. Almost every corporate desktop I've seen since has had some form of Intel graphics.

IGPs might take out the low end of the market, but definitely not the $100+ segment. Also, not every desktop is outfitted with a top-of-the-line IGP. In fact, most still use the anemic (but still useful) GT2 configuration (HD 4200/4400/4600). If we go further back, Intel pushed the really anemic HD 2000/2500 for desktop use, with the slightly better HD 3000/4000 reserved for "special" xxx5 and K-series CPUs at a price premium.

my 2c worth of opinion... :)
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
That's a good point. Perhaps dGPUs are sort of a niche already, and with iGPUs/APUs advancing, that niche will continue to get smaller.

About Intel's bottom-feeding GPUs: they are light years ahead of what Intel used to offer and can mostly play modern games at minimum details and 720p, which is something that just wasn't possible 10-15 years ago. Just google Microsoft Surface Pro gaming to get a sense of where they're at.

Because of heterogeneous computing (I hate that term) and advances with OpenCL etc., I think Intel and AMD are spending more money advancing the GPU than the CPU, so bottom-end x86 processors will keep getting better (and faster) for gaming than before. As the majority of these on-die GPUs catch up to the new consoles, the need for dGPUs will simply start to vanish.

 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Building an APU with bleeding-edge graphics, a bleeding-edge CPU, and the memory bandwidth to make it useful would cost a **** ton. I think killing the dGPU would take a further evolution of the APU, or perhaps something like Intel's x86 GPU approach, where the main processing cores and graphics cores can fundamentally be programmed using the same ISA, with some cores having much wider SIMD. IIRC, Intel's problem was that they still needed dedicated texture units and ROPs to hit performance targets. AMD has the right direction in mind with HSA; they are just horrible at execution, and there is no real market for a massive APU that may not have the right balance of CPU and GPU for its intended market, or whose market is too small to justify the research and development cost. Sure, the APUs in the Xbone and PS4 make decent examples, but they are not bleeding edge in either x86 or graphics performance. However, they do have good memory configurations that make good use of their capabilities.
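
To make the "same ISA, just wider SIMD" idea concrete, here's a toy sketch using ordinary AVX2/FMA intrinsics on a normal x86 core (my own illustration, nothing Larrabee-specific). Both functions do the same y += a*x work; the second just chews 8 floats per instruction, and a throughput-oriented core would simply widen that further.

Code:
#include <immintrin.h>  // AVX2 / FMA intrinsics (build with -mavx2 -mfma)
#include <cstddef>

// Scalar version: y[i] += a * x[i]
void saxpy_scalar(float a, const float* x, float* y, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        y[i] += a * x[i];
}

// Same ISA family, wider SIMD: 8 floats per fused multiply-add.
void saxpy_avx2(float a, const float* x, float* y, std::size_t n) {
    const __m256 va = _mm256_set1_ps(a);
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        _mm256_storeu_ps(y + i, _mm256_fmadd_ps(va, vx, vy));
    }
    for (; i < n; ++i)          // scalar tail for leftover elements
        y[i] += a * x[i];
}

That was the pitch as I understood it: keep x86 everywhere and let the "graphics" cores just be very wide versions of this, with the fixed-function texture/ROP hardware being the part that didn't go away.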

Not considering feature-set and architecture improvements, we had dedicated GPUs on par with Kaveri six years ago (the Radeon 4870). And I'm talking about the top-end Kaveri variant that will be the most expensive, not the ho-hum 384-GCN-SP version that is certainly more ubiquitous. That 5+ year gap will only go away when the market isn't big enough for dGPUs to be profitable.
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
Coming out of left field with this, but I have wondered for years why the GPU has to either be way over on a card in a slot connected by an expansion bus (a paradigm held over from the 80s) or as part of the CPU die. Why can't there be two sockets on the motherboard, one for GPU and one for CPU, with GDDR5 between them, minimizing trace lengths and enabling tighter coupling?
 

Mand

Senior member
Jan 13, 2014
664
0
0
Coming out of left field with this, but I have wondered for years why the GPU has to either be way over on a card in a slot connected by an expansion bus (a paradigm held over from the 80s) or as part of the CPU die. Why can't there be two sockets on the motherboard, one for GPU and one for CPU, with GDDR5 between them, minimizing trace lengths and enabling tighter coupling?

Mostly, I think it has to do with everything else that is on a graphics card besides the GPU chip, mostly relating to power delivery. A graphics card can easily out-consume the entire rest of a system as far as power goes - if it were socketed, the motherboard would have to manage that power, and there's probably not enough real estate on a typical motherboard to add all of that.

So it's not so much that there's a specific performance reason why going over the PCIe bus is a good idea; it's more a matter of logistics.