Linus Torvalds: Discrete GPUs are going away


Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Mostly, I think it has to do with all of the other things that are on a graphics card than the GPU chip, mostly relating to power delivery. A graphics card can easily out-consume the entire rest of a system as far as power goes - if it were socketed, the motherboard would have to manage that power. There's just probably not enough real estate on a typical motherboard to add all that.

Not really. I was running a Kaveri A10-7850K overclocked and overvolted (by accident!) and it was consuming (whole system: APU + mobo + RAM + SSD) 290 watts at the wall. The mobo was a mid-range ASRock FM2A88X Extreme4+, which has a 5+1 VRM. It was doing so for just a couple of seconds before I noticed it and killed the FurMark and OCCT stress tests. But the fact that it didn't go pop right away says enough.

I was running it OCed so it took 180-200 watts at the wall (system consumption) for long stress-test sessions without any problems.

I imagine a high-end board would have no problem whatsoever with a 200+ watt APU.
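A rough back-of-envelope on what that load means for the board's CPU VRM (the PSU efficiency, rest-of-system draw, and core voltage below are assumptions for illustration, not measured values):

```python
# Rough per-phase VRM current estimate for an overvolted Kaveri under stress.
# Only the 290 W wall reading comes from the post; everything else is assumed.
wall_power_w = 290.0      # measured at the wall
psu_efficiency = 0.85     # assumed PSU efficiency at this load
rest_of_system_w = 50.0   # assumed draw of mobo + RAM + SSD + fans
core_voltage = 1.5        # assumed core voltage under the accidental overvolt
cpu_phases = 5            # the "5" in the board's 5+1 VRM feeds the CPU cores

dc_power_w = wall_power_w * psu_efficiency    # ~247 W delivered by the PSU
apu_power_w = dc_power_w - rest_of_system_w   # ~197 W through the CPU VRM
total_current_a = apu_power_w / core_voltage  # ~131 A into the socket
per_phase_a = total_current_a / cpu_phases    # ~26 A per phase

print(f"APU ~{apu_power_w:.0f} W, ~{total_current_a:.0f} A total, "
      f"~{per_phase_a:.0f} A per phase")
```

Roughly 26 A per phase for a few seconds is plausible for a mid-range VRM to ride out, which squares with the board surviving the spike.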

Sadly, it needs a little hack to disable the Kaveri TDP limit every time the system boots.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Well, yeah, a massively overclocked CPU/APU can out-draw a stock GPU. But you take a high-end graphics card that starts at a TDP of 250 to 275 watts, and then start overclocking it, and you can dwarf what the rest of the system can pull.

The biggest reason for not having socketed GPUs is probably expansion capability. Adding another GPU or three in CF/SLI is easy to do over PCIe. It'd be rather challenging with GPU sockets in a motherboard. Either you don't have them at all and give up all CF/SLI compatibility, or you put in enough of them, plus the necessary supporting power-delivery components, and now you have a stupidly expensive motherboard most people won't fully utilize.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
True, the whole point of expansion slots is to have the ability to expand and scale the system to the user's needs, which is what the PC is all about.
A fixed, all-in-one design has "console" written all over it.

But it's probably where the majority of users are dragging the industry.

Also, 1.55 volts did that ;) Quite a durable piece of silicon, that Kaveri ^^
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
Mostly, I think it has to do with all of the other things that are on a graphics card than the GPU chip, mostly relating to power delivery. A graphics card can easily out-consume the entire rest of a system as far as power goes - if it were socketed, the motherboard would have to manage that power. There's just probably not enough real estate on a typical motherboard to add all that.

So, it's not so much that there's a specific performance reason why going over the PCIe bus is a good idea, but rather one of logistics.

Well, iGPUs also consume power, and that is handled by the motherboard. Adding power phases to motherboards is not too problematic; after all, motherboards already exist that are specced to handle 220W CPUs. I get the feeling the reason it's not done is that it would essentially require a new form factor and a big shift in thinking, but it would enable the tighter coupling that makes APUs so interesting, while keeping the two parts separate for customization/upgrading.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Well, iGPUs also consume power, and that is handled by the motherboard. Adding power phases to motherboards is not too problematic; after all, motherboards already exist that are specced to handle 220W CPUs.


And how much extra space do those motherboards have to handle another 250-1000 watts worth of GPU?

It's not so much that they can't handle the power, but rather that they can't handle more than they're already designed for. I mean, just think about all the stuff that's on a typical graphics card, the external power connectors, the VRMs, the capacitor banks...all of that has to go somewhere, because everything on the motherboard is already being used for the things on the motherboard.
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
Maybe it's a crazy idea, but if the trend is to start thinking of the GPU as a general-purpose processor, we should treat it like one. I tend to think SLI/Crossfire is a special case even among enthusiasts, in which case add-on cards would still be appropriate.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I don't think discrete GPUs are dying. I believe the selection will be shrinking.

IGPs will eat up the low end and most of the midrange GPUs because IGPs are getting powerful enough to play games on decent settings. It takes a ton of R&D money to engineer different GPU models, and since the "Average Joe" will spend his money on a cheaper, cooler, and more efficient notebook with an IGP, less money will come into the hands of discrete GPU manufacturers. Which again means less money for R&D. Which means that Nvidia and AMD will in the future concentrate on the high end that true gamers need to play at high resolution, which IGPs won't ever be able to match given the resources it takes and the fact that games keep getting more complicated and resource-heavy too.

I don't mind this at all. Just leave the high end alone.

In the future we will go from replaceable CPUs to soldered CPUs. From dGPUs everywhere to dGPUs only in the high end while everything else is IGPs. And finally to an all-in-one package where only APUs exist, and you get to choose whether you want more CPU speed or more GPU power from the many different APUs offered by 2-3 companies.
Wouldn't surprise me if AMD comes out strong here because they do both, while Nvidia is bought by IBM or Intel to be able to compete against AMD.
 

tollingalong

Member
Jun 26, 2014
101
0
0
I don't think discrete GPUs are dying. I believe the selection will be shrinking.

This.

Odds are dGPUs will still be around for niche uses, but not as the norm. I think Linus is mostly correct, but it looks like AMD will most likely become a more dominant force if this does happen, given their x86 license.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Well, yeah, a massively overclocked CPU/APU can out-draw a stock GPU. But you take a high-end graphics card that starts at a TDP of 250 to 275 watts, and then start overclocking it, and you can dwarf what the rest of the system can pull.
Midrange GPUs are rarely above 200W stock. Beasts taking spec-limited power will be the last to go.
 

Stable

Banned
Jun 16, 2014
23
0
0
It will all depend on whether x86 remains the standard for hardcore gamers. Right now people casually game on tablets with ARM SoCs, but hardcore gaming on AAA titles remains the domain of x86/Windows and now consoles that are just specialized x86 APUs. A dGPU today is still very necessary for x86 gaming. The very highest-end APUs (4770R/7850K) can barely manage what consoles can, at 3x the price. If x86 remains the gamer staple I really don't see dGPUs going anywhere. Making an APU that matches the highest-end GPU/CPU combos is either impossible or so expensive it's never been done. If people are willing to pay $1,500 for a dGPU alone, why wouldn't Nvidia and AMD keep making them?
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
Remember folks, sub-$100 processors are going to kill off the rest of the CPU industry. Think about it: A Celeron 300A was $150 at release in 1998 and was a bargain processor. $150 in 1998 is $218 today. If Intel can no longer sell the Celeron at $218 and now have to price it at $45, how ever can they sell anything more expensive than it? There mustn't be a market for anything above it. And if there was, AMD owns the market right above it. They've successfully undercut the i5 and i7, so who ever would go for those? You should buy AMD stock since Intel doesn't know what they're doing!

Anyway, this guy's funny. He acts as though die space is free. Does Intel put the Iris Pro in Pentium? Does the A4 have the same GPU as the A10? Most people don't need 3D acceleration, so every mm^2 that goes into gaming performance is profit lost in the sale of an office or Facebook PC. Now, there are undoubtedly advantages to having basic 3D functionality (you don't have someone buying a brand new computer, thinking that means it can play the latest games, loading one up, then thinking they just bought utter crap when it can't even get 1FPS) but to think this extends infinitely is retarded. Just as it isn't economically feasible to give every processor the cache of a Xeon E7, it's not feasible to give every processor a top-tier IGP.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Wouldn't surprise me if AMD comes out strong here because they do both, while Nvidia is bought by IBM or Intel to be able to compete against AMD.
If Denver doesn't suck, nVidia won't need to be bought by anyone. If they can solidify themselves with some high-volume stuff, like tablets, and maintain HPC parts sales at high profits, they'll be fine. The higher-volume, lower-margin stuff is what's been hard for them as of late.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
It will all depend on whether x86 remains the standard for hardcore gamers. Right now people casually game on tablets with ARM SoCs, but hardcore gaming on AAA titles remains the domain of x86/Windows and now consoles that are just specialized x86 APUs. A dGPU today is still very necessary for x86 gaming.
Yes, but not high-end. For every AAA title that costs way too much and needs a high-end GPU to shine, well, Hell, that same AAA title works fine on a $100-150 card, using <100W peak. With on-package fast RAM, would a 150W AMD APU be reasonable? What about a 100W Intel quad-core?

But then, back to the main point, there are also games coming out, thanks to indie pushes and Steam providing cheap publishing, that focus on gameplay more than graphics and use low-resolution art that is cheaper to produce, going for an aesthetic rather than chasing photorealism. I love this movement. Not every one of these games is good, but the good ones don't need tens of millions of dollars to create assets, nor high-end GPUs (high-end CPUs are still often a requirement).

I only know one other gamer, today, that spends more than $150 on their video cards (of course that's partly because I've spec'ed and built his last few PCs, but nonetheless...:)). The details can be lowered, and FXAA/SMAA/etc. applied, making the lower midrange cards fine for 1080p, if you're cool with that. To what degree the Steam hardware survey can be trusted, this doesn't seem to be abnormal (not to say $200-300 cards are abnormal either, but it's not like they are a standard requirement).

The very highest-end APUs (4770R/7850K) can barely manage what consoles can, at 3x the price.
More like 1.5x the price, but I'm not even sure they can compare to the consoles if you're dealing with a GPU-limited game more than a CPU-limited one. Those APUs, for instance, aren't limited on the desktop by TDP, even with hot AMD CPU and GPU designs (compared to Intel and nV), nearly as much as by dual-channel DDR3. They could have 2-3x the GPU performance fairly easily, within AMD's regular TDP limits, assuming fast on-package RAM could be integrated, and the next shrink would allow more die space to be given to the GPU (as has been the trend).
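For a sense of the gap being described, here is the simple peak-bandwidth arithmetic; the bus widths and data rates are the commonly published figures, taken here as assumptions:

```python
# Peak memory bandwidth in GB/s = bus width (bits) / 8 * data rate (GT/s).
def bandwidth_gbs(bus_bits: int, data_rate_gts: float) -> float:
    return bus_bits / 8 * data_rate_gts

# Dual-channel DDR3-2133 feeding a desktop APU: 2 x 64-bit channels at 2.133 GT/s,
# shared between the CPU cores and the IGP.
apu_ddr3 = bandwidth_gbs(128, 2.133)     # ~34 GB/s
# A console-style 256-bit GDDR5 interface at 5.5 GT/s, for comparison.
console_gddr5 = bandwidth_gbs(256, 5.5)  # ~176 GB/s

print(f"dual-channel DDR3-2133: {apu_ddr3:.0f} GB/s")
print(f"256-bit GDDR5 @ 5.5 GT/s: {console_gddr5:.0f} GB/s")
```

That roughly 5x gap is why fast on-package RAM would change the picture more than another TDP bump would.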

Keep in mind that it's about whether the integrated can increase in performance at a faster rate than discrete, for a given total cost, not that there is ever a point where the total performance will have satiated our wants (while that might be possible one day, rasterizers on silicon won't be what does it).
 

Stable

Banned
Jun 16, 2014
23
0
0
Yes, but not high-end. For every AAA title that costs way too much and needs a high-end GPU to shine, well, Hell, that same AAA title works fine on a $100-150 card, using <100W peak. With on-package fast RAM, would a 150W AMD APU be reasonable? What about a 100W Intel quad-core?



But then, back to the main point, there are also games coming out, thanks to indie pushes and Steam providing cheap publishing, that focus on gameplay more than graphics and use low-resolution art that is cheaper to produce, going for an aesthetic rather than chasing photorealism. I love this movement. Not every one of these games is good, but the good ones don't need tens of millions of dollars to create assets, nor high-end GPUs (high-end CPUs are still often a requirement).



I only know one other gamer, today, that spends more than $150 on their video cards (of course that's partly because I've spec'ed and built his last few PCs, but nonetheless...:)). The details can be lowered, and FXAA/SMAA/etc. applied, making the lower midrange cards fine for 1080p, if you're cool with that. To what degree the Steam hardware survey can be trusted, this doesn't seem to be abnormal (not to say $200-300 cards are abnormal either, but it's not like they are a standard requirement).



More like 1.5x the price, but I'm not even sure they can compare to the consoles if you're dealing with a GPU-limited game more than a CPU-limited one. Those APUs, for instance, aren't limited on the desktop by TDP, even with hot AMD CPU and GPU designs (compared to Intel and nV), nearly as much as by dual-channel DDR3. They could have 2-3x the GPU performance fairly easily, within AMD's regular TDP limits, assuming fast on-package RAM could be integrated, and the next shrink would allow more die space to be given to the GPU (as has been the trend).



Keep in mind that it's about whether the integrated can increase in performance at a faster rate than discrete, for a given total cost, not that there is ever a point where the total performance will have satiated our wants (while that might be possible one day, rasterizers on silicon won't be what does it).


If they can fix the bandwidth problem then I agree they could effectively replace the sub-$300 dGPU price point, but heat, space (both physical and die space), and chiefly cost will prevent iGPUs from taking over the gaming and compute high end in the near future. iGPUs have come a long way, but with profits increasing I think it's very premature to declare one will replace the other.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Remember folks, sub-$100 processors are going to kill off the rest of the CPU industry. Think about it: A Celeron 300A was $150 at release in 1998 and was a bargain processor. $150 in 1998 is $218 today. If Intel can no longer sell the Celeron at $218 and now have to price it at $45, how ever can they sell anything more expensive than it? There mustn't be a market for anything above it. And if there was, AMD owns the market right above it. They've successfully undercut the i5 and i7, so who ever would go for those? You should buy AMD stock since Intel doesn't know what they're doing!
There were already fine AMD CPUs when the 300A came out, and had been for years. You just had to be careful about your motherboard selection.

Now, where are the mainstream-socket $1000-and-over CPUs? That's what cheap mainstream CPUs "killed." And they didn't do it by pulling the rug out from under the high-end CPUs. They did it because there was enough performance in, and demand for, the midrange that such fancy CPUs made no sense to make and sell to halfway-sane enthusiasts. The best they can make tends to be maybe 25% faster, stock, in games that love HT, than a nice new Core i5. We're not nearly there yet with GPUs, but that's the future, as transistors and bandwidth get relatively cheaper.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
If they can fix the bandwidth problem then I agree they could effectively replace the sub-$300 dGPU price point, but heat, space (both physical and die space), and chiefly cost will prevent iGPUs from taking over the gaming and compute high end in the near future. iGPUs have come a long way, but with profits increasing I think it's very premature to declare one will replace the other.
Intel and nVidia have already shown they are working on exactly that, with on-package DRAM, which saves pins, PCB space, RAM module costs, and allows for arbitrary bandwidth (however many pads and wires they are willing to implement). Previously, this was the sort of thing the likes of IBM did, as it was crazy expensive. Now, slow RAM on-package is so cheap it's in $50 computers, and high-performance is, from all indications, going to be cheap enough to be viable for mainstream PC parts within the next few years, and affordable for high-end goods much sooner.

Also, it will not only enable more bandwidth for IGPs, but also more cheap bandwidth for dGPUs. Imagine, FI, many hundreds of GB/s for the GPU with local DRAM of maybe 512MB-2GB, and then a 64-bit or 128-bit external interface, where the less-used data goes--since the IO needs will be lighter, the lower bandwidth off-chip won't hurt, but could save money, leaving you to get more performance for your $X than if there were a 256b+ RAM interface and 8+ RAM packages on the PCB. If this stuff works out, it's not like video cards will stagnate for the near future. They will keep getting better for generations, until those worth buying over IGP are just too expensive to maintain their R&D costs, or until CPU/[GP]GPU/RAM integration reaches a point where IGP performs better more often than not (high-end PCs might use multi-socket motherboards, like in ye olden days, FI, instead of having several video cards). If the latter works out, instead of the former, nVidia will have a good chance to keep on laughing all the way to the bank for decades to come.
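To put rough numbers on that split (these interface widths and data rates are hypothetical examples, not anything announced):

```python
# Peak bandwidth in GB/s = bus width (bits) / 8 * data rate (GT/s).
def gbs(bus_bits: int, rate_gts: float) -> float:
    return bus_bits / 8 * rate_gts

# Hypothetical on-package stack: very wide, modest clock.
on_package = gbs(1024, 2.0)        # ~256 GB/s to 512MB-2GB of local DRAM
# Narrow external interface for the less-used data: one 64-bit DDR3-1600 channel.
external = gbs(64, 1.6)            # ~13 GB/s off-package
# Conventional card for comparison: 256-bit GDDR5 at 7 GT/s, 8 RAM packages on the PCB.
conventional_card = gbs(256, 7.0)  # ~224 GB/s

print(f"on-package: {on_package:.0f} GB/s, external spill: {external:.0f} GB/s, "
      f"conventional 256-bit GDDR5 card: {conventional_card:.0f} GB/s")
```

The point is that the wide-but-short on-package link supplies the bulk of the bandwidth, so the off-package interface can be narrow and cheap without hurting much.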
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Remember folks, sub-$100 processors are going to kill off the rest of the CPU industry. Think about it: A Celeron 300A was $150 at release in 1998 and was a bargain processor. $150 in 1998 is $218 today. If Intel can no longer sell the Celeron at $218 and now have to price it at $45, how ever can they sell anything more expensive than it? There mustn't be a market for anything above it. And if there was, AMD owns the market right above it. They've successfully undercut the i5 and i7, so who ever would go for those? You should buy AMD stock since Intel doesn't know what they're doing!

Anyway, this guy's funny. He acts as though die space is free. Does Intel put the Iris Pro in Pentium? Does the A4 have the same GPU as the A10? Most people don't need 3D acceleration, so every mm^2 that goes into gaming performance is profit lost in the sale of an office or Facebook PC. Now, there are undoubtedly advantages to having basic 3D functionality (you don't have someone buying a brand new computer, thinking that means it can play the latest games, loading one up, then thinking they just bought utter crap when it can't even get 1FPS) but to think this extends infinitely is retarded. Just as it isn't economically feasible to give every processor the cache of a Xeon E7, it's not feasible to give every processor a top-tier IGP.

The best part is that he basically tries to argue that the CPU portions of the console APUs are mid-range, so a gaming PC with those APUs is feasible. I think he's a bit confused about who's on drugs.
 
Feb 19, 2009
10,457
10
76
That's faster than a GT 620, around as fast as, maybe a bit faster than, a GT 630, definitely faster than a HD 6450, maybe even in the ballpark of a HD 6570. It takes a $80+ video card to solidly best the IGP, today, and a $100-120 card to really be worth buying over the IGP. 5 years ago, a $40 card could sit and laugh at IGPs, even AMD's, which were pretty good for the time.

You mention price points but failed here. Iris Pro is ridiculously expensive compared to even midrange cards. It's also not available to system builders, only through OEMs, and currently only popular on select Apple models, which carry a massive premium over the non-Iris Pro models.

We'll revisit this discussion in 10 years' time and all the doom and gloomers who claim the dGPU is going to be obsolete will have mud in their faces, and I'll be enjoying my PC gaming on 8K monitors while iGPUs struggle at 4K on everything LOW.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Remember folks, sub-$100 processors are going to kill off the rest of the CPU industry. Think about it: A Celeron 300A was $150 at release in 1998 and was a bargain processor. $150 in 1998 is $218 today. If Intel can no longer sell the Celeron at $218 and now have to price it at $45, how ever can they sell anything more expensive than it? There mustn't be a market for anything above it. And if there was, AMD owns the market right above it. They've successfully undercut the i5 and i7, so who ever would go for those? You should buy AMD stock since Intel doesn't know what they're doing!

Anyway, this guy's funny. He acts as though die space is free. Does Intel put the Iris Pro in Pentium? Does the A4 have the same GPU as the A10? Most people don't need 3D acceleration, so every mm^2 that goes into gaming performance is profit lost in the sale of an office or Facebook PC. Now, there are undoubtedly advantages to having basic 3D functionality (you don't have someone buying a brand new computer, thinking that means it can play the latest games, loading one up, then thinking they just bought utter crap when it can't even get 1FPS) but to think this extends infinitely is retarded. Just as it isn't economically feasible to give every processor the cache of a Xeon E7, it's not feasible to give every processor a top-tier IGP.
+1

Thanks for the entertaining reply. Sad that so many people don't get this.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You mention price points but failed here. Iris Pro is ridiculously expensive compared to even midrange cards. It's also not available to system builders, only through OEMs, and currently only popular on select Apple models, which carry a massive premium over the non-Iris Pro models.
Fine, I'll spend more than 30 seconds on Google:
http://www.techpowerup.com/reviews/Intel/Core_i7_4770K_Haswell_GPU/9.html
OMG! The performance differences are almost the same (because it's basically the same thing)! It needs a GT 640, which is anywhere from $80-100, to be substantially superior (but, unless all you do is Photoshop, you'd be stupid not to get a GTX 750 or GTX 750 Ti), and leaves no use, performance-wise, for ~$50 cards.

We'll revisit this discussion in 10 years' time and all the doom and gloomers who claim the dGPU is going to be obsolete will have mud in their faces, and I'll be enjoying my PC gaming on 8K monitors while iGPUs struggle at 4K on everything LOW.
Regardless of what things are like in 10 years, I'm not sure who the doom and gloomers are. Cost savings from on-package fast memory, ever-increasing efficiency, lower-cost barrier to entry for users, and ever-increasing performance for a given cost, over time, is hardly a case of doom and gloom.

I'll get whatever I need to for whatever I'm willing to pay, to play what's out in 10 years. Heck, by then I might even decide to stop using tilesets for Nethack, while I'm at it :p.

P.S. Look at it this way:
Today, CPUs seem to have gone up about $20 from a few years ago, for about the same thing (i.e., i3 vs. i3, i5 vs. i5). Today, to match Haswell, you'll need about a GT 630. To match a good A10 w/ fast RAM, a GT 640 (some 640s will still have an edge, but not by much).
Players of RTSes, MMOs, etc., previously had to buy a HD 5670, HD 6670, GT 440, etc., just to get by, at low settings.
Those two price differences combined make for about $50 less. While Haswell and A10s can technically play some single-player action games, I am trying to be realistic. Well, $50 can get you...
From a Pentium to an i3 or A10.
From an i3 to an i5.
From a stock i5 to an overclockable i5 K (just barely, with CPU, HSF, mobo), or stock Xeon E3-123xVy.
From that i5 K or Xeon E3 to a stock i7.
From a bare-bones mobo and 4GB RAM stick to a fast 8GB kit (important for IGP) and mobo with decent IO.
Almost from 8GB RAM to 16GB RAM, should you do more than game and have a use.
From a 1TB HDD to a 256GB SSD.
From a 2TB HDD to a 120GB SSD and 1TB HDD.
From a cheap PSU on sale to a nice one not on sale.
From a cheap case to a case you'd fall in love with.
From your old beat up mouse to a new one.
A value headset, if you could use one and don't have it.
...and so on.

Better IGP means, for those capable of using it, better overall value. These people already play on low, and think people spending $500+ on video cards are wasting their money (I sure would be, but I couldn't manage with what they put up with, at least not for very long :)). Now, 4K we'll just have to wait and see about: today, no mainstream card has even 1/4 of what it will take to handle 4K (a GTX 770 can be pretty much saturated at 1080p today), so I doubt that will come down from being high-end for some time.
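The pixel arithmetic behind that 1/4 figure, as a quick sketch:

```python
# 4K pushes four times the pixels of 1080p, so a card already saturated
# at 1080p needs roughly 4x the throughput for the same settings at 4K.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_4k = 3840 * 2160     # 8,294,400
print(f"4K / 1080p pixel ratio: {pixels_4k / pixels_1080p:.1f}x")
```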

Meanwhile, the same tech advances that could enable even faster IGPs, making that more like $75 or $100 over the next several years, will largely not affect you, or me, buying more expensive video cards, for quite some time, in the worst case. In the best case, they will mean even better cards for the money. Likely, nV and AMD will stop making the very lowest-end chips, and make cheap cards with salvage dies (I'm pretty sure today's GT 620 and GT 630 are just that, anyway). Also, since nV and AMD both know this is the way of the future, they will be planning for such things, and have already been shifting their R&D to account for it. By making the integrated and discrete GPUs use the same designs, just lightly tweaked (in nV's case, mobile, PC AIB, and server designs), most of what benefits the next gen's tablet benefits the next gen's Titan equivalent, since high-end GPUs are very much TDP and space limited.

It's not a future vision of graphics going to pot, but of being able to put higher-performance GPUs on the CPU than can be put on a card for the same cost to the buyer, enabled by being able to, relatively cheaply, get around the 15+ year old bandwidth limitations imposed by off-chip RAM interfaces. The die space to dedicate to it alone will take a shrink or two, only to still not be enough to beat midrange cards. So it's not something that's going to happen overnight, even if the technology to do it is as successful and affordable as it is alleged to be. All the companies involved will have plenty of time to scheme and adapt, including AMD finding some way to keep getting cash, for fear of an Intel monopoly :).
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
There is obviously some truth in what Torvalds is saying, but what everyone has to understand is the timeframe. It isn't right now; it's what will happen in the future, so I wouldn't read into it too much. The market is shifting towards mobility, and there is no doubt that iGPUs are eating into low-end discrete sales, which in turn hurts the R&D efforts of both AMD and Nvidia; that is why Nvidia branched out into mobile and the professional markets (Nvidia is obviously doing incredibly well in the pro market) and AMD is attempting to create a value proposition with their APUs. As far as mobile APUs go (note the distinction, MOBILE), the main problem I see is that their performance per watt isn't up to snuff against Intel, since Intel has 7W full-blown Core mobile CPUs, and AMD APUs bring too much compromise in CPU performance. Customers see mobile APUs from AMD as a value product only, and OEMs refuse to create good designs with good displays and SSDs, because mobile APUs either have poor CPU performance or, where CPU performance is good, poor battery life. Unfortunately for AMD, the only good ultrabook and mobile designs have Intel CPUs, because ultimately OEMs do not want to compromise on CPU performance. AMD, for their part, is trying very hard, though. AMD is making an effort to branch out just like NV is, and that is precisely what they need to do.

Despite the inevitable happening years down the road, I don't think the dGPU is in imminent danger. We're talking about something that will last probably at least 5 more years, though the scale and volume of sales will shrink over that time. Of this there is absolutely no doubt. R&D budgets for full-blown dGPUs will continue to shrink, iGPUs will continue to get better and better, and ultimately iGPU is all about mobile. And mobile is the CRUX of the market right now, not desktop, and this is why the iGPU is so important. iGPUs bring a level of performance per watt that no mobile dGPU can touch, not even close really. And that's why, yes, iGPUs will eat into discrete GPU sales more and more over time.

But, again, I wouldn't read into it too much. We're talking about something years down the road. It is inevitable - iGPU *will* eventually catch up to low-end and midrange mobile dGPUs. And this is a big deal because the market is shifting to mobile, all-in-ones, and smaller and smaller form factors such as Brix Pro type devices. And sorry to say, a 300W discrete PCIe GPU isn't going to cut it for something that small. And these devices will proliferate more and more in coming years; there is no doubt about this. But... NOT imminent. Not overnight. The way I see it, full-blown desktop GPUs still have a good several years left. This is not something that is immediate. If I had to guess, 3-5 years down the road is when this will be a huge deal. Right now? PCIe GPUs still have a lot of life left, even though sales are way down compared to 2-3 years ago.
 


VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
Remember folks, sub-$100 processors are going to kill off the rest of the CPU industry.

Well, ARM and Atom may kill off x64 Core CPUs, at least at the low end. There is talk that Intel will kill off Core-based Celeron and Pentium CPUs, and fill those low-end brands with purely Atom CPUs.

Whether this will cause Intel to slowly kill off their higher-end CPUs would be interesting to watch, since it's approximately equivalent to what's happening between "good enough" IGPs and the best, or at least better, dGPUs. ("Good enough" Atoms eliminating the market for Core-based CPUs, first at the low end, then... the high end too?)
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
You didn't get what Linus was saying. You can't separate the CPU and GPU any more as simulation models get more realistic; it's just a matter of time before the iGPU is a better alternative than a separate GPU.

The future will be massive iGPUs with HBM memory; the only thing holding iGPUs back today is lack of memory bandwidth, and that problem is solvable.

But it's still stupid to say that. We are more likely to see a GPU-CPU, that is, graphics with a small CPU attached, rather than CPUs with attached GPUs.

GPUs are way more powerful and flexible than CPUs.

And I still don't see it happening anytime in the next 10 years.

I mean, we are still operating on 3-year-old GPU technologies and the same 4-year-old CPU technologies.

From the 2000-series CPUs to today's 4000-series there are only minor differences.

The big change was from the Q-series CPUs to the i7-900 series, and then there was quite a big difference with the 2000 series as well.

So things are moving slowly.