Mass Integration of the CPU?

ImIcarus

Junior Member
Sep 12, 2009
10
0
0
I was reading up on Intel's Jasper Forest recently, and I had a thought. More of a question actually.

If computer hardware makers (especially Intel right now) keep integrating parts of the motherboard into the CPU, is there a possibility that we may see the disappearance of motherboards in computers altogether?
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Still need someplace to plug the cpu into for power and I/O.
Memory and BIOS aren't going to be incorporated either.
 

Eeqmcsq

Senior member
Jan 6, 2009
407
1
0
I was just thinking about what would happen if eventually the northbridge and/or southbridge chips get completely integrated into the CPU. The more I thought about it, the more I realized that it could be a compatibility nightmare if different CPUs contained different chipset features.

I'll use AMD as an example. Suppose AMD's Bulldozer completely eliminates the northbridge and southbridge chips and everything is integrated on the CPU. Suppose this Bulldozer supports 2 PATA channels for a max of 4 PATA devices. So you build your computer and you hook up all 4 PATA devices. So far so good.

Now, suppose AMD releases Bulldozer II, which drops 1 of the 2 PATA channels but is otherwise identical to Bulldozer I. As you look to upgrade your CPU, you suddenly realize that you CANNOT upgrade to Bulldozer II, because you would lose 2 of your 4 PATA devices. Now you're either stuck with Bulldozer I, or you have to get rid of 2 of your PATA devices. Had the PATA channels been part of a separate chipset, you could have kept your 4 PATA devices AND upgraded the CPU, as we do today.

Though the above was a crude example, do you think this is a possibility as more and more features get integrated into the CPU?
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
System-on-a-chip will be reality sooner than you think. If not for desktops then for mobile devices where expandability is right up there with a 50kg battery on the priority list. All processor makers have embraced this vision.

USB3 will handle any and all peripheral needs for the netbook and commodity desktop niches.

Enthusiasts will end up hacking up server hardware, assuming there is any consumer software to make use of the added power.

In the grim future of ubiquitous, dirt cheap computing there are only gaming consoles.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Eeqmcsq
I was just thinking about what would happen if eventually the northbridge and/or southbridge chips get completely integrated into the CPU. The more I thought about it, the more I realized that it could be a compatibility nightmare if different CPUs contained different chipset features.

I'll use AMD as an example. Suppose AMD's Bulldozer completely eliminates the northbridge and southbridge chips and everything is integrated on the CPU. Suppose this Bulldozer supports 2 PATA channels for a max of 4 PATA devices. So you build your computer and you hook up all 4 PATA devices. So far so good.

Now, suppose AMD releases Bulldozer II, which drops 1 of the 2 PATA channels but is otherwise identical to Bulldozer I. As you look to upgrade your CPU, you suddenly realize that you CANNOT upgrade to Bulldozer II, because you would lose 2 of your 4 PATA devices. Now you're either stuck with Bulldozer I, or you have to get rid of 2 of your PATA devices. Had the PATA channels been part of a separate chipset, you could have kept your 4 PATA devices AND upgraded the CPU, as we do today.

Though the above was a crude example, do you think this is a possibility as more and more features get integrated into the CPU?

Presumably such incompatibilities would be comprehended when they decide to name the chip as either belonging to a succession of processors (BD II in your example, AM3/AM2+ compatible PhII 9x5 X4 vs. AM2 compatible PhII 9x0 X4 for a recent example) versus deciding to give it a new processor family name and platform (Lynnfield versus Bloomfield).

It's possible, of course it's possible, but backwards compatibility is rarely guaranteed anyway. Few things give us assured backwards compatibility; the ISA does (x86, SSE, 3DNow, etc.), but even that isn't guaranteed, it's just something we all assume we can take for granted. I have games that will only run on Win95, for example; they refuse to run in Win95 compatibility mode in WinXP. But then again WinXP is called WinXP and not "Win95 Ver 4.0" or some such that would ever give me an expectation of backwards compatibility.

It's bound to happen, something akin to the scenario you outlined; I think it would be odd if it didn't at some point.
 

ImIcarus

Junior Member
Sep 12, 2009
10
0
0
Hmm, if many of the things in a desktop are integrated, and the rest of the components that can't be integrated are shrunk down, could we see netbooks, laptops, and desktops all just blend into one category of "PCs"?

It sounds somewhat ridiculous to me... but then again, plenty of ridiculous things in the hardware industry do happen, so I guess nothing is impossible with computer tech.
 

Eeqmcsq

Senior member
Jan 6, 2009
407
1
0
Originally posted by: ImIcarus
Hmm, if many of the things in a desktop are integrated, and the rest of the components that can't be integrated are shrunk down, could we see netbooks, laptops, and desktops all just blend into one category of "PCs"?

Not unless desktops come with battery packs. Otherwise, I think there will still be a distinction between laptops/netbooks and desktops. Also, desktop standards are pretty entrenched right now. ATX, microATX, even miniITX will take a long time to move away from. That's not to mention the add-on cards that go into PCI or PCIe slots. Those standards aren't going anywhere.

I'm sure there will be all-in-one desktop products available. In fact, we have a couple of them nowadays called "nettops". But desktop PCs as we know them will be around for a while.

 

ImIcarus

Junior Member
Sep 12, 2009
10
0
0
Hmm "nettops"... I wasn't aware of those. I need to do more research. Thanks for the info!

Well, computer component makers are generally moving toward using fewer watts and less energy (I don't know if this actually works, the "use less energy" direction...). So, if makers actually do succeed in progressively shrinking the amount of power needed to run a desktop, you wouldn't need that big of a battery pack, would you?

But I see your point how desktops will probably stay for a while though. Standards, standards, standards...
 

Eeqmcsq

Senior member
Jan 6, 2009
407
1
0
It's bound to happen, something akin to the scenario you outlined; I think it would be odd if it didn't at some point.

Yeah, but I was concerned with scenarios where low end chips have an integrated low end chipset that requires a low end motherboard, while high end chips have an integrated high end chipset that requires a high end motherboard. Then there's more market fragmentation, as low end motherboards can't support high end chips.

Then again, it's possible that separate CPUs and chipsets will coexist with integrated CPU-chipsets. Then it's all a matter of choice and preference for the consumer.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Originally posted by: Eeqmcsq
It's bound to happen, something akin to the scenario you outlined; I think it would be odd if it didn't at some point.

Yeah, but I was concerned with scenarios where low end chips have an integrated low end chipset that requires a low end motherboard, while high end chips have an integrated high end chipset that requires a high end motherboard. Then there's more market fragmentation, as low end motherboards can't support high end chips.

Then again, it's possible that separate CPUs and chipsets will coexist with integrated CPU-chipsets. Then it's all a matter of choice and preference for the consumer.

One thing that you should realize: CPU manufacturers are lazy (well, not really, but you will see why I am saying this). Intel doesn't want to need new manufacturing plants for every CPU they make; they want to be able to make a generalized version of the CPU and bin it later.

Think of AMD's IMC. From their top high-speed chip down to their bottom low-speed chip, the circuitry in the IMC is exactly the same. There is no special low-end IMC and high-end IMC, they are all the same.

Oversaturation of products is a bad thing for any company, so I doubt that Intel will add another 30 different models just to distinguish chipset capabilities. More likely, the capabilities will be much like instruction extensions (SSE), and be a selling point rather than a distinguishing point.
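
To illustrate what I mean (just a rough sketch assuming a Linux box, nothing official): software already copes with per-CPU feature differences by checking flags at runtime rather than hard-coding by model number, and integrated-chipset capabilities could be advertised and detected the same way.

```python
# Rough illustration (assumes Linux): read the feature flags the kernel reports
# for the installed CPU, the same way software already detects ISA extensions.
def cpu_flags(path="/proc/cpuinfo"):
    """Return the set of feature flags reported for the first CPU."""
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for feature in ("sse2", "sse4_1", "sse4_2"):
    print(feature, "supported" if feature in flags else "not supported")
```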

*edit* BTW I do see something like this happening in the future. I see the CPU swallowing up most of the chips on the motherboard and just handling everything itself, rather than having to offload information. The things I don't see going into the CPU are the video card (sorry, it's just too complex and power hungry; I doubt we will see a CPU with a 400W TDP any time soon), RAM (again, just too big), and of course things like the BIOS.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Originally posted by: Phynaz
Memory and BIOS aren't going to be incorporated either.

I would think that both of these are inevitable. Portions of BIOS are already integrated in some respects on high-end server CPUs (PAL on the Itanium CPU, and IBM has a similar approach on Power CPUs).

And memory is coming.
http://news.cnet.com/8301-13556_3-10316305-61.html (near the bottom of the page are a couple of paragraphs on IBM embedding eDRAM onto the Power7). It's only 32MB and they are calling it cache, but combine "Moore's Law" with the fact that server features eventually trickle down to the desktop market, and I think it's inevitable that memory will be integrated on-die in the future... although probably the distant future.
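
As a back-of-envelope sketch (purely my own assumption that on-die memory capacity doubles roughly every two years, starting from Power7's 32MB around 2010), it takes on the order of a decade and a half before on-die memory reaches a desktop's worth of RAM:

```python
# Back-of-envelope projection. Assumption (mine, not from the article): on-die
# memory capacity doubles roughly every two years, starting at Power7's 32MB.
capacity_mb, year = 32, 2010
while capacity_mb < 4096:      # 4GB ~ a typical desktop's worth of RAM today
    capacity_mb *= 2
    year += 2
print(f"~{capacity_mb}MB on-die around {year}")   # -> ~4096MB around 2024
```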
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
OP: The reports say Jasper Forest integrates an IO Hub, but considering the "IO Hub" has a PCI Express controller, it's just like Lynnfield. The Jasper Forest wafer was the one that was confused with the Larrabee wafer, when in reality Intel had the Jasper Forest presentation right before Larrabee and Larrabee was censored.

I can't see the southbridge and other non-performance-essential hardware being integrated. As long as the desktop form factor exists, we will see the 2-chip configuration.

Memory and BIOS aren't going to be incorporated either.

Memory is a different story altogether. There are limits on the number of memory channels you can squeeze onto a processor. On-chip DRAM is said to be a requirement for processors with 16 cores or more.

1Gbit (128MB) of DRAM on 90nm takes 105mm² of die space. I'm thinking that by the Haswell (2012+) timeframe we'll see large-scale DRAM integration for most consumer processors. Whether they put it off-chip like Westmere's IGP, on-die, or go further to stacked DRAM by that time is unknown, but I think it's reasonable to expect probably 256MB-512MB to be integrated into CPUs. Eventually, a few years after that, we might see most of the RAM being integrated.
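
As a rough scaling sketch (assuming, optimistically, that DRAM cell area scales with the square of the process node, which it doesn't quite do in practice), the 105mm² figure shrinks fast enough by 22nm to make a few hundred MB plausible:

```python
# Rough scaling sketch from the 90nm figure above: 1Gbit (128MB) ~ 105mm^2.
# Assumption: area scales ideally with (node/90)^2; real DRAM scales worse,
# so treat these as optimistic numbers.
area_90nm = 105.0    # mm^2 for 128MB at 90nm
for node in (45, 32, 22):
    area_128mb = area_90nm * (node / 90.0) ** 2
    print(f"{node}nm: 128MB ~ {area_128mb:.0f}mm^2, 512MB ~ {4 * area_128mb:.0f}mm^2")
# 22nm works out to roughly 6mm^2 per 128MB, ~25mm^2 for 512MB
```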
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: Eeqmcsq
I was just thinking about what would happen if eventually the northbridge and/or southbridge chips get completely integrated into the CPU. The more I thought about it, the more I realized that it could be a compatibility nightmare if different CPUs contained different chipset features.

I'll use AMD as an example. Suppose AMD's Bulldozer completely eliminates the northbridge and southbridge chips and everything is integrated on the CPU. Suppose this Bulldozer supports 2 PATA channels for a max of 4 PATA devices. So you build your computer and you hook up all 4 PATA devices. So far so good.

Now, suppose AMD releases Bulldozer II, which drops 1 of the 2 PATA channels but is otherwise identical to Bulldozer I. As you look to upgrade your CPU, you suddenly realize that you CANNOT upgrade to Bulldozer II, because you would lose 2 of your 4 PATA devices. Now you're either stuck with Bulldozer I, or you have to get rid of 2 of your PATA devices. Had the PATA channels been part of a separate chipset, you could have kept your 4 PATA devices AND upgraded the CPU, as we do today.

Though the above was a crude example, do you think this is a possibility as more and more features get integrated into the CPU?

The need for expansion slots (PCI, PCI-e, etc.) will probably always be around. Expansion slots provide a "fail safe" if a manufacturer either forgets to integrate something a user wants or thinks integrating a component onto the CPU would be wasteful for most users.

In your scenario it's really no different from current motherboards, which only support a max of 2 IDE devices. What does a user do who wants to use more? They buy an expansion card.

For the short and medium term, I don't really see every single possible thing being integrated onto the CPU. Hence, they'll still need a way to interface with expansion slots. Otherwise, people would have to buy products as-is and replace them wholesale if they needed something more. I don't see this happening unless the products are extremely cheap.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: IntelUser2000
Memory is a different story altogether. There are limits on the number of memory channels you can squeeze onto a processor. On-chip DRAM is said to be a requirement for processors with 16 cores or more.

1Gbit (128MB) of DRAM on 90nm takes 105mm² of die space. I'm thinking that by the Haswell (2012+) timeframe we'll see large-scale DRAM integration for most consumer processors. Whether they put it off-chip like Westmere's IGP, on-die, or go further to stacked DRAM by that time is unknown, but I think it's reasonable to expect probably 256MB-512MB to be integrated into CPUs. Eventually, a few years after that, we might see most of the RAM being integrated.

Given the relatively low cost-adder of stacking dram, even with today's packaging technology, I'd be really really surprised if stacked dram isn't the norm whenever they get serious about sticking dram under the IHS (either MCM or monolithic) on consumer-grade devices in the coming 4yrs.

Not that stacking dram is some form of technology enabler in the capacity we are discussing; I'm just saying that stacking die will likely be the norm by the time the first products make it to our market segment. So when we are thinking about capacities and timelines for technology intersection, we should probably keep this in mind by adding an 8x-12x pre-multiplier to the potential capacity at any given point in time.

(note die stacking simply enables higher effective bit density, it does nothing to intrinsically reduce cost per bit over that of the bit cost of the discrete chips included in the package)

Power7's edram is interesting; it's embedded dram by name, but it isn't a dram product that you'd fabricate for standalone application and sales. It's kinda like making lemonade from lemons when you really really wanted orange juice, but both are citrus drinks and in a desert you'd settle for lemonade since it's better than drinking cactus juice.

Using the already-present SOI layer to form the capacitor in eDRAM makes a lot of sense if you've already got SOI, but it's not the trench-capacitor dram we are accustomed (whether we realize it or not) to thinking about when applying dram density and scaling trends from commodity discrete-chip market data.

If you think about it, the primary technological issue with Fusion-type products is the memory interface/bandwidth for the gpu once you shove it into that cpu socket. So if the timing works out so you could put 2-4GB of stacked dram under the IHS interfaced to the integrated GPU then things suddenly have the potential to get interesting again.
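
To put some rough numbers on the bandwidth gap (these are my own illustrative assumptions, not specs for any particular part): a dual-channel DDR3-1333 system interface versus a hypothetical mid-range discrete card with a 256-bit GDDR3 bus at ~1000MHz:

```python
# Rough bandwidth comparison; the bus widths and data rates below are my own
# illustrative assumptions, not official specs for any particular product.
def peak_gb_s(bus_bits, mt_per_s):
    """Peak theoretical bandwidth in GB/s for a bus width and transfer rate."""
    return bus_bits / 8 * mt_per_s * 1e6 / 1e9

shared_ram = peak_gb_s(128, 1333)   # two 64-bit DDR3-1333 channels
discrete   = peak_gb_s(256, 2000)   # 256-bit GDDR3 at ~1000MHz (2000 MT/s)
print(f"shared system RAM: ~{shared_ram:.0f} GB/s")      # ~21 GB/s
print(f"mid-range discrete card: ~{discrete:.0f} GB/s")  # ~64 GB/s
```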
 

Eureka

Diamond Member
Sep 6, 2005
3,822
1
81
Originally posted by: v8envy
System-on-a-chip will be reality sooner than you think. If not for desktops then for mobile devices where expandability is right up there with a 50kg battery on the priority list. All processor makers have embraced this vision.

Am I reading that right? 50kg battery? Did I miss something?

 

Forumpanda

Member
Apr 8, 2009
181
0
0
Ya, I couldn't figure out what typo would lead to 50kg either; the closest I got was that maybe he meant 500g, which IMO is going to be near the limit for a really mobile computer (something you would bring anywhere you would bring a book).

Originally posted by: Idontcare
If you think about it, the primary technological issue with Fusion-type products is the memory interface/bandwidth for the GPU once you shove it into that CPU socket. So if the timing works out so you could put 2-4GB of stacked dram under the IHS interfaced to the integrated GPU then things suddenly have the potential to get interesting again.
I agree; this will in my opinion be what makes or breaks a 'fusion'-style approach. If they are able to get enough GPU/CPU shared memory in there (likely 1GB+ needed by 2012), then we would suddenly see a drastic rise in the GPU capabilities of the lowest-end systems (assuming a Fusion GPU + memory will still be a cheaper approach than a motherboard GPU).
Which in turn might lead to more games (and programs using GPGPU) taking advantage of the much better baseline system performance.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Originally posted by: Forumpanda

I agree; this will in my opinion be what makes or breaks a 'fusion'-style approach. If they are able to get enough GPU/CPU shared memory in there (likely 1GB+ needed by 2012), then we would suddenly see a drastic rise in the GPU capabilities of the lowest-end systems (assuming a Fusion GPU + memory will still be a cheaper approach than a motherboard GPU).
Which in turn might lead to more games (and programs using GPGPU) taking advantage of the much better baseline system performance.

IMO, it won't be able to replace the FASTEST of the video cards, but it might be enough to cover mid-range, and by then even fewer people will use discrete graphics. Which means GPUs are on the way to being a co-processor, integrated like FPUs were a decade ago. I'm pretty sure Nvidia won't like that future.

I think the problem with the stacked DRAM approach is just like Hyperthreading. It may not cost much in terms of die size, but implementing it might be hard. Figuring out how to cool both of the dies and maintaining reliability will be a challenge, even figuring out what materials are needed.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: IntelUser2000
I think the problem with the stacked DRAM approach is just like Hyperthreading. It may not cost much in terms of die size, but implementing it might be hard. Figuring out how to cool both of the dies and maintaining reliability will be a challenge, even figuring out what materials are needed.

How is that problem any different from any other integration challenge that has had to be surmounted to bring products to the market? It's not like stacked dram is the first to pose these kinds of challenges; if there were no challenges then we'd have had the product in our computers 20 yrs ago.

At any rate, stacking chips is pretty much a solved issue; they've been doing it with wire-bond for a couple of years already. The only thing new under the sun with the linked Elpida dram article I gave above is that it is the first commercial demonstration of using TSV interconnects instead of wire-bond for chip stacking.

TSV will, among other things, dramatically improve thermal conduction throughout the chip stack. We use dummy vias and dummy metal-fill design rules all the time to combat CMP erosion/dishing as well as to manage thermal conductivity.

At any rate we are talking dram...how much of a challenge do you really think it will be to take one stick of dram (8 chips per rank) and stack those chips and then keep it cool? Even at today's clockspeeds we are only talking about adding another 3-6W under the IHS for an 8-chip stacked dram product.
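
Quick sanity check on that figure (assuming, very roughly, that a commodity DDR3 device draws somewhere around 0.4-0.75W when active; actual numbers vary by vendor and speed grade):

```python
# Sanity check on the 3-6W estimate. The assumed per-chip power is a rough
# figure, not a datasheet value; real DRAM power varies by vendor and speed.
chips = 8
watts_low, watts_high = 0.4, 0.75
print(f"{chips}-chip stack: ~{chips*watts_low:.1f}-{chips*watts_high:.1f}W under the IHS")
# -> ~3.2-6.0W, consistent with the estimate above
```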

And I'm not talking about stacking the dram onto the logic portions of the cpu (that would be silly of course, for the very reasons you highlight). I'm talking about stacking it only over the portion of silicon real-estate that the dram already occupies on the chip (if integrated monolithically), or sitting next to it as the integrated GPU/IMC/PCIe does in Clarkdale.

Die stacking of actual high-performance MPU's is a whole other cooling challenge though, that I'll agree with any day of the week.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I see your point. Ultimately, if you want most, if not all, of the system DRAM on-chip, it'll require true stacking: CPU core on top of memory logic, or vice versa.

Next few years will bring a radical change. :)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Yeah, presumably if you were going to chip-stack your dram and your cpu then you'd put the dram stacked to the bottom of the cpu silicon (i.e. as another layer between the cpu and the IHS), as they really need all the real-estate they can gain access to on the outermost metal levels of the cpu for power distribution and IO.

But MCM'ing an 8Gb chip stack next to the core would pose no serious technological challenge that isn't already faced/solved with the MCM'ed GPU/IMC/PCIe, and you'd have 1GB of nice high-speed dram sitting all that much closer to your cpu. Whether you decide to insert it into the memory hierarchy topology before the IMC or after it is up to the cpu designers, in determining just how much work they want to do on their side of the equation.

If they interface it on the cpu side of the IMC topology then it would/could act as an L4$ for pre-fetching data from dram through the IMC and then feeding the L3$, or if it is fast enough it could be used to simply replace the L3$ altogether, in similar fashion to how IBM is implementing their eDRAM on Power7 (only you'd have 1GB of the stuff, not a paltry 32MB).
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
Originally posted by: ImIcarus
If computer hardware makers (especially Intel right now) keep integrating parts of the motherboard into the CPU, is there a possibility that we may see the disappearance of motherboards in computers altogether?
Sure, why not...we've done it before.

The motherboard can't be completely eliminated as long as you still need a bus to the peripherals, but it could be reduced to a backplane. All the former "motherboard components" could fit on a single card. It would need to have CPU socket(s), a BIOS chip, and DIMM slots at a minimum. Everything else might be on-chip.




 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
FWIW while I was researching something else I came across this article by Goto-san regarding die-stacking dram and cpu.

Original link in Japanese

Google translated to English

Notice this slide from Intel IDF last year, kinda funny they use the term "feed the beast".

There are even IDF slides detailing the various trade-offs of dram die stacked on top or on the bottom of the cpu. Looks like we are about a year late in our conversation here :laugh:

http://pc.watch.impress.co.jp/...008/1226/kaigai_8l.gif
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: ImIcarus
I was reading up on Intel's Jasper Forest recently, and I had a thought. More of a question actually.

If computer hardware makers (especially Intel right now) keep integrating parts of the motherboard into the CPU, is there a possibility that we may see the disappearance of motherboards in computers altogether?

A CPU used to be different components connected via multiple cards... you had the math co-processor, floating point units, etc.
Now it is one thing, a CPU...
The GPU is integrating into the CPU; the only things remaining are the southbridge, power, and RAM.
It is rather unlikely those things will integrate into the CPU.

One of the big issues with integration is that there needs to be a benefit. It is beneficial to integrate the RAM controller: it decreases latency. That makes it a better use of transistors than, say, more floating point units, or another core.

But how does integrating the power infrastructure onto the CPU help? Do we really need lower latency for USB? etc...

We will certainly see more things integrated into the CPU... but not ALL things.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: taltamir
Originally posted by: ImIcarus
I was reading up on Intel's Jasper Forest recently, and I had a thought. More of a question actually.

If computer hardware makers (especially Intel right now) keep integrating parts of the motherboard into the CPU, is there a possibility that we may see the disappearance of motherboards in computers altogether?

A CPU used to be different components connected via multiple cards... you had the math co-processor, floating point units, etc.
Now it is one thing, a CPU...
The GPU is integrating into the CPU; the only things remaining are the southbridge, power, and RAM.
It is rather unlikely those things will integrate into the CPU.

One of the big issues with integration is that there needs to be a benefit. It is beneficial to integrate the RAM controller: it decreases latency. That makes it a better use of transistors than, say, more floating point units, or another core.

But how does integrating the power infrastructure onto the CPU help? Do we really need lower latency for USB? etc...

We will certainly see more things integrated into the CPU... but not ALL things.

To state it differently, the drive for integration has always been motivated by product differentiation.

When there is competition there are two things to drive product differentiation - performance and cost.

The creation of the x86-based CPU was driven by cost, not performance, and it was an integer-only processor.

The inclusion of the FPU, and ISA expansions like MMX, SSE, 3DNow, etc., were all performance-enhancing efforts intended to improve profitability and product differentiation.

As for including things like power and USB IO, true, there are little to no practical performance benefits to be derived (and exploited for product differentiation purposes) in doing so, but presumably if there is a cost-reduction opportunity available by way of higher degrees of integration (via Moore's law and volume production) then the motivation to pursue such integration will still be present.

After all, look at how much of a drag on sales the i7 took compared to the PhII with the price of those X58 mobos. That turned out to be a real advantage in AMD's favor.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Ah, but as I pointed out... there is also RELATIVE cost. Every transistor used for USB is a transistor not being used for other, potentially more expensive things.

Moore's law should be called "Moore's observation", or more accurately "Moore's manufacturing/marketing policy"... I think I recall Intel actually admitting to having held back before to maintain Moore's law, aka they refuse to release something 2x faster if it hasn't been at least 18 months.

If you do double transistors, you COULD cram the entire southbridge onto the same chip as the CPU... or you could add another CPU core. Which one gives you better cost and performance is the question. Integrating the memory controller was certainly worth it, as AMD proved. But originally Intel did not believe it would be.