Mass Integration of the CPU?


MODEL3

Senior member
Jul 22, 2009
528
0
0
As long as there is a PC games market (even with the current market penetration and console-cycle/ports strategy), the need for performance/enthusiast GPUs will be there.

So the power/die requirements will prohibit integration for these performance/enthusiast GPUs.
Certainly I think a CPU IGP (with integrated DRAM) will be able to offer an alternative to mainstream-level GPUs (like today's IGPs offer an alternative to entry-level/value GPUs).

DRAM integration is a route GPUs will take as well, not only CPUs, imo.
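A back-of-the-envelope way to see why on-package DRAM would matter for an IGP (my own illustrative numbers, not vendor specs): peak memory bandwidth is just bus width times transfer rate, and sharing system memory with the CPU is today's IGP's main handicap.

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8 bytes) * transfer rate in GT/s.
def peak_bandwidth_gbs(bus_bits: int, gigatransfers: float) -> float:
    return bus_bits / 8 * gigatransfers

# Illustrative 2009-era figures (assumptions, not official specs):
shared_ddr3 = peak_bandwidth_gbs(128, 1.333)  # dual-channel DDR3-1333 an IGP shares with the CPU
gddr5_card = peak_bandwidth_gbs(256, 4.0)     # 256-bit GDDR5 on a performance card

print(f"shared DDR3: {shared_ddr3:.1f} GB/s, GDDR5 card: {gddr5_card:.1f} GB/s")
```

Dedicated on-package DRAM wouldn't need to match the big card; closing even part of that gap is what would push an IGP from value territory toward mainstream.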
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Originally posted by: MODEL3
As long as there is a PC games market (even with the current market penetration and console-cycle/ports strategy), the need for performance/enthusiast GPUs will be there.

So the power/die requirements will prohibit integration for these performance/enthusiast GPUs.
Certainly I think a CPU IGP (with integrated DRAM) will be able to offer an alternative to mainstream-level GPUs (like today's IGPs offer an alternative to entry-level/value GPUs).

DRAM integration is a route GPUs will take as well, not only CPUs, imo.

The biggest problem with that is whether GPU-only guys like Nvidia will survive such an onslaught. They would be selling only the highest-end GPUs, which cost significantly more. Nvidia must realize this. So must the rest of the people relying on them.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: MODEL3
As long as there is a PC games market (even with the current market penetration and console-cycle/ports strategy), the need for performance/enthusiast GPUs will be there.

So the power/die requirements will prohibit integration for these performance/enthusiast GPUs.
Certainly I think a CPU IGP (with integrated DRAM) will be able to offer an alternative to mainstream-level GPUs (like today's IGPs offer an alternative to entry-level/value GPUs).

DRAM integration is a route GPUs will take as well, not only CPUs, imo.

There is a finite amount of graphical improvement that can be made. We already have photorealistic rendering in Hollywood, and near-photorealistic rendering in video games. There is not all that much improvement needed to obsolete the VIDEO card.

This is why CUDA, OpenCL, and DX11 all focus on the compute part of the equation, specifically on physics, with PhysX and the DX11 hardware physics...
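The kind of workload being pointed at here can be sketched in a few lines. This toy integrator (my own illustration, not PhysX or DX11 code) shows why per-particle physics is such a natural fit for GPU-style compute:

```python
# Minimal sketch of why physics maps well to GPU compute: every particle's
# update is independent of the others, so this loop is trivially
# data-parallel (think one GPU thread per particle).
def step(positions, velocities, gravity=-9.8, dt=0.01):
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy = vy + gravity * dt                      # semi-implicit Euler: velocity first...
        new_vel.append((vx, vy))
        new_pos.append((x + vx * dt, y + vy * dt))  # ...then position from the new velocity
    return new_pos, new_vel

# Drop one particle from y = 10 m with 1 m/s of sideways motion:
pos, vel = [(0.0, 10.0)], [(1.0, 0.0)]
for _ in range(100):                                # one simulated second at 10 ms steps
    pos, vel = step(pos, vel)
print(pos[0])
```

Scale that loop to a hundred thousand debris particles or cloth nodes and the appeal of offloading it to hundreds of shader cores is obvious.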
 

lopri

Elite Member
Jul 27, 2002
13,327
708
126
Wouldn't platformization be more attractive to the big boys before the time for SoC comes? I fear the day the PC becomes like the consoles.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Even when the full "SoC" days come, I feel there will still be non-SoC high-performance parts. As long as the high-end desktop sector survives, integrating everything non-performance-essential, like the southbridge, is a waste of space that could have been used for more CPU performance.

Pretty sure we'll see more of the multi-die approach, like on Nehalem, in the future: SoC for the low-power and cheap markets, non-SoC for high-performance CPUs. Even if the traditional desktop sector dies off, server CPUs will still be required.

(Some people might remember this post from the other thread. I originally meant to post it here; I don't know how I made that mistake. Sorry.)
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: IntelUser2000
Originally posted by: MODEL3
As long as there is a PC games market (even with the current market penetration and console-cycle/ports strategy), the need for performance/enthusiast GPUs will be there.

So the power/die requirements will prohibit integration for these performance/enthusiast GPUs.
Certainly I think a CPU IGP (with integrated DRAM) will be able to offer an alternative to mainstream-level GPUs (like today's IGPs offer an alternative to entry-level/value GPUs).

DRAM integration is a route GPUs will take as well, not only CPUs, imo.

The biggest problem with that is whether GPU-only guys like Nvidia will survive such an onslaught. They would be selling only the highest-end GPUs, which cost significantly more. Nvidia must realize this. So must the rest of the people relying on them.

Yep, I agree (my point of view was more about the technical side than the commercial side of DRAM integration...).
I suppose Nvidia realized this before you and me?
The thing is, they made some strategic mistakes.
Tough times ahead for NV (but I think NV has a detailed analysis/forecast of the upcoming situation through 2015).


 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: taltamir
Originally posted by: MODEL3
As long as there is a PC games market (even with the current market penetration and console-cycle/ports strategy), the need for performance/enthusiast GPUs will be there.

So the power/die requirements will prohibit integration for these performance/enthusiast GPUs.
Certainly I think a CPU IGP (with integrated DRAM) will be able to offer an alternative to mainstream-level GPUs (like today's IGPs offer an alternative to entry-level/value GPUs).

DRAM integration is a route GPUs will take as well, not only CPUs, imo.

There is a finite amount of graphical improvement that can be made. We already have photorealistic rendering in Hollywood, and near-photorealistic rendering in video games. There is not all that much improvement needed to obsolete the VIDEO card.

This is why CUDA, OpenCL, and DX11 all focus on the compute part of the equation, specifically on physics, with PhysX and the DX11 hardware physics...

Yep, that's the way I see it too (long term...).
That's why I said as long as there is a PC games market (like the current model, with the same kind of market penetration as today).
I don't think CUDA, OpenCL, and DX11 (talking about current versions) will change the current model much (why people buy video cards...); it's just the first step...
It will take a while...
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: taltamir
Originally posted by: MODEL3
As long as there is a PC games market (even with the current market penetration and console-cycle/ports strategy), the need for performance/enthusiast GPUs will be there.

So the power/die requirements will prohibit integration for these performance/enthusiast GPUs.
Certainly I think a CPU IGP (with integrated DRAM) will be able to offer an alternative to mainstream-level GPUs (like today's IGPs offer an alternative to entry-level/value GPUs).

DRAM integration is a route GPUs will take as well, not only CPUs, imo.

There is a finite amount of graphical improvement that can be made. We already have photorealistic rendering in Hollywood, and near-photorealistic rendering in video games. There is not all that much improvement needed to obsolete the VIDEO card.

This is why CUDA, OpenCL, and DX11 all focus on the compute part of the equation, specifically on physics, with PhysX and the DX11 hardware physics...

There is one more metric needed for Hollywood... REAL-TIME photorealism with complete ray tracing of unlimited light sources.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Viditor
Originally posted by: taltamir
Originally posted by: MODEL3
As long as there is a PC games market (even with the current market penetration and console-cycle/ports strategy), the need for performance/enthusiast GPUs will be there.

So the power/die requirements will prohibit integration for these performance/enthusiast GPUs.
Certainly I think a CPU IGP (with integrated DRAM) will be able to offer an alternative to mainstream-level GPUs (like today's IGPs offer an alternative to entry-level/value GPUs).

DRAM integration is a route GPUs will take as well, not only CPUs, imo.

There is a finite amount of graphical improvement that can be made. We already have photorealistic rendering in Hollywood, and near-photorealistic rendering in video games. There is not all that much improvement needed to obsolete the VIDEO card.

This is why CUDA, OpenCL, and DX11 all focus on the compute part of the equation, specifically on physics, with PhysX and the DX11 hardware physics...

There is one more metric needed for Hollywood... REAL-TIME photorealism with complete ray tracing of unlimited light sources.

True 3D is coming, not this 3D projected onto a 2D surface stuff we play with now.

(holograms are kinda an over-used descriptor, call it what you like, the prototype stuff I saw in action a couple years ago was kuh-razy awesome)

That will increase the computing demands by another exponent all over again.
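The "another exponent" point is easy to make concrete (the resolutions below are illustrative, not from any product): a 2D frame costs width × height samples, while a true volumetric frame costs width × height × depth.

```python
# 2D frame: width * height samples. Volumetric "true 3D" frame adds a depth axis.
w, h = 1920, 1080
pixels_2d = w * h           # samples per flat frame
voxels_3d = w * h * 1080    # assume depth resolution equal to the vertical resolution

print(voxels_3d // pixels_2d)  # -> 1080: three orders of magnitude more samples per frame
```

And that factor applies before you raise per-sample quality at all, which is why display tech alone can reset the hardware treadmill.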
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
The problem with integrating the GPU into the CPU is that someone will always want to replace the integrated GPU with a better one. So what about turning the idea on its head and integrating an additional CPU core or two into each high-end GPU?

Instead of a "5870x2" card, you might have a "5870 + Phenom II X2 550" card. (Or whatever, you get the general idea...)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: MODEL3
Originally posted by: taltamir
Originally posted by: MODEL3
As long as there is a PC games market (even with the current market penetration and console-cycle/ports strategy), the need for performance/enthusiast GPUs will be there.

So the power/die requirements will prohibit integration for these performance/enthusiast GPUs.
Certainly I think a CPU IGP (with integrated DRAM) will be able to offer an alternative to mainstream-level GPUs (like today's IGPs offer an alternative to entry-level/value GPUs).

DRAM integration is a route GPUs will take as well, not only CPUs, imo.

There is a finite amount of graphical improvement that can be made. We already have photorealistic rendering in Hollywood, and near-photorealistic rendering in video games. There is not all that much improvement needed to obsolete the VIDEO card.

This is why CUDA, OpenCL, and DX11 all focus on the compute part of the equation, specifically on physics, with PhysX and the DX11 hardware physics...

Yep, that's the way I see it too (long term...).
That's why I said as long as there is a PC games market (like the current model, with the same kind of market penetration as today).
I don't think CUDA, OpenCL, and DX11 (talking about current versions) will change the current model much (why people buy video cards...); it's just the first step...
It will take a while...

Yea, I am also talking about "the next few decades".

There is one more metric needed for Hollywood... REAL-TIME photorealism with complete ray tracing of unlimited light sources.
An infinite number of light sources will take an infinite amount of time to calculate. It just needs to be good enough that a human viewer can't tell the difference.
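For what it's worth, the cost argument can be sketched directly. With classic direct lighting, every light contributes a shading term (and, in a full ray tracer, a shadow ray) per surface point, so work grows linearly with the light count. A toy Lambert shader, my own illustration:

```python
import math

# Direct lighting at one surface point: one shading term per light, so the
# loop body runs once per light -- cost is linear in the number of sources.
def lambert(point, normal, lights):
    total = 0.0
    for lx, ly, lz, intensity in lights:            # one pass per light source
        dx, dy, dz = lx - point[0], ly - point[1], lz - point[2]
        dist = math.sqrt(dx*dx + dy*dy + dz*dz)
        cos_t = max(0.0, (dx*normal[0] + dy*normal[1] + dz*normal[2]) / dist)
        total += intensity * cos_t / (dist * dist)  # inverse-square falloff
    return total

# Doubling the lights doubles the shading work (and, here, the result):
one = lambert((0, 0, 0), (0, 0, 1), [(0, 0, 2, 1.0)])
two = lambert((0, 0, 0), (0, 0, 1), [(0, 0, 2, 1.0)] * 2)
print(one, two)
```

Which is exactly why production renderers sample a subset of lights stochastically rather than evaluating all of them: "good enough that a human can't tell" is the real target.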
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: deimos3428
The problem with integrating the GPU into the CPU is that someone will always want to replace the integrated GPU with a better one. So what about turning the idea on its head and integrating an additional CPU core or two into each high-end GPU?

Instead of a "5870x2" card, you might have a "5870 + Phenom II X2 550" card. (Or whatever, you get the general idea...)

I don't know about that. We don't feel the same "can't upgrade one without upgrading it all" issues when it comes to the FPU, which was once upon a time a separate IC on your mobo.

Also, the same issue doesn't affect game consoles, so it might not be such an issue with fusion-type products in the PC market.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Why would I want to replace the integrated GPU with a "better one" if the integrated GPU gives me photo-realistic rendering?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Well, in a way the integrated GPU is upgradeable by changing the CPU. Far better than replacing the entire motherboard, as before. Of course you pay the CPU price, but who says you can't sell the old CPU on Craigslist/eBay. :)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: IntelUser2000
Well, in a way the integrated GPU is upgradeable by changing the CPU. Far better than replacing the entire motherboard, as before. Of course you pay the CPU price, but who says you can't sell the old CPU on Craigslist/eBay. :)

true... while selling an old video card is a PITA and selling an old mobo an absolute nightmare.

CPU = +Small + light + durable + expensive = excellent sell
Ram = +Small + light + durable - Cheap = ok sell
GPU = -Large - Heavy +/- so so durability + expensive = PITA but can be profitable
Mobo = -huge - heavy -poor durability -tends to have problems -cheap = not worth selling

Of course, if old enough none of these are worth selling.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Idontcare
FWIW, while I was researching something else I came across this article by Goto-san regarding die-stacking DRAM and the CPU.

Original link in Japanese

Google translated to English

Notice this slide from Intel IDF last year, kinda funny they use the term "feed the beast".

There are even IDF slides detailing the various trade-offs of DRAM die-stacked on top of or beneath the CPU. Looks like we are about a year late in our conversation here :laugh:

http://pc.watch.impress.co.jp/...008/1226/kaigai_8l.gif


IMEC tapes out Etna 3-D stacking chip

The IMEC research center (Leuven, Belgium) and its 3-D integration partners have taped out Etna, a 3-D chip that integrates a standard DRAM chip on top of a logic IC. The 3-D stack consists of a 25-micron-thick logic die on top of which a commercial DRAM is stacked using through-silicon vias (TSVs) and micro-bumps.

Partners in IMEC's 3-D integration include Amkor, Infineon, Intel, Micron, NEC, NXP, Panasonic, Qualcomm, Samsung, STMicroelectronics, Texas Instruments and TSMC.

The 3-D demonstrator includes heaters to test the impact of hotspots on DRAM refresh times. The chip also contains test structures for monitoring thermo-mechanical stress, electrostatic-discharge hazards, the electrical characteristics of TSVs and micro-bumps, and fault models for TSVs.

http://www.eetimes.com/news/se...l;?articleID=220300650

(note-to-self, search keyword: 3D stacking)
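The hotspot-vs-refresh concern in the IMEC piece can be ballparked. Assuming the common JEDEC convention of a 64 ms refresh window that halves to 32 ms above 85 °C (the tRFC and row count below are illustrative, not a real part's datasheet):

```python
# Rough model of DRAM refresh overhead vs temperature for a stacked die.
# Assumption: 8192 refresh commands per window; 64 ms window at normal temps,
# halved to 32 ms in the extended range above 85 C. tRFC here is illustrative.
def refresh_overhead(temp_c, trfc_ns=160.0, rows=8192):
    window_ms = 64.0 if temp_c <= 85 else 32.0
    busy_ns = rows * trfc_ns              # time the DRAM spends refreshing per window
    return busy_ns / (window_ms * 1e6)    # fraction of time unavailable for accesses

cool = refresh_overhead(60)   # roughly 2% of time lost to refresh
hot = refresh_overhead(95)    # roughly 4%: a CPU hotspot underneath doubles the cost
print(f"{cool:.3%} {hot:.3%}")
```

A few percent of lost bandwidth isn't fatal, which suggests the harder problems for CPU-on-DRAM stacks are the ones IMEC's stress and ESD test structures target, not refresh itself.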