Holy Lord... Intel's Larrabee (disclaimer: INQ link)

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Not convinced at this point... I doubt the GPU is going anywhere within the next 5 years. There will be a transition phase that probably lasts 3-5 years minimum. I guess I don't see high-end discrete graphics cards going away anytime soon.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
* Update
We originally stated that Intel had briefed VRZone, but it transpires that wasn't the case. Apologies.


Doesn't this invalidate their whole premise?
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Discrete GPUs aren't going anywhere before 2010.

Performance of CGPUs just won't be on par with Nvidia's offerings for a long time.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
I don't see how a general-purpose processor can possibly beat a specialized processor. After all, there isn't anything in a general-purpose processor that can't be in a special-purpose processor, but there are things in a special-purpose processor that aren't in a general-purpose processor.

If you want to see how bad a general-purpose processor is at rendering, install the MS DirectX SDK, go into the control panel, set the default renderer to the software renderer, and run your favorite DirectX game. Prepare to enjoy a luxurious framerate of 0.1 FPS.
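If you'd rather force it from code instead of the control panel, here's a minimal sketch (Direct3D 9; error handling omitted, and CreateRefDevice is just a name for this example):

[CODE]
// Minimal D3D9 setup that forces the reference (software) rasterizer.
// Needs d3d9.h / d3d9.lib from the DirectX SDK.
#include <windows.h>
#include <d3d9.h>

IDirect3DDevice9* CreateRefDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed   = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    IDirect3DDevice9* device = NULL;
    // D3DDEVTYPE_REF runs the whole pipeline on the CPU; this is
    // where the 0.1 FPS comes from.
    d3d->CreateDevice(D3DADAPTER_DEFAULT,
                      D3DDEVTYPE_REF,   // software reference device
                      hwnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                      &pp,
                      &device);
    return device;
}
[/CODE]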

If there's any truth to these rumors, I think it's more likely that Intel is simply copying the idea behind Sun's Niagara.
 

Vinnybcfc

Senior member
Nov 9, 2005
216
0
0
Orly?

Intel can't code graphics drivers if the developers' lives depended on it.

So they could have something twice the speed of an Nvidia or ATI card and it would still be worse.
 

hellokeith

Golden Member
Nov 12, 2004
1,664
0
0
The GPU guys have been saying for a while now that the programmability of each successive generation of GPUs is turning them into more general-purpose processors, i.e. CPUs. I'd like to see motherboards take over the connector duties, while processing/video/audio/physics/AI/storage/networking/etc. are just plop-in-chip upgrades.
 

thilanliyan

Lifer
Jun 21, 2005
12,052
2,271
126
Originally posted by: hellokeith
I'd like to see motherboards take over the connector duties, while processing/video/audio/physics/AI/storage/networking/etc. are just plop-in-chip upgrades.

This is what I'd like to see as well. Just replacing the GPU chip would be fantastic.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
CGPUs will always be catered to the mainstream, IMO, but that will be enough to hurt nVidia in the future. If Intel and AMD start pumping out midrange CGPU chips, the average Joe will have no reason to go out and buy a discrete graphics chip, and nVidia will be hurting badly.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
True to a point, but they already do that with integrated video, and that's all this will be able to handle for the foreseeable future. It's just faster integrated video that Intel gets paid for instead of the chipset manufacturer. The memory bandwidth needs of 3D applications pretty much preclude integrated CPU/GPU tech from being useful for even mid-range gaming.
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Originally posted by: zephyrprime
I don't see how a general-purpose processor can possibly beat a specialized processor. After all, there isn't anything in a general-purpose processor that can't be in a special-purpose processor, but there are things in a special-purpose processor that aren't in a general-purpose processor.

If you want to see how bad a general-purpose processor is at rendering, install the MS DirectX SDK, go into the control panel, set the default renderer to the software renderer, and run your favorite DirectX game. Prepare to enjoy a luxurious framerate of 0.1 FPS.

If there's any truth to these rumors, I think it's more likely that Intel is simply copying the idea behind Sun's Niagara.

Not exactly the same as what Intel is proposing. Using the REF today means one processor running everything (the app, the OS, the DX runtime, etc.) on one or two cores. If you throw enough cores dedicated purely to graphics at the problem, it could get competitive.
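To be clear, that's just the general idea, not Intel's actual design. A toy sketch of scanline-parallel software rendering (shade_pixel is a made-up stand-in for a real rasterizer):

[CODE]
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <thread>
#include <vector>

// Made-up per-pixel routine standing in for a real software rasterizer.
uint32_t shade_pixel(int x, int y) { return uint32_t(x ^ y); }

// Render by handing each hardware thread an interleaved set of scanlines.
// Rows are disjoint, so workers never touch the same pixel and no locking
// is needed.
void render(std::vector<uint32_t>& fb, int width, int height)
{
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        workers.emplace_back([&fb, width, height, t, n] {
            for (int y = int(t); y < height; y += int(n))
                for (int x = 0; x < width; ++x)
                    fb[std::size_t(y) * width + x] = shade_pixel(x, y);
        });
    }
    for (auto& w : workers) w.join();
}

int main()
{
    const int w = 640, h = 480;
    std::vector<uint32_t> framebuffer(std::size_t(w) * h);
    render(framebuffer, w, h);
}
[/CODE]

The point isn't that this is fast today; it's that pixel throughput scales almost linearly with core count, which is presumably the bet Intel is making.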

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: zephyrprime
I don't see how a general-purpose processor can possibly beat a specialized processor. After all, there isn't anything in a general-purpose processor that can't be in a special-purpose processor, but there are things in a special-purpose processor that aren't in a general-purpose processor.

If you want to see how bad a general-purpose processor is at rendering, install the MS DirectX SDK, go into the control panel, set the default renderer to the software renderer, and run your favorite DirectX game. Prepare to enjoy a luxurious framerate of 0.1 FPS.

If there's any truth to these rumors, I think it's more likely that Intel is simply copying the idea behind Sun's Niagara.

Probably much the same way people see unified shader pipelines as better than specialized (traditional) pipelines.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Shadowmage
This is real. Some Intel folks gave a presentation about this.

Any links to a presentation? Or is that not possible... :(
 

hellokeith

Golden Member
Nov 12, 2004
1,664
0
0
It will impact Nvidia the same way that integrated "HD" audio impacts Creative, which is to say it may take away the entry level and mainstream, but not the enthusiast or high end.

edit: If Nvidia sees the writing on the wall, it will get into an IP-licensing agreement with Intel so that Intel does not have to reinvent the wheel for "advanced" integrated graphics.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Integrating it with the CPU is all well and good, but where exactly are you going to put 768 MB of fast VRAM like the 8800 GTX has? Solder it onto the motherboard?

At the end of the day it'll still be sharing system RAM so it'll be slow as hell.
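Just to put numbers on it (back-of-the-envelope, assuming dual-channel DDR2-800 as the mainstream desktop baseline):

[CODE]
8800 GTX:            384-bit bus = 48 B/transfer x 1.8 GT/s (GDDR3) = 86.4 GB/s
DDR2-800, dual chan: 128-bit bus = 16 B/transfer x 0.8 GT/s         = 12.8 GB/s
[/CODE]

And the CPU is eating out of that 12.8 GB/s at the same time.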
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: BFG10K
Integrating it with the CPU is all well and good, but where exactly are you going to put 768 MB of fast VRAM like the 8800 GTX has? Solder it onto the motherboard?

At the end of the day it'll still be sharing system RAM so it'll be slow as hell.

But if you hadn't noticed, system RAM is getting pretty damn fast these days. By 2008, DDR3 or maybe even DDR4 will be mainstream. And remember, it doesn't necessarily have to be faster; it may suffice for it to just be wider.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
But if you hadn't noticed, system RAM is getting pretty damn fast these days.
Fast compared to a seven-year-old GF2 GTS, not fast compared to something like an 8800 GTX.

Also, regardless of how fast it is, it'll always be shared with the CPU, so even if it had the same bandwidth as an 8800 GTX it would still be slower.

By 2008, DDR3 or maybe even DDR4 will be mainstream.
Sure, and by then GPUs will be using DDR6 or possibly even eDRAM. They'll also have 2 GB or even 4 GB totally dedicated to them.

Remember, this sort of sharing is already done on consoles, and often a PC GPU alone has more memory than an entire console.

And remember, it doesn't necessarily have to be faster; it may suffice for it to just be wider.
You want to put a 384/512-bit memory controller onto the motherboard? And where exactly are those costs going to go? Against the motherboard, that's where. I'm sure OEMs will be happy about that. :roll:

The only place I see this integration working is in the low end, where it'll replace the likes of GMA because it'll be cheaper to put the GPU on the same die as the CPU.

So in the end this'll be nothing more than a cost-cutting measure that'll make bottom-end PCs a bit cheaper.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: BFG10K
But if you hadn't noticed, system RAM is getting pretty damn fast these days.
Fast compared to a seven-year-old GF2 GTS, not fast compared to something like an 8800 GTX.

Also, regardless of how fast it is, it'll always be shared with the CPU, so even if it had the same bandwidth as an 8800 GTX it would still be slower.

By 2008, DDR3 or maybe even DDR4 will be mainstream.
Sure, and by then GPUs will be using DDR6 or possibly even eDRAM. They'll also have 2 GB or even 4 GB totally dedicated to them.

Remember, this sort of sharing is already done on consoles, and often a PC GPU alone has more memory than an entire console.

And remember, it doesn't necessarily have to be faster; it may suffice for it to just be wider.
You want to put a 384/512-bit memory controller onto the motherboard? And where exactly are those costs going to go? Against the motherboard, that's where. I'm sure OEMs will be happy about that. :roll:

The only place I see this integration working is in the low end, where it'll replace the likes of GMA because it'll be cheaper to put the GPU on the same die as the CPU.

So in the end this'll be nothing more than a cost-cutting measure that'll make bottom-end PCs a bit cheaper.

Wow! No semi company would ever hire a pessimist like you... ;) They need visionaries, not someone shooting holes in the boat hull. Anyway, we're both really only speculating.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Yeah, but the RAM slots would need to be 512-bit, not to mention the RAM itself.

Such RAM is expensive enough when it's soldered onto a GPU; imagine how much a slot-based solution would cost.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: BFG10K
Yeah, but the RAM slots would need to be 512-bit, not to mention the RAM itself.

Such RAM is expensive enough when it's soldered onto a GPU; imagine how much a slot-based solution would cost.

So they will find a way to make it cost-effective. Look, it's the way things seem to be going, so there are quite literally thousands of engineers between Intel, AMD, and Nvidia working on making something like that happen. Cost be damned, they will do it, and it might be terribly expensive or moderately expensive. People will still eat it up, retail and OEMs alike.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,991
32,413
146
Could this be better suited for some form of new consumer electronics device, as opposed to the PC?