Nvidia at work on combined CPU with graphics - On 65nm in 2008

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
See my 2/10 post . . . intel's *Larrabee* is coming :p
===================
see my 10/20 post . . . MS too :p
http://www.nytimes.com/2006/10/19/techn...t.html?_r=1&ref=technology&oref=slogin
___________________________________

i found this interesting:

Nvidia at work on combined CPU with graphics...On 65nm in 2008
Nvidia's . . . acquir[ed] some folk from Stexar, a company that was known for its X86 marchitectural expertise.

And now we hear that development is underway at Nvidia's just-announced Portland, Oregon, Design Center, where chip folk are beavering away on 45 nanometre designs.

The project should bear fruit sometime in 2008, as Nvidia prepares plans to compete with Intel and AMD on the blended graphic and CPU concept.

This is what OEMs want in 2008. Sixty-five or 45 nanometre processes make this possible and AMD and Intel are going to do it, so Nvidia doesn't have much choice.
:Q

 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Good thinking on posting it here, wonder how you came up with that idea? ;)

The truth of the matter is, this gets interesting real fast with 3 players in the graphics and CPU market.
 

Hyperlite

Diamond Member
May 25, 2004
5,664
2
76
Originally posted by: apoppin


The project should bear fruit sometime in 2008, as Nvidia prepares plans to compete with Intel and AMD on the blended graphic and CPU concept.
:Q


so NV is planning to compete with intel? hmm....does not compute...

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: PC Surgeon
Good thinking on posting it here, wonder how you came up with that idea? ;)

The truth of the matter is, this gets interesting real fast with 3 players in the graphics and CPU market.
some brilliant poster in Video suggested it

thanks

it also should have its own thread in video ;)

this IS big news and a look at the future . . . nvidia is a very smart corporation and will do whatever it takes to survive and prosper.

i don't spend a lot of time here . . . unfortunately for me [probably good for you guys]
[i am waiting to upgrade *everything* - next year ... i'll be back . . . humble and beggin'] :p
 

The-Noid

Diamond Member
Nov 16, 2005
3,117
0
76
Nvidia is doing this because they want to get bought by Intel. Although GPUs are much better than CPUs at doing repetitive tasks, they need a lot more logic to do load balancing and to compute different ordinal and rational behaviors. If they do make a processor, it won't be for a high-end system. This is meant to be a low-end computer with integrated graphics, probably a CPU/video solution that is soldered directly to the motherboard.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Yoxxy
Nvidia is doing this because they want to get bought by Intel. Although GPUs are much better than CPUs at doing repetitive tasks, they need a lot more logic to do load balancing and to compute different ordinal and rational behaviors. If they do make a processor, it won't be for a high-end system. This is meant to be a low-end computer with integrated graphics, probably a CPU/video solution that is soldered directly to the motherboard.

of course . . . in the beginning . . . everything starts 'simple' and in this case to reduce costs. nvidia is 'following' . . . but it's a very aggressive move .... and i don't think it is to be 'acquired'. i think they would hate that and not work well in a 'takeover' situation . . . and it would probably never happen under their current CEO.

however . . . it appears to be the 'future' . . . it is speculated that AMD acquired ATi to do just this: merge the CPU/GPU and specialize the platform toward specifics.
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
I think with quad core or above, the CPU can be tasked to do tons of GPU or physics jobs in a game.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: apoppin
Originally posted by: Yoxxy
Nvidia is doing this because they want to get bought by Intel. Although GPUs are much better than CPUs at doing repetitive tasks, they need a lot more logic to do load balancing and to compute different ordinal and rational behaviors. If they do make a processor, it won't be for a high-end system. This is meant to be a low-end computer with integrated graphics, probably a CPU/video solution that is soldered directly to the motherboard.

of course . . . in the beginning . . . everything starts 'simple' and in this case to reduce costs. nvidia is 'following' . . . but it's a very aggressive move .... and i don't think it is to be 'acquired'. i think they would hate that and not work well in a 'takeover' situation . . . and it would probably never happen under their current CEO.

however . . . it appears to be the 'future' . . . it is speculated that AMD acquired ATi to do just this: merge the CPU/GPU and specialize the platform toward specifics.

I agree with apoppin here. NVIDIA is a leader not a follower, and quite honestly, they rock at everything they do. A few years ago, people were saying the same about NV's move into the chipset game. While nForce1 wasn't all that great, nForce eventually became the hands down best AMD chipset (since nForce2), and now they make chipsets that rival Intel's own.

Another thing to consider: NVIDIA may have a lot of work to do on the CPU side, but Intel has a whole lot of work to do on the GPU side. AMD/ATI may take the initial lead on this, but they are behind NV and Intel with regards to chipsets.
 

The-Noid

Diamond Member
Nov 16, 2005
3,117
0
76
NV chipsets rival Intel? There is a reason why the 590 was never released. I had a Lan Party 590 ES that did 300 FSB max. Bit ridiculous. Look at the 975 (450 is easy) and the 965 chipset (500+ is easy). nForce is a good AMD chipset, but we will see what happens when the 680i comes to retail in 2 weeks. Until then NV has a lot to prove when it comes to chipsets.

As for Nvidia being good at everything they do: look at mobile video accelerators, where ATI holds 85% of a business that NV has desperately tried to win over, but they never seem to be able to compete in something that is outside their realm.

Listening to the Inq too, this could either all be made up or end up being a CPU for a POS device or some such. We will see...
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: nyker96
I think with quad core or above, the CPU can be tasked to do tons of GPU or physics jobs in a game.

Not with general purpose cores, hence why Intel and AMD are exploring integrating GPU functionality.
 

Hyperlite

Diamond Member
May 25, 2004
5,664
2
76
Originally posted by: aka1nas
Originally posted by: nyker96
I think with quad core or above, the CPU can be tasked to do tons of GPU or physics jobs in a game.

Not with general purpose cores, hence why Intel and AMD are exploring integrating GPU functionality.

right, you can't just expect a cpu core to fill the role of a gpu.... a cpu core's graphics abilities would be (for now) limited to physics calculations and things of that nature... GPUs are designed the way they are for a reason.
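A minimal sketch of that distinction, in CUDA C++ (the toy particle update, the function names, and the launch sizes are illustrative assumptions, not anything from the thread): the same per-element update is written once as an ordinary loop for a general-purpose core and once as a GPU kernel that runs one lightweight thread per element across the shader array.

// Sketch only: identical per-element work, CPU loop vs GPU kernel.
#include <cuda_runtime.h>

// CPU version: one general-purpose core walks the array serially.
void advance_cpu(float* pos, const float* vel, int n, float dt) {
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// GPU version: thousands of lightweight threads, one element each.
__global__ void advance_gpu(float* pos, const float* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pos[i] += vel[i] * dt;
}

void advance_on_gpu(float* host_pos, const float* host_vel, int n, float dt) {
    float *d_pos, *d_vel;
    cudaMalloc((void**)&d_pos, n * sizeof(float));
    cudaMalloc((void**)&d_vel, n * sizeof(float));
    cudaMemcpy(d_pos, host_pos, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_vel, host_vel, n * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks  = (n + threads - 1) / threads;   // one thread per element
    advance_gpu<<<blocks, threads>>>(d_pos, d_vel, n, dt);

    cudaMemcpy(host_pos, d_pos, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(d_pos);
    cudaFree(d_vel);
}

The kernel itself is trivial; the point is that this style of work has no branching and no cross-element dependencies, which is exactly what a wide shader array is built for and what a handful of general-purpose cores are not.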
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Yoxxy
NV chipsets rival Intel? There is a reason why the 590 was never released. I had a Lan Party 590 ES that did 300 FSB max. Bit ridiculous. Look at the 975 (450 is easy) and the 965 chipset (500+ is easy). nForce is a good AMD chipset, but we will see what happens when the 680i comes to retail in 2 weeks. Until then NV has a lot to prove when it comes to chipsets.

You are basing your view on an overclocker's perspective only. Not everyone is an overclocker. However, NVIDIA is smart enough to realize that to sell an SLI chipset in any quantity it must also be a good overclocker, since the SLI buyer tends to be a performance buyer as well. At stock, nForce4 SLI IE is an excellent chipset, as it is feature-rich and just as stable as any Intel chipset I've ever used.

As for Nvidia being good at everything they do: look at mobile video accelerators, where ATI holds 85% of a business that NV has desperately tried to win over, but they never seem to be able to compete in something that is outside their realm.

Intel has a much larger share of the desktop GPU market than NVIDIA as well... Do you mean to tell me that market share is indicative of who makes the better product, and that you'd rather play HL2 on Intel integrated graphics than on a GeForce 7900GTX? I honestly can't compare NV and ATI mobile products because I just don't have access to them.

My point was not that NV makes the best of everything they produce, but that they generally have a competitive product. I don't think that their motive is to get bought by Intel.
 

The-Noid

Diamond Member
Nov 16, 2005
3,117
0
76
GPUs are much more powerful than CPUs for what they do. Look at what Folding@home has done with the X19xx series. Quad core is not going to be the end-all that some people think. We will see if games are even multithreaded enough to support quad cores by next year; my bet is we will have a handful of games ready for quad core and DX10. A single unit that does CPU/GPU calculations is a great idea, but it will never be a high-end part. Look at today: a dedicated solution will always beat out something designed to do multiple things, and if this ever comes to a point where it is better, it will be at a high cost premium.

The only way I see this being a feasible solution is if it has a really low TDP. This could be a great part for a mobile product or HTPC, but it will never be an end-all product for desktops; there is simply not enough room for that big a die.

As for integrated parts and mobile: NV parts are not competitive with ATI on price or functionality, which is shown by market share. Comparing ATI or NV solutions to Intel integrated is comparing apples to oranges; the products in mobile devices are the same kind of thing, while integrated graphics and dedicated solutions are not.

This will not be the golden $500 CPU/GPU that beats out dedicated parts, although I have a feeling the fanbois will all want it to be.

The end point is that Nvidia products are slower than their Intel counterparts in motherboards at the same speeds. They have a slow memory controller and are based on AMD technology. An asymmetric memory controller has not yet been shown to provide good performance on Intel chipsets. Look at memory benchmarks of the C19 and RD600 compared to the 975X. Although I hope this changes in the future, because on AM2 my sticks will do 1225 MHz @ 4-4-4-4, which is always nice to see.
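On the quad-core point above, a rough sketch (plain C++ threads; the Particle struct, the gravity-only update, and the fixed core count are illustrative assumptions): splitting one frame's physics across four cores buys at best roughly a 4x speedup, whereas an X1900-class shader array has dozens of ALUs applying the same kind of per-element operation at once, which is the gap the Folding@home GPU client exploits.

// Sketch only: one frame's particle update split across four CPU threads.
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Update a contiguous slice of the particle array.
static void update_range(std::vector<Particle>& p, std::size_t begin,
                         std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;   // gravity only, to keep the sketch short
        p[i].x  += p[i].vx * dt;
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}

// "Quad or above": hand each core an equal slice, then wait for the frame.
void update_particles_quadcore(std::vector<Particle>& p, float dt) {
    const unsigned cores = 4;
    const std::size_t chunk = p.size() / cores;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end   = (t == cores - 1) ? p.size() : begin + chunk;
        workers.emplace_back(update_range, std::ref(p), begin, end, dt);
    }
    for (auto& w : workers) w.join();
}

Real game physics is harder to split than this, since collisions create cross-thread dependencies, which is one reason multithreaded game engines lagged behind multi-core hardware.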
 

mamisano

Platinum Member
Mar 12, 2000
2,045
0
76
So, does this mean a CPU for PCs or for other applications? Who knows, maybe they are planning to compete on the console side with a CPU and GPU or some other application.

What would it take to use a GPU-based processor in a similar way to the Transmeta "Code Morphing" VLIW design? It would certainly be powerful enough.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Yoxxy
The end point is that Nvidia products are slower than their Intel counterparts in motherboards at the same speeds. They have a slow memory controller and are based on AMD technology. An asymmetric memory controller has not yet been shown to provide good performance on Intel chipsets. Look at memory benchmarks of the C19 and RD600 compared to the 975X. Although I hope this changes in the future, because on AM2 my sticks will do 1225 MHz @ 4-4-4-4, which is always nice to see.

The RD600 hasn't been released yet (not to mention the many rumours claiming that these boards are going to be rare). Most nForce 590 Intel Edition boards haven't been released either, because they are getting a face lift.

The upcoming nForce 680i (the i stands for Intel edition) is going to surprise some people. nVIDIA already promised us an updated C19 chip, hence the C55 chip. From AT themselves, this is going to be launched simultaneously with G80. The nForce 680i is going to be a hell of an OCer (vr-zone claims it reaches ~510MHz FSB), feature-packed where Intel can't match them, not to mention SLI and support for up to a 1333 FSB (for the upcoming Core 2 Duo refreshes).

 

The-Noid

Diamond Member
Nov 16, 2005
3,117
0
76
Where are these rumors coming from? RD600 will be released in full, and not limited, methinks. Read dfi-street or XS, where benchmarks are up. The C55 may be a good chip; we will see.
 

RichUK

Lifer
Feb 14, 2005
10,320
672
126
Wasn't there a rumour about Intel's proposed acquisition of nVidia? I'm pretty sure I read it over here.
 

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
There was a rumor about Intel buying Nvidia, but nothing has come of it thus far.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RichUK
Wasn't there a rumour about Intel's proposed acquisition of nVidia? I'm pretty sure I read it over here.

it was elsewhere

actually theInq broke the ATi acquisition news first . . . they DO get some things right.

ANYway, i feel nvidia would be foolish not to explore cpu-gpu hybrids . . . they are a very smart company . . . and they can read the future pretty well.

and . . . i know it's not video . . . but .. . .

. . . since someone mentioned it . . .

here's a r600 tidbit:

The New Graphics - A Tale of Direct X 10
From what we hear, the new [nvidia] card might not adopt the Microsoft graphics standard of a true Direct 3D 10 engine with a unified shader architecture. . . . Nvidia engineers [are expected] to keep a fixed pipeline structure. Why not? They have kept the traditional pattern for all of their cards. It was ATI that deviated and fractured the "pipeline" view of rendering; the advent of the Radeon X1000 introduced the threaded view of instructions and higher concentrations of highly programmable pixel shaders, to accomplish tasks beyond the "traditional" approach to image rendering.

One thing is for sure; ATI is keeping the concept of the fragmented pipeline and should have unified and highly programmable shaders. We have heard about large cards - like ones 12" long that will require new system chassis designs to hold them - and massive power requirements to make them run.

Why shouldn't the new cards with ATI's R600 require 200-250 watts a piece and the equivalent of a 500 W power supply to run in CrossFire or G80 in SLI? We are talking about adding more hardware to handle even greater tasks and functions, like geometry shaders that can take existing data and reuse it for the subsequent frames. More instructions and functions means more demand for dedicated silicon. We should assume that there will need to be a whole set of linkages and caches designed to hold the data from previous frames, as well as other memory interfaces. Why wouldn't we assume that this would require more silicon?

Although we are not in favour of pushing more power to our graphics processors and the massive amounts of memory on the card, we are excited to think about the prospects of more in our games. Currently one can perform Folding at Home on your graphics processor, and soon we will be able to do effects physics on them too.

see . . . it was even related

someday there will be an AT CPU/Video forum :p
:Q

:D

sorry
 

atom

Diamond Member
Oct 18, 1999
4,722
0
0
I guess it's nice to see Nvidia taking an aggressive role but it's still going to be a low end part (same as Intel and AMD with their integrated designs). It'll be uphill for nvidia because AMD and Intel already have healthy partnerships with the big manufacturers. I wonder if it's even worthwhile for a company like Dell to support 3 platforms.
 

Nocturnal

Lifer
Jan 8, 2002
18,927
0
76
They started with graphics, succeeded, went into chipsets, succeeded, and now the most logical thing for them to do is step into the processor market. It makes a lot of sense.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Well, AMD has their HyperTransport bus and socket positioned as "open standards"; it's possible nvidia could even make drop-in replacements for Athlons.
 

Furen

Golden Member
Oct 21, 2004
1,567
0
0
Originally posted by: atom
I guess it's nice to see Nvidia taking an aggressive role but it's still going to be a low end part (same as Intel and AMD with their integrated designs). It'll be uphill for nvidia because AMD and Intel already have healthy partnerships with the big manufacturers. I wonder if it's even worthwhile for a company like Dell to support 3 platforms.

I wonder just how low-end these parts will be. Even a 4-pipeline part running at 6x the clock of current solutions should prove quite powerful (not extremely high-end, of course, but more than integrated-graphics level)... then again, even DDR3 will have a hard time feeding the CPU and a GPU. I really do hope AMD and Intel end up throwing programmable shader arrays into their CPUs, since even if they are kind of mediocre as graphics cards they could still be used for physics and the like. Better than Ageia's extremely overpriced lame-duck solution.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Fox5
Well, AMD has their Hypertransport bus and socket positioned as "open standards", it's possible nvidia could even make drop in replacements for Athlons.
2 years ago, both companies would have been open to the idea. Now that AMD has bought nVidia's rival, neither company would go for something like that.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,564
14,520
136
Originally posted by: nitromullet
AMD/ATI may take the initial lead on this, but they are behind NV and Intel with regards to chipsets.
I disagree with this! I have one AMD and one Intel motherboard, BOTH with ATI chipsets, and they do great in overclocking! I would not want to say who is better, but they at least have good chipsets now.