Attn: josh6079, I know you love it...

Sable

Golden Member
Jan 7, 2006
http://www.theinquirer.net/default.aspx?article=35195

WHEN WE GOT first sniff from the Nvidia editors' day, we figured we would delay things a bit.
Some are humourless enough not to see the fun in being fed info directly from things you are not invited to. That said, a few hours is long enough.

The upcoming GF8800 line will hit just under 12,000 in 3DMark06, with a new set of drivers due before the non-denominational year-end festive season, pushing it to just over 12K.

Not quite enough to beat R600 from what I hear though.

No mention of WHICH 8800 though, GTX or GTS.
 

apoppin

Lifer
Mar 9, 2000
I found this much more interesting:

Nvidia at work on combined CPU with graphic... On 65nm in 2008
Nvidia's . . . acquir[ed] some folk from Stexar, a company that was known for its X86 marchitectural expertise.

And now we hear that development is underway at Nvidia's just-announced Portland, Oregon, Design Center, where chip folk are beavering away on 45 nanometre designs.

The project should bear fruit sometime in 2008, as Nvidia prepares plans to compete with Intel and AMD on the blended graphic and CPU concept.

This is what OEMs want in 2008. Sixty-five or 45 nanometre processes make this possible, and AMD and Intel are going to do it, so Nvidia doesn't have much choice.
:Q
 

NoStateofMind

Diamond Member
Oct 14, 2005
Originally posted by: apoppin
I found this much more interesting:

Nvidia at work on combined CPU with graphic... On 65nm in 2008
Nvidia's . . . acquir[ed] some folk from Stexar, a company that was known for its X86 marchitectural expertise.

And now we hear that development is underway at Nvidia's just-announced Portland, Oregon, Design Center, where chip folk are beavering away on 45 nanometre designs.

The project should bear fruit sometime in 2008, as Nvidia prepares plans to compete with Intel and AMD on the blended graphic and CPU concept.

This is what OEMs want in 2008. Sixty-five or 45 nanometre processes make this possible, and AMD and Intel are going to do it, so Nvidia doesn't have much choice.
:Q


Oh hell yeah, that's more interesting. Sh!t, it should have its own thread in the CPU forum :)
 

Sable

Golden Member
Jan 7, 2006
Originally posted by: apoppin
I found this much more interesting:

Nvidia at work on combined CPU with graphic... On 65nm in 2008
Nvidia's . . . acquir[ed] some folk from Stexar, a company that was known for its X86 marchitectural expertise.

And now we hear that development is underway at Nvidia's just-announced Portland, Oregon, Design Center, where chip folk are beavering away on 45 nanometre designs.

The project should bear fruit sometime in 2008, as Nvidia prepares plans to compete with Intel and AMD on the blended graphic and CPU concept.

This is what OEMs want in 2008. Sixty-five or 45 nanometre processes make this possible, and AMD and Intel are going to do it, so Nvidia doesn't have much choice.
:Q

They've got a LONG way to go to catch Intel and AMD, I reckon. Definitely doable, though, and with CPU design as well they'd be the whole package, thanks to their excellent chipset business.


 

apoppin

Lifer
Mar 9, 2000
Originally posted by: PC Surgeon
Originally posted by: apoppin
I found this much more interesting:

Nvidia at work on combined CPU with graphic... On 65nm in 2008
Nvidia's . . . acquir[ed] some folk from Stexar, a company that was known for its X86 marchitectural expertise.

And now we hear that development is underway at Nvidia's just-announced Portland, Oregon, Design Center, where chip folk are beavering away on 45 nanometre designs.

The project should bear fruit sometime in 2008, as Nvidia prepares plans to compete with Intel and AMD on the blended graphic and CPU concept.

This is what OEMs want in 2008. Sixty-five or 45 nanometre processes make this possible, and AMD and Intel are going to do it, so Nvidia doesn't have much choice.
:Q


Oh hell yeah, that's more interesting. Sh!t, it should have its own thread in the CPU forum :)

Thank you!

and

Done!

Nvidia at work on combined CPU with graphic

this one is a 'bump' . . . and an acknowledgement
:)
 

tuteja1986

Diamond Member
Jun 1, 2005
I think if Nvidia starts to develop its own CPU, then you will have ATI, Intel, IBM, and AMD tag-teaming to make sure Nvidia doesn't live to see 2010.
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: tuteja1986
I think if Nvidia starts to develop its own CPU, then you will have ATI, Intel, IBM, and AMD tag-teaming to make sure Nvidia doesn't live to see 2010.

Uh no. Intel just bought a larger stake in NVIDIA. So when NVIDIA profits, so does Intel.
 

josh6079

Diamond Member
Mar 17, 2006
Topic Title: Attn: josh6079, I know you love it...
Topic Summary: ...Inq says 12K 3D Marks for GeForce 8800 ;-)
ROTFL!!!! Oh man, Sable, that one got me!!

3DMark, schmeedeeMark, I want to see some Oblivion frames with max settings and HDR+4xAA in there.
 

tuteja1986

Diamond Member
Jun 1, 2005
Originally posted by: Wreckage
Originally posted by: tuteja1986
I think if Nvidia starts to develop its own CPU, then you will have ATI, Intel, IBM, and AMD tag-teaming to make sure Nvidia doesn't live to see 2010.

Uh no. Intel just bought a larger stake in NVIDIA. So when NVIDIA profits, so does Intel.

They did? :( Woo... damn, I didn't read about that. Forget my comment if Intel bought Nvidia shares.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: apoppin
I found this much more interesting:

Nvidia at work on combined CPU with graphic... On 65nm in 2008
Nvidia's . . . acquir[ed] some folk from Stexar, a company that was known for its X86 marchitectural expertise.

And now we hear that development is underway at Nvidia's just-announced Portland, Oregon, Design Center, where chip folk are beavering away on 45 nanometre designs.

The project should bear fruit sometime in 2008, as Nvidia prepares plans to compete with Intel and AMD on the blended graphic and CPU concept.

This is what OEMs want in 2008. Sixty-five or 45 nanometre processes make this possible, and AMD and Intel are going to do it, so Nvidia doesn't have much choice.
:Q


I KNEW IT!!!!!!!!!!!!!!!!!!!! Did I not say this would happen a long time ago??? Muahahahahaha!!! Where's my props?? C'mon, hand 'em over!! :D :D

Seriously though, I just had a strong feeling Nvidia would be introducing their own CPUs eventually. Even if this turns out to be a combination chip (CPU/GPU), I think it still matters. This is pretty big news.

Oh, and those are not-too-shabby G80 scores in 3DMark either, OP!! Sounds great!
 

BassBomb

Diamond Member
Nov 25, 2005
12000? What kind of setup would get that right now (pre-8800 launch)?

Forget 12000, I get 1600!
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: BassBomb
12000? What kind of setup would get that right now (pre-8800 launch)?

Forget 12000, I get 1600!


CrossFired and o/c'd-to-the-ball$ X1900XTs get pretty close.
 

Elfear

Diamond Member
May 30, 2004
Originally posted by: BassBomb
12000? What kind of setup would get that right now (pre-8800 launch)?

Forget 12000, I get 1600!

OCed SLI 7900GTXs or CrossFired X1900XTs with a C2D clocked pretty well. If the 3DMark06 scores are representative of how much faster games will play, then I'm impressed.
 

Wreckage

Banned
Jul 1, 2005
Games are getting a serious kick in the ass next month. G80, DirectX 10, Wii, PS3, Xbox 360 1080p, Blu-ray, etc.

Keep it coming!
 

SpeedZealot369

Platinum Member
Feb 5, 2006
Originally posted by: Wreckage
Games are getting a serious kick in the ass next month. G80, DirectX 10, Wii, PS3, Xbox 360 1080p, Blu-ray, etc.

Keep it coming!

Not to mention Gears of War :thumbsup:
 

Pabster

Lifer
Apr 15, 2001
Originally posted by: keysplayr2003
CrossFired and o/c'd-to-the-ball$ X1900XTs get pretty close.

Make that X1950s, and yes, I can't even hit 12000. So if a single 8800 does even that, I'm sold.
 

Keysplayr

Elite Member
Jan 16, 2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: keysplayr2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.
Ah ha . . . well, let ME update you on what I knew months ago: :p

The New Graphics - A Tale of DirectX 10 [THG]

From what we hear, the new [g80] card might not adopt the Microsoft graphics standard of a true Direct 3D 10 engine with a unified shader architecture. Unified shader units can be changed via a command change from a vertex to geometry or pixel shader as the need arises. This allows the graphics processor to put more horsepower where it needs it. We would not put it past Nvidia engineers to keep a fixed pipeline structure. Why not? They have kept the traditional pattern for all of their cards. It was ATI that deviated and fractured the "pipeline" view of rendering; the advent of the Radeon X1000 introduced the threaded view of instructions and higher concentrations of highly programmable pixel shaders, to accomplish tasks beyond the "traditional" approach to image rendering.

One thing is for sure; ATI is keeping the concept of the fragmented pipeline and should have unified and highly programmable shaders. We have heard about large cards - like ones 12" long that will require new system chassis designs to hold them - and massive power requirements to make them run.

Why shouldn't the new cards with ATI's R600 require 200-250 watts a piece and the equivalent of a 500 W power supply to run in CrossFire or G80 in SLI? We are talking about adding more hardware to handle even greater tasks and functions, like geometry shaders that can take existing data and reuse it for the subsequent frames. More instructions and functions means more demand for dedicated silicon. We should assume that there will need to be a whole set of linkages and caches designed to hold the data from previous frames, as well as other memory interfaces. Why wouldn't we assume that this would require more silicon?

Although we are not in favour of pushing more power to our graphics processors and the massive amounts of memory on the card, we are excited to think about the prospects of more in our games. Currently one can perform Folding at Home on your graphics processor, and soon we will be able to do effects physics on them too.
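
For anyone puzzling over the "unified shader" talk in that THG quote, here's a toy sketch of the scheduling idea (purely illustrative; this is not ATI's or Nvidia's actual scheduler, and the 64-unit pool and the per-stage loads are made-up numbers). A unified design has one pool of shader units it can re-split every frame, while a fixed pipeline leaves vertex units idle during pixel-heavy frames and vice versa:

# Toy sketch only -- NOT ATI's or Nvidia's real hardware scheduler.
# A unified-shader GPU owns one pool of units and retasks them per
# frame; a fixed pipeline hardwires units to a single stage.

UNIFIED_POOL = 64  # hypothetical number of shader units

def allocate_unified(vertex, geometry, pixel):
    """Split the shared pool in proportion to each stage's workload."""
    total = vertex + geometry + pixel
    return {
        "vertex": round(UNIFIED_POOL * vertex / total),
        "geometry": round(UNIFIED_POOL * geometry / total),
        "pixel": round(UNIFIED_POOL * pixel / total),
    }

# A pixel-heavy frame pushes most units toward pixel shading...
print(allocate_unified(vertex=10, geometry=5, pixel=85))
# -> {'vertex': 6, 'geometry': 3, 'pixel': 54}

# ...while a geometry-heavy frame rebalances on the fly, which a
# fixed vertex/pixel split can't do at all.
print(allocate_unified(vertex=30, geometry=50, pixel=20))
# -> {'vertex': 19, 'geometry': 32, 'pixel': 13}

And the power math in the quote holds up on its own terms: two 250 W R600s in CrossFire would already account for the full 500 W figure before you even count the CPU and the rest of the box.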
 

Nightmare225

Golden Member
May 20, 2006
Originally posted by: keysplayr2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.

I'll be laughing myself silly if that "prediction" doesn't turn out to be true. :laugh:
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: Nightmare225
Originally posted by: keysplayr2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.

I'll be laughing myself silly if that "prediction" doesn't turn out to be true. :laugh:

They have a 50-50 chance of being right.

Maybe R600 will be the X1800 all over again. :p
 

Schadenfroh

Elite Member
Mar 8, 2003
Originally posted by: Wreckage
Originally posted by: tuteja1986
I think if Nvidia starts to develop its own CPU, then you will have ATI, Intel, IBM, and AMD tag-teaming to make sure Nvidia doesn't live to see 2010.

Uh no. Intel just bought a larger stake in NVIDIA. So when NVIDIA profits, so does Intel.

link?