What's next from nVidia? G90?


dreddfunk

Senior member
Jun 30, 2005
358
0
0
Originally posted by: Extelleron
I don't think nVidia is going to suddenly release the 8900 series. It's too late at this point for a mere refresh of the G80, especially since ATI is going to come out guns blazing (well, I hope) with better cards late this year or early next year. In only 4 months the 8800 series will be a year old, and judging by what we've traditionally seen in the graphics industry, a new card should be coming.

I'm just really hoping for some sort of 8900GS/2950GT/whatever part that doesn't require a dedicated nuclear power plant to run, two slots to cool, and more cash than Bill Gates has to own.
 

Yanagi

Golden Member
Jun 8, 2004
1,678
0
0
Originally posted by: JasonCoder
Originally posted by: Yanagi
Why would discrete graphics cards be obsolete just because we have a few extra general-purpose cores in our systems?

Several reasons actually

"Now, if you add in GPU functionality to the cores, not a GPU on the die, but integrated into the x86 pipeline, you have something that can, on a command, eat a GPU for lunch. A very smart game developer told me that with one quarter of the raw power, a CPU can do the same real work as a GPU due to a variety of effects, memory scatter-gather being near the top of that list. The take-home message is that a GPU is the king of graphics in today's world, but with the hard left turn Sun and Intel are taking, it will be the third nipple of the chip industry in no time.

"Basically, GPUs are a dead end, and Intel is going to ram that home very soon. AMD knows this, ATI knows this, and most likely Nvidia knows this. AMD has to compete; if it doesn't, Intel will leave it in the dust, and the company will die. AMD can develop the talent internally to make that GPU functionality, hunt down all the patents, licensing, and all the minutiae, and still start out a year behind Intel. That is if all goes perfectly, and the projects are started tomorrow."

This is essentially one of the motivations AMD had for purchasing ATI, but there's some good info and links on the future of discrete graphics solutions... or lack thereof.

That would be the Fusion project, not the quad general-purpose cores à la Barcelona or Kentsfield/Nehalem that I was referring to.
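
(To make the "memory scatter-gather" claim in the quoted article concrete, here is a minimal C sketch. It is an illustration of the general access pattern only, an assumption about what the author means, not code from the article or from any poster. A gather reads through data-dependent indices and a scatter writes through them; cached general-purpose CPUs handle this irregular pattern flexibly, while G80-era GPUs strongly preferred wide, contiguous memory accesses:

#include <stddef.h>

/* Gather: out[i] = src[idx[i]] -- read addresses depend on runtime data. */
void gather(float *out, const float *src, const size_t *idx, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = src[idx[i]];   /* irregular reads: idx[i] can point anywhere in src */
}

/* Scatter: dst[idx[i]] = src[i] -- write addresses depend on runtime data. */
void scatter(float *dst, const float *src, const size_t *idx, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[idx[i]] = src[i];   /* irregular writes landing at data-dependent locations */
}

On a CPU these loops go through the cache hierarchy and ordinary load/store units; on a 2007-era GPU the equivalent accesses could not be coalesced into wide memory transactions, which is one plausible reading of the article's "one quarter of the raw power" claim.)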
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
The Inq is BS'ing as usual. There's no way Nvidia is releasing a next-gen GPU in '07, unless by "next gen" they mean a rehashed G80. What needs to happen first is a die shrink of the G80, possibly in Fall '07, and then the G90 may be released in Spring '08.
 

rmed64

Senior member
Feb 4, 2005
237
0
0
Well, Nvidia supposedly has a new 9-series midrange coming in Spring '08, so yeah, the high end will probably release by Christmas.

The midrange always seems to be roughly 12 months apart from generation to generation.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Just think of it as NV40 to G70: a die shrink, lots of tweaks to the architecture, an increased number of shaders/TMUs, etc., which all resulted in the 7800GTX delivering 2x the performance of the 6800 Ultra.

G92 is already confirmed for release later this year, so we will be seeing a move from G80 (90nm) to G92 (65nm). It looks like nVIDIA has abandoned the 80nm process altogether.