What's next from nVidia? G90?

Boney

Member
Aug 10, 2005
102
0
0
I've been scanning the forums lately and all the talk is about the G80 and ATI's new X2900. Sure, the X2900 is new, but the G80 is a few months old now, so what exactly is expected from them next? I assume it will be monikered the G90, but other than that I really haven't heard or read too much about what is to come next.

Here is my little tidbit to add; it's all I could find... from Wikipedia, no less.

"The GeForce 9 series, or possible codenames G90 or G92, is a rumored future NVIDIA Graphics Processing Unit. The Inquirer reported that during an analyst webcast, Michael Hara, NVIDIA Vice President of Investor Relations, stated that the G92 will be capable of nearly 1 trillion floating point calculations per second, or 1 TeraFLOPS [1], and therefore be over two times faster than the current GeForce 8800 Ultra. According to the same Inquirer report, Mr. Hara also declared that the G92 is slated to launch during Q4 2007 according to NVIDIA's new product release strategy.[1]

G92 is likely to support DirectX 10.1 and OpenGL 3.0 and G92 will probably be the "second generation" Unified Shader architecture from Nvidia. G92 is likely to be made using the 65nm process technology at TSMC. "G92" product name is likely to be "GeForce 9800 GTX" going by past Nvidia product naming schemes."

2x the performance of the Ultra? We'll see about that.
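
For what it's worth, here's a rough back-of-envelope sketch of where an "over two times" figure could come from, assuming the usually quoted 8800 Ultra specs (128 stream processors at a 1.512 GHz shader clock); those numbers are my assumption, not anything Nvidia has confirmed. Whether 1 TFLOPS really works out to over 2x depends on whether you count only the MAD or the co-issued MUL as well:

/* Rough peak-FLOPS sanity check for the "over two times the Ultra" claim.
 * Assumed specs: 128 stream processors, 1.512 GHz shader clock.
 * A MAD counts as 2 FLOPs per clock; the co-issued MUL adds a third.
 * These are theoretical peaks, not measured game performance. */
#include <stdio.h>

int main(void)
{
    const double shader_clock_ghz  = 1.512;  /* 8800 Ultra shader clock */
    const int    stream_processors = 128;

    double mad_only     = stream_processors * shader_clock_ghz * 2.0; /* GFLOPS */
    double mad_plus_mul = stream_processors * shader_clock_ghz * 3.0; /* GFLOPS */

    printf("MAD only:  %.0f GFLOPS -> 1 TFLOPS would be %.1fx\n",
           mad_only, 1000.0 / mad_only);
    printf("MAD + MUL: %.0f GFLOPS -> 1 TFLOPS would be %.1fx\n",
           mad_plus_mul, 1000.0 / mad_plus_mul);
    return 0;
}

Counting the MAD only, the Ultra sits around 387 GFLOPS, so 1 TFLOPS would be roughly 2.6x; counting the extra MUL puts it near 581 GFLOPS, which is only about 1.7x. So the "2x" claim hinges on which counting convention gets used.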
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I heard a rumour that they were pushing for at least two times Ultra performance in DX9 games, and three times Ultra performance in DX10 titles. Apparently the lower overhead and greater-than-linear clock speed boosts, thanks to specific Nvidia DX10 optimisations, will allow this to happen.
 

Lord Banshee

Golden Member
Sep 8, 2004
1,495
0
0
Nvidia has been quoted as saying the next GPU will support double precision as well. Not sure how this will help in games, maybe ray tracing, physics, shadows and other lighting. But it will make CUDA something real that R&D can actually look at as being useful.

http://forums.nvidia.com/index.php?showtopic=36286

Does CUDA support double precision arithmetic?

CUDA supports the C "double" data type. However on GeForce 8 series GPUs, these types will get demoted to 32-bit floats. NVIDIA GPUs supporting double precision in hardware will become available in late 2007.
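
To illustrate what that demotion means in practice, here's a minimal sketch of my own (not from the FAQ): a kernel written with double compiles and runs on the GeForce 8 series, but the values are silently dropped to 32-bit float, so the results are only single precision until the later hardware arrives.

// Minimal CUDA sketch (my own illustration): a kernel declared with
// "double". On GeForce 8 series parts (compute capability 1.x) the
// compiler demotes these to 32-bit float, so the math runs in single
// precision; true double precision needs later hardware.
#include <cstdio>

__global__ void axpy(double a, const double *x, double *y, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];   // executes as float math on G8x
}

int main()
{
    const int n = 1024;
    double *x, *y;
    cudaMalloc((void **)&x, n * sizeof(double));
    cudaMalloc((void **)&y, n * sizeof(double));
    cudaMemset(x, 0, n * sizeof(double));     // placeholder data
    cudaMemset(y, 0, n * sizeof(double));

    axpy<<<(n + 255) / 256, 256>>>(2.0, x, y, n);
    cudaDeviceSynchronize();
    printf("kernel status: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaFree(x);
    cudaFree(y);
    return 0;
}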
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I'd like to see them follow the route of the CPU.
Start putting multiple cores on the same die.
A quad 8800 GT die would be amazing.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Modelworks
I'd like to see them follow the route of the CPU.
Start putting multiple cores on the same die.
A quad 8800 GT die would be amazing.

Doh!

GPUs already sport multiple "cores" on the same die. These are known as "Quads".

This is why you can take a faulty 8800GTX core, disable the faulty quad and rebadge it as an 8800GTS.

 

QuantumPion

Diamond Member
Jun 27, 2005
6,010
1
76
I have a quote from a trusted inside industry source (that I acquired because I am so cool and hip and worthy of your advertising dollars) that the Inquirer doesn't know what they are talking about and pulls their stories out of thin air.
 

ForumMaster

Diamond Member
Feb 24, 2005
7,792
1
0
Well, frankly, I hope they get good performance but take a lesson from Intel and lower the power requirements. The G80 is amazing though. And NVIDIA recently started producing what they call Tesla. Amazing stuff.

Tesla Reviews
 

JasonCoder

Golden Member
Feb 23, 2005
1,893
1
81
I thought the whole concept of a discrete graphics board was going to be gone around the time quad cores were in wide circulation.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: JasonCoder
I thought the whole concept of a discrete graphics board was going to be gone around the time quad cores were in wide circulation.

You thought wrong.
 

Yanagi

Golden Member
Jun 8, 2004
1,678
0
0
Why would discrete graphics cards be obsolete just because we have a few extra general-purpose cores in our systems?
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Hopefully power consumption reduction. The 8800s aren't exactly watt sippers either. :p
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Bateluer
Hopefully power consumption reduction. The 8800s aren't exactly watt sippers either. :p

As long as power consumption stays the SAME as with G80, I'll be happy. The 8800's don't take up much power at all.
 

thilanliyan

Lifer
Jun 21, 2005
12,031
2,243
126
Originally posted by: Extelleron
Originally posted by: Bateluer
Hopefully power consumption reduction. The 8800s aren't exactly watt sippers either. :p

As long as power consumption stays the SAME as with G80, I'll be happy. The 8800's don't take up much power at all.

Compared to what?? :confused:
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: thilan29
<div class="FTQUOTE"><begin quote>Originally posted by: Extelleron
<div class="FTQUOTE"><begin quote>Originally posted by: Bateluer
Hopefully power consumption reduction. The 8800s aren't exactly watt sippers either. :p</end quote></div>

As long as power consumption stays the SAME as with G80, I'll be happy. The 8800's don't take up much power at all.</end quote></div>

Compared to what?? :confused:

Other graphics cards? The 8800 GTS draws less power than the X1900s/X1950s, and the 8800 GTX uses a few watts more than the X1950 XTX. Overall, you can run the GTS or GTX with no problem whatsoever on a decent 450-500W power supply. The HD 2900 XT is a bit excessive in its power usage, but the 8800s are fine.
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
I predict a new card that'll be faster than their previous card. :wtf: :gtfo:
 

JasonCoder

Golden Member
Feb 23, 2005
1,893
1
81
<div class="FTQUOTE"><begin quote>Originally posted by: Yanagi
Why would Discrete graphics cards be obsolete just because we have a few extra general purpouse cores in our systems?</end quote></div>

Several reasons actually

Now, if you add in GPU functionality to the cores, not a GPU on the die, but integrated into the x86 pipeline, you have something that can, on a command, eat a GPU for lunch. A very smart game developer told me that with one quarter of the raw power, a CPU can do the same real work as a GPU due to a variety of effects, memory scatter-gather being near the top of that list. The take home message is that a GPU is the king of graphics in today's world, but with the hard left turn Sun and Intel are taking, it will be the third nipple of the chip industry in no time.

Basically, GPUs are a dead end, and Intel is going to ram that home very soon. AMD knows this, ATI knows this, and most likely Nvidia knows this. AMD has to compete, if it doesn't, Intel will leave it in the dust, and the company will die. AMD can develop the talent internally to make that GPU functionality, hunt down all the patents, licensing, and all the minutia, and still start out a year behind Intel. That is if all goes perfectly, and the projects are started tomorrow.

This is essentially one of the motivations AMD had when purchasing ATI, but there's some good info and links on the future of discrete gfx solutions... or lack thereof.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: JasonCoder
<div class="FTQUOTE"><begin quote>Originally posted by: Yanagi
Why would Discrete graphics cards be obsolete just because we have a few extra general purpouse cores in our systems?</end quote></div>

Several reasons actually

<div class="FTQUOTE"><begin quote>Now, if you add in GPU functionality to the cores, not a GPU on the die, but integrated into the x86 pipeline, you have something that can, on a command, eat a GPU for lunch. A very smart game developer told me that with one quarter of the raw power, a CPU can do the same real work as a GPU due to a variety of effects, memory scatter-gather being near the top of that list. The take home message is that a GPU is the king of graphics in todays world, but with the hard left turn Sun and Intel are taking, it will be the third nipple of the chip industry in no time.

Basically, GPUs are a dead end, and Intel is going to ram that home very soon. AMD knows this, ATI knows this, and most likely Nvidia knows this. AMD has to compete, if it doesn't, Intel will leave it in the dust, and the company will die. AMD can develop the talent internally to make that GPU functionality, hunt down all the patents, licensing, and all the minutia, and still start out a year behind Intel. That is if all goes perfectly, and the projects are started tomorrow.</end quote></div>

This is essentially one of the motivations AMD had when purchasing ATI, but there's some good info and links on the future of discrete gfx solutions... or lack thereof.

Yes, in the year 2050.

I'll be too old to care, and I bet World War Three would've happened... or maybe even an alien invasion :D

Let me put it this way. Compare an electronic device dedicated to taking pictures to, let's say, a phone that does MP3s, takes pictures and other things. The dedicated camera will win almost all the performance tests. But the phone can do other things, and that's its advantage.

Now looking at CPU + GPU fusion, it's essentially the same concept. This product won't touch a discrete GPU's performance for a long while, and in fact I could probably get several people to back me up. However, in the LONG run, it will be the future.

But the main idea behind this is mainly about the feature/cost ratio. It means the motherboard does not require an IGP (i.e. there's no need for separate R&D on IGP solutions) or any sort of discrete GPU. You only need the CPU. The cost of building such systems will be quite low compared to a system housing a CPU and a discrete GPU, etc.

Don't get too excited about early products based on this concept. It will be a LONG while before we see such a product capable of besting discrete GPUs, if at all.

 

cm123

Senior member
Jul 3, 2003
489
2
76
Originally posted by: theprodigalrebel
It will definitely need more Jiggawatts!

Very right!

Maybe if by some fluke the 1GB 2900 gives the 8800 a run, we'll see the 8900 a bit sooner :)

I have to agree too... hate the long card thing, it sucks; I used to like giant cases, not anymore...

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: cm123
<div class="FTQUOTE"><begin quote>Originally posted by: theprodigalrebel
It will definitely need more Jiggawatts!</end quote></div>

very right!

maybe if by some fluke the 1gb 2900 give the 8800 run, we'll see the 8900 bit sooner :)

I have to agree too... hate the long card thing, sucks, used to like gaint cases, not anymore...

I don't think nVidia is going to suddenly release an 8900 series. It's too late at this point for a mere refresh of G80, especially since ATI is going to come out guns blazing (well, I hope) with better cards late this year or early next year. In only 4 months the 8800 series will be a year old, and judging by what we've seen traditionally in the graphics industry, a new card should be coming.