Geforce 9800 GTX: Picture and specs

Page 4

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Originally posted by: Sylvanas
Originally posted by: JAG87
Here is another gem for you guys: you wanna know how the 9800 GTX is going to perform? Look at 9600GT SLI results. 64 shaders and 32 TMUs x 2 = 128 shaders and 64 TMUs.

Keep in mind that SLI results are usually 1.8x the performance of one card, while the GTX will probably be more than 2x the performance of a 9600GT. Consider that all the clock speeds will be a tiny bit higher, and that there are no SLI scaling issues.

Here is a review: keep an eye on the FEAR results, as FEAR is a game that has always scaled nearly perfectly with SLI. Feel free to check out other games as well, but focus on FEAR, because it's a good indication of 9800GTX performance.

http://www.firingsquad.com/har..._performance/page3.asp

After having read that, picture that the 9800GX2 is a card with TWO of these new G92 chips on it. It is clocked slightly lower than the GTX, but I would say we can expect 1.7 - 1.8x the performance of a 9800GTX (without SLI/driver issues on the GX2)

I have to be sincere, the GX2 is looking like a damn good card, especially for those with Intel chipsets. And if nvidia got Quad SLI to work this time, all I have to say is ZOMG.

That's a really selective bench; as we all know, FEAR scales VERY well with multi-GPU setups. Ultimately the performance of a card cannot be predicted by a few games, as each engine is vastly different. Have a look at Crysis (which I believe is the benchmark for the 'next generation') from that same review - the results are pathetic there.

I'm sick of waiting for a 'REAL' next generation... Nvidia/ATI should be looking at games like Crysis (Fallout 3 and other titles this year also look very demanding, so it's not just 'one game').


Agree 200%. Just trying to shed light for those people that think the 9800GTX will be a slightly overclocked 8800GTS, but I completely agree with you. The only reason I point out FEAR is because 9600GT SLI results are all we have to predict 9800GTX performance.
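JAG87's back-of-envelope logic above (take 9600GT SLI numbers, back out the SLI scaling overhead, double the units, bump the clocks) can be sketched in a few lines. The FEAR frame rate and both clock figures below are placeholder assumptions for illustration, not measured numbers:

```python
# Rough 9800 GTX estimate from 9600 GT SLI results, per the reasoning above.
# All inputs are placeholder assumptions, not benchmark data.
def estimate_9800gtx(fps_9600gt_sli, sli_scaling=1.8, clock_ratio=675 / 650):
    one_card = fps_9600gt_sli / sli_scaling  # back out a single 9600 GT
    # Doubled shaders/TMUs with no SLI overhead ~= 2x one card, times clocks.
    return one_card * 2 * clock_ratio

print(round(estimate_9800gtx(90.0), 1))  # a hypothetical 90 fps SLI result -> ~103.8
```

By this arithmetic a single GTX would land a little above the SLI pair it is being predicted from, which is exactly the point being argued.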
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: jaredpace
Originally posted by: n7


And I have no interest in Crossfire "Combo Cards" or SLI "Sandwiches".

Agree with you on that, for sure!

If only crossfire & sli had a 100% success rate

I don't understand this reasoning; it appears some people just close their ears when it comes to Crossfire/SLI. I have a range of games that WORK and play great on my Crossfire setup (I'm sure apoppin will agree with his experience). I think this harks back to the Vista syndrome: when something works you don't get a flood of people onto forums saying 'everything is fine', as that's expected... so you end up with a bunch of posts saying "x doesn't work... I think it's SLI/Crossfire" when in most cases the problem can be solved with accurate troubleshooting. Sure, you could say 'well, you're just saying if it's good for you it's good for everyone'; I understand that, but I have helped build about 3 multi-GPU machines and all have worked fine, so :confused:
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: JAG87
Originally posted by: munky
Originally posted by: JAG87
you guys are retarded if you think this card is going to perform like an 8800GT or GTS. the 9600GT has only 64 shaders / 32 TMUs, while the GT has 112 shaders / 56 TMUs, and the GTS has 128 shaders / 64 TMUs

yet the 9600GT is practically on par with the 8800GT and not too far behind the GTS. imagine a card with 128 of the same shaders that the 9600GT has and 64 TMUs. I think we are looking at almost double the performance of an 8800GT. This is not the same core that's in the 8800GT and GTS.

and if this revised G92 core is going into the GX2 as well, boy that's going to be one heck of a card: up to 4 times faster than a stock 8800GT. My only gripe is that quad SLI will probably suck donkey balls again, and that 2 GTXs in SLI will end up being the fastest graphics setup again.

anyone recall the 7900 series, sound familiar?


PS. I am still shaking my head at the 256 bit bus and 512 MB memory. all this graphics horsepower will probably get destroyed at high resolutions.

The 8800gt is already bottlenecked by the 256-bit bus, that's why the 9600gt is so close, IMO. The 8800gt is about 27% faster on average without AA, and the lead shrinks to only 15% with AA enabled. Which makes these rumored specs even more ridiculous, seeing how it's just an OC'd 8800gts with an extra SLI connector, if these specs are correct.


I am going to have to disagree with that as well; the 8800GT is practically neck and neck with the 8800GTX, and the 8800GTX has a 384-bit bus. The 8800GTX only shows its muscle at 1920x1200 with 8xAA or 2560x1600 with 4xAA; otherwise they are totally even. How is it that the 8800GT is bottlenecked again? At super duper high res, yes, I agree with you, but that's not what most people are concerned about. I am, but not the average Joe.

Reading what Sylvanas pointed out, maybe nvidia found a way to make a 256-bit bus suffice: all they did was take G92 and optimize its architecture to make it as efficient as G94. I stand by my "twice as fast as 8800GT/GTS" prediction. I just hope that this compression thingamajig combined with a piss-poor 256-bit bus doesn't crap out at 2560x1600, because that would be a real shame for a high-end card.

On the good side, the card should be cheaper, because a 256-bit bus is much cheaper to produce than 384-bit. Considering the GX2 is rumored to cost $599 US, the GTX should end up around $499-549 US, which is below the 8800GTX launch price of $599 US.

Again, G92 does 8 textures per clock while G80 does 4. The 8800GT gets close when memory bandwidth isn't bottlenecking - at lower resolutions without AA, where it can flex its texturing muscles. If it had more memory bandwidth it would easily take on the GTX.
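AzN's texturing point can be made concrete with peak bilinear texel rates. The TMU counts and core clocks below are the commonly quoted figures for these cards, so treat this as a sketch rather than gospel:

```python
# Peak texel rate (Gtexels/s) = TMUs x core clock, per the argument that
# G92 parts lean on texturing whenever bandwidth isn't the bottleneck.
# Unit counts/clocks are the commonly quoted figures, not verified here.
def texel_rate_gt(tmus, core_mhz):
    return tmus * core_mhz / 1000.0

print(texel_rate_gt(56, 600))  # 8800 GT (G92): 33.6
print(texel_rate_gt(32, 575))  # 8800 GTX (G80 texture address units): 18.4
```

On those figures the cheaper G92 card has nearly double the texture addressing throughput of the G80 flagship, which is why it only pulls ahead where bandwidth, not texturing, decides the result.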
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Originally posted by: Sylvanas
Originally posted by: jaredpace
Originally posted by: n7


And I have no interest in Crossfire "Combo Cards" or SLI "Sandwiches".

Agree with you on that, for sure!

If only crossfire & sli had a 100% success rate

I don't understand this reasoning; it appears some people just close their ears when it comes to Crossfire/SLI. I have a range of games that WORK and play great on my Crossfire setup (I'm sure apoppin will agree with his experience). I think this harks back to the Vista syndrome: when something works you don't get a flood of people onto forums saying 'everything is fine', as that's expected... so you end up with a bunch of posts saying "x doesn't work... I think it's SLI/Crossfire" when in most cases the problem can be solved with accurate troubleshooting. Sure, you could say 'well, you're just saying if it's good for you it's good for everyone'; I understand that, but I have helped build about 3 multi-GPU machines and all have worked fine, so :confused:



The reason multi-GPU-on-a-stick is bashed so much is that it usually takes 2-3 months from the date of purchase to actually unleash the full potential of the card. If the 7950GX2 and its drivers are anything to go by, the first few months of the 9800GX2's existence will see it underperform the 9800GTX. But I can't predict the future; nvidia might get it right this time.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: JAG87
Originally posted by: Sylvanas
Originally posted by: jaredpace
Originally posted by: n7


And I have no interest in Crossfire "Combo Cards" or SLI "Sandwiches".

Agree with you on that, for sure!

If only crossfire & sli had a 100% success rate

I don't understand this reasoning; it appears some people just close their ears when it comes to Crossfire/SLI. I have a range of games that WORK and play great on my Crossfire setup (I'm sure apoppin will agree with his experience). I think this harks back to the Vista syndrome: when something works you don't get a flood of people onto forums saying 'everything is fine', as that's expected... so you end up with a bunch of posts saying "x doesn't work... I think it's SLI/Crossfire" when in most cases the problem can be solved with accurate troubleshooting. Sure, you could say 'well, you're just saying if it's good for you it's good for everyone'; I understand that, but I have helped build about 3 multi-GPU machines and all have worked fine, so :confused:



The reason multi-GPU-on-a-stick is bashed so much is that it usually takes 2-3 months from the date of purchase to actually unleash the full potential of the card. If the 7950GX2 and its drivers are anything to go by, the first few months of the 9800GX2's existence will see it underperform the 9800GTX. But I can't predict the future; nvidia might get it right this time.

Indeed, and I think by the looks of things it would be in their best interests to do so. ATI now has mixed Crossfire and a very good dual-GPU card plus drivers. Nvidia should definitely be focusing on building a better dual-GPU driver to avoid another GX2. Needless to say, I expect quite a bit from them this time round.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Sylvanas
Originally posted by: jaredpace
Originally posted by: n7


And I have no interest in Crossfire "Combo Cards" or SLI "Sandwiches".

Agree with you on that, for sure!

If only crossfire & sli had a 100% success rate

I don't understand this reasoning; it appears some people just close their ears when it comes to Crossfire/SLI. I have a range of games that WORK and play great on my Crossfire setup (I'm sure apoppin will agree with his experience). I think this harks back to the Vista syndrome: when something works you don't get a flood of people onto forums saying 'everything is fine', as that's expected... so you end up with a bunch of posts saying "x doesn't work... I think it's SLI/Crossfire" when in most cases the problem can be solved with accurate troubleshooting. Sure, you could say 'well, you're just saying if it's good for you it's good for everyone'; I understand that, but I have helped build about 3 multi-GPU machines and all have worked fine, so :confused:

absolutely

and QUIT bringing Crysis up as an example .. it is NOT optimized {PERIOD}

Do not expect Crysis to be a reliable benchmark for another year - at least with Vista and multi-GPU rigs... look how BS FarCry was for its first year - it is a buggy game... or try to play the last chapter again and imagine it runs as well as the first half.

Until the CryTek devs *fix* it and optimize it, nvidia and AMD will be accused of 'cheating' as they do random hotfixes and patches as reported and as necessary.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: apoppin
Originally posted by: Sylvanas
Originally posted by: jaredpace
Originally posted by: n7


And I have no interest in Crossfire "Combo Cards" or SLI "Sandwiches".

Agree with you on that, for sure!

If only crossfire & sli had a 100% success rate

I don't understand this reasoning; it appears some people just close their ears when it comes to Crossfire/SLI. I have a range of games that WORK and play great on my Crossfire setup (I'm sure apoppin will agree with his experience). I think this harks back to the Vista syndrome: when something works you don't get a flood of people onto forums saying 'everything is fine', as that's expected... so you end up with a bunch of posts saying "x doesn't work... I think it's SLI/Crossfire" when in most cases the problem can be solved with accurate troubleshooting. Sure, you could say 'well, you're just saying if it's good for you it's good for everyone'; I understand that, but I have helped build about 3 multi-GPU machines and all have worked fine, so :confused:

absolutely

and QUIT bringing Crysis up as an example .. it is NOT optimized {PERIOD}

Do not expect Crysis to be a reliable benchmark for another year - at least with Vista and multi-GPU rigs... look how BS FarCry was for its first year - it is a buggy game... or try to play the last chapter again and imagine it runs as well as the first half.

Until the CryTek devs *fix* it and optimize it, nvidia and AMD will be accused of 'cheating' as they do random hotfixes and patches as reported and as necessary.

There are plenty of examples of upcoming games which are DX10 and should bring any rig to its knees - which is what these 'next gen' cards will be judged upon. I was merely highlighting one that's out NOW. As I said earlier, Fallout 3 - then we have '08 games like Clear Sky, Age of Conan, Alan Wake, Rainbow Six Vegas 2, etc.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: JAG87
Here are some more tidbits to ponder on:

http://www.anandtech.com/video/showdoc.aspx?i=3234

9600GT has 505M transistors, while the GTS 512 has 754M. If the architecture of G94 were just a cut-down G92, shouldn't the number be a lot lower? Moreover, take G94 and double it: voilà, there's your 1 BILLION transistor card.

And finally, a juicy screenshot confirming that the information is correct: http://en.expreview.com/?p=277

It doesn't work like that. Just because a GPU has 1/2 the number of execution units doesn't mean it has half the number of transistors; SPs are only one part of the GPU.

An example of this would be HD 2900 vs HD 2600; 2900 has 320SP vs 120SP for the 2600, less than half... yet R600 is 700M transistors while RV630 is 390M, more than half.
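Extelleron's ratios check out with the figures quoted in the post; a quick sketch:

```python
# Execution units don't scale 1:1 with transistor count, since SPs are
# only part of the die. Figures below are as quoted in the post.
sp = {"HD 2900 (R600)": 320, "HD 2600 (RV630)": 120}
transistors_m = {"HD 2900 (R600)": 700, "HD 2600 (RV630)": 390}

sp_ratio = sp["HD 2600 (RV630)"] / sp["HD 2900 (R600)"]                      # 0.375
tr_ratio = transistors_m["HD 2600 (RV630)"] / transistors_m["HD 2900 (R600)"]  # ~0.557
print(round(sp_ratio, 3), round(tr_ratio, 3))
```

The smaller chip has 37.5% of the SPs but about 56% of the transistors, so halving the units clearly does not halve the die.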
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: JAG87
Here are some more tidbits to ponder on:

http://www.anandtech.com/video/showdoc.aspx?i=3234

9600GT has 505M transistors, while the GTS 512 has 754M. If the architecture of G94 were just a cut-down G92, shouldn't the number be a lot lower? Moreover, take G94 and double it: voilà, there's your 1 BILLION transistor card.

And finally, a juicy screenshot confirming that the information is correct: http://en.expreview.com/?p=277

nice find, Jag! What is the die size of an 8800GT/GTS? Is it 330mm2 like in this 9800GTX GPU-Z screenie?


Edit: check out the comment at expreview:
1. Anon Says:
February 26th, 2008 at 10:39 am

But in comparison to an 8800 GTX, it has fewer ROPs (16 vs 24), Still only DX 10.0 Support, smaller bus (256bit vs 384Bit), less RAM (512mb vs 768mb), less bandwidth (70.4GB/sec vs 86.4GB/sec).

But has higher clocks 675mhz vs 575mhz GPU, 1100mhz vs 900mhz Mem, 1688mhz vs 1350mhz Shader.

Really doesn't seem like a great jump to me, but specs aside I wonder how it runs.
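The bandwidth numbers in that comment are easy to sanity-check: bandwidth is the bus width in bytes times the effective transfer rate (GDDR3 is double data rate, so 1100 MHz means 2200 MT/s):

```python
# Memory bandwidth (GB/s) = bus width in bytes x effective transfer rate (MT/s).
def bandwidth_gbs(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s / 1000.0

print(bandwidth_gbs(256, 2200))  # 9800 GTX: 1100 MHz GDDR3 -> 70.4
print(bandwidth_gbs(384, 1800))  # 8800 GTX:  900 MHz GDDR3 -> 86.4
```

Both of the quoted figures (70.4 GB/s vs 86.4 GB/s) fall out exactly, confirming the new card really does have less raw bandwidth than the old flagship despite the faster memory.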


It's similar to what i posted here:
http://forums.anandtech.com/me...AR_FORUMVIEWTMP=Linear

whoa whoa whoa.... Just a shot in the dark here... BUT

Perhaps nvidia is taking the G92 architecture further. Why would they decrease the ROPs/TMUs/memory size/memory bandwidth/etc. and provide only a higher clock speed? I bet this G92-style die is capable of 32 or 48 ROPs, 1024 or 2048MB memory, a 384 or 512-bit memory bus, and many other "limited" options. Wonder if they plan to sit on this till the next process drop and unleash a fully capable G92 at 45nm or 32nm. Anybody?
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Originally posted by: Extelleron
Originally posted by: JAG87
Here are some more tidbits to ponder on:

http://www.anandtech.com/video/showdoc.aspx?i=3234

9600GT has 505M transistors, while the GTS 512 has 754M. If the architecture of G94 were just a cut-down G92, shouldn't the number be a lot lower? Moreover, take G94 and double it: voilà, there's your 1 BILLION transistor card.

And finally, a juicy screenshot confirming that the information is correct: http://en.expreview.com/?p=277

It doesn't work like that. Just because a GPU has 1/2 the number of execution units doesn't mean it has half the number of transistors; SPs are only one part of the GPU.

An example of this would be HD 2900 vs HD 2600; 2900 has 320SP vs 120SP for the 2600, less than half... yet R600 is 700M transistors while RV630 is 390M, more than half.



Everything in the 9600GT is cut down by half from the GTS 512; just look at the specs. I don't see the need for an extra 128M transistors if the architecture is the same.



Originally posted by: jaredpace
nice find, Jag! What is the die size of a 8800gt/gts? Is it 330mm2 like in this 9800gtx gpuz screenie?


The die size reads exactly the same, along with a bunch of other stuff. I think the author just made the few modifications necessary for the app to read the proper name and clocks; it even says in the post that W1zzard did a quick modification. Plus, that data is not read from the card itself; it is just pre-programmed into the app.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
So does this mean that the 8800GT/GTS 512 die and the 9800GTX die are different sizes/versions of G92? Or does no one know for sure yet?

Also interesting note about the 1-billion-transistor quote, as that would put it back on par with the "uber high" speculation from 10 months back. But that 1-billion-transistor spec sheet also said 512-bit memory interface, 1024MB RAM, and 24-48 ROPs. Wonder if all that is true as well?

I have one foot in Jag87's yard now :)
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Originally posted by: jaredpace
So does this mean that the 8800GT/GTS 512 die and the 9800GTX die are different sizes/versions of G92? Or does no one know for sure yet?

Also interesting note about the 1-billion-transistor quote, as that would put it back on par with the "uber high" speculation from 10 months back. But that 1-billion-transistor spec sheet also said 512-bit memory interface, 1024MB RAM, and 24-48 ROPs. Wonder if all that is true as well?

I have one foot in Jag87's yard now :)



Nobody knows what the die looks like, because nobody has taken the heatsink off of it yet. But expect those pics real soon, probably in the next few days. Here, look, they're starting to take it apart: http://en.expreview.com/?p=276

I personally do not think this card will break 1 billion transistors, but it will definitely be higher than 754M, which is the current G92 count. I am starting to get excited about the GX2; 4x the performance of a stock 8800GT sounds killer. God I hope quad SLI works, oh peeelllleeas god make it happen.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: Pocatello
ATi needs to release something good to get over this stagnation.

What's good? The 3870X2 is the best-performing single-card solution, and the 3870 and 3850 are competitively priced in their respective segments, with the 3870 on par with a 9600GT and the 3850 blowing away the 8600GTS...
 

sgrinavi

Diamond Member
Jul 31, 2007
4,537
0
76
Originally posted by: JAG87
Originally posted by: sgrinavi
Originally posted by: JAG87
you guys are retarded if you think this card is going to perform like an 8800GT or GTS.


Don't hold back, say what you really mean



lol? I'm not sure what you mean

Yes, LOL, your comment just struck me funny
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Sylvanas
Originally posted by: Pocatello
ATi needs to release something good to get over this stagnation.

What's good? The 3870X2 is the best-performing single-card solution, and the 3870 and 3850 are competitively priced in their respective segments, with the 3870 on par with a 9600GT and the 3850 blowing away the 8600GTS...

let me try it ... AMD should release a GPU that blows away nvidia's top card
:Q

like r300 dominated ... like g80

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Sylvanas
Originally posted by: Pocatello
ATi needs to release something good to get over this stagnation.

What's good? The 3870X2 is the best-performing single-card solution, and the 3870 and 3850 are competitively priced in their respective segments, with the 3870 on par with a 9600GT and the 3850 blowing away the 8600GTS...

Actually I wouldn't call the X2 the best-performing single-card solution; it's not consistent enough. I just read a review from HotHardware and you can see how competitive a GTX still is. A lot of other hardware sites back this up. No consistency.

Plus, in its price range there are a lot of competitors: 9600GT SLI for cheaper, maybe even 8800GT SLI. Now let's go down the market segments.

No competition to the 8800GTS 512MB.
No competition to the 8800GT.
A lot of pressure in the mid range with the 9600GT (~$179)/HD3870/8800GS. The HD3850 is the loser of the bunch because it's hovering around the $149-189 mark, with the 512MB versions priced in 9600GT/HD3870 territory.
Based off Newegg, the 8600GTS costs around $139-159 (with some outliers, of course).

I'd say there's A LOT of the market, especially toward the high/mid end, that's been lacking competition from ATI - something hardware enthusiasts DON'T like. ATi does have some wonderful low-end offerings, which even nVIDIA admits, but it's time they turned the tables.




 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Hmm.. If 9800 GTX core has more than 754M of transistors, I'd think NV would call it something other than G92-XXX.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: JAG87
you guys are retarded if you think this card is going to perform like an 8800GT or GTS. the 9600GT has only 64 shaders / 32 TMUs, while the GT has 112 shaders / 56 TMUs, and the GTS has 128 shaders / 64 TMUs

yet the 9600GT is practically on par with the 8800GT and not too far behind the GTS. imagine a card with 128 of the same shaders that the 9600GT has and 64 TMUs. I think we are looking at almost double the performance of an 8800GT. This is not the same core that's in the 8800GT and GTS.

and if this revised G92 core is going into the GX2 as well, boy that's going to be one heck of a card: up to 4 times faster than a stock 8800GT. My only gripe is that quad SLI will probably suck donkey balls again, and that 2 GTXs in SLI will end up being the fastest graphics setup again.

anyone recall the 7900 series, sound familiar?


PS. I am still shaking my head at the 256 bit bus and 512 MB memory. all this graphics horsepower will probably get destroyed at high resolutions.


Go read a few other threads before throwing out words like "you guys are retarded if..." YOU are retarded if you think that a G92 card with nearly identical specs to the 8800GTS 512 is going to magically destroy an 8800GTX. It will beat an 8800GTX at most resolutions in most situations, but at high resolution with AA/AF (the only reason to buy any of these over an 8800GT), the 8800GTX and Ultra will still kick its ass.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: n7
This is utter bullshit.

256 bit?
512 MB?

WTF kinda downgrading garbage is this?! :|

1.5 yrs. after the 8800 GTX we get a bandwidth crippled piece of crap?

I'm more than unhappy if this is really true.

Sure, for the average Joe with his 22" monitor (no offence to those with 22"ers :p), this is great, but some of us run large displays & like AA.

This is like the biggest joke ever. :frown:

Unbelievable.

well, at least your 8800gtx was a good investment...

 

CurseTheSky

Diamond Member
Oct 21, 2006
5,401
2
0
Heh, when I shelled out the $600 for "cutting-edge" performance, I was convinced I'd be regretting it in six months or less. That was November 2006.

It makes a pretty good heater, too...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: CurseTheSky
Heh, when I shelled out the $600 for "cutting-edge" performance, I was convinced I'd be regretting it in six months or less. That was November 2006.

It makes a pretty good heater, too...

My heater makes a decent mid-range GPU :p
... but I got mine in May for about half what yours cost
- I am also surprised that, at this rate, I will still probably have it this May as part of Xfire.

But ... weren't you *bitterly complaining* last November '06? ... .... something about the "drivers" ?
:Q

... or was that another one of a hundred other complaining g80 owners back then?


:D

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Yeah, I remember posting about how 8800GTX owners were lucky (this was back in November of last year). I couldn't believe that nvidia kept the crown for over a year - through the release of the 8800GTS (still the Ultra wins), then the 3870X2 (and still the Ultra can win), then the 9800GTX - and WTF comes March 11th?
I still say 8800GTX owners are lucky. How long did R300 rock the top spots? Was it over 1.5 years?
I still say 8800gtx owners are lucky. how long did r300 rock the top spots? was it over 1.5years?