GeForce 9800 GTX: Picture and specs


Rusin

Senior member
Jun 25, 2007
Well, every source now seems to say that this card will have a 675MHz core... a whopping +25MHz over the 8800 GTS 512's 650MHz. =/
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Piuc2020
If it's multicore... what the hell is nvidia's motivation to release a 9800GX2 card... to be followed by an almost identical dual 9800 card in less than 2 weeks... come on guys, NV is smarter than this, and you should be too, instead of spreading FUD like it's peanut butter.

Unless, of course, there is actually a reason why nvidia would choose to release almost identically performing multi-GPU cards with different names less than 2 weeks apart...



Not trying to spread FUD, Piuc, just some observations and things to discuss. That's really all there is to it. I made it plain and clear that I do not know anything at all for a fact about the 9800GTX. It's just little clues that "could" be indicative of something a little different. And you might detect a smidge of hope in my typing. Well, who wouldn't? What a letdown the 9800GTX would be if it were just an overclocked 8800 GTS 512, don't you think?

And besides, as long as we know that nobody is DECLARING this a multicore card, and only discussing possibilities, it is harmless. That is part of what this forum is. Interesting discussions about technology.

You don't find anything odd about the 9800GTX? No questions? I mean, if you don't, that is perfectly fine. It's just that a few other people in here have noticed some peculiarities and commented as well.
 

Hauk

Platinum Member
Nov 22, 2001
What a disappointment if it comes with a 256-bit bus and 512MB of memory! I'm inclined not to believe these specs. But who the hell knows. Keys, Jag, you guys make some interesting points.

nV, can I place a special order? I'd like 1 billion+ transistors, a 512-bit bus, 1GB of GDDR4, and mad clocks that require Peltier/water cooling. Please provide quote and lead time...
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: JAG87
You guys are retarded if you think this card is going to perform like an 8800GT or GTS. The 9600GT has only 64 shaders / 32 TMUs, while the GT has 112 shaders / 56 TMUs, and the GTS has 128 shaders / 64 TMUs.

Yet the 9600GT is practically on par with the 8800GT and not too far behind the GTS. Imagine a card with 128 of the same shaders the 9600GT has, and 64 TMUs. I think we are looking at almost double the performance of an 8800GT. This is not the same core that's in the 8800GT and GTS.

And if this revised G92 core is going into the GX2 as well, boy, that's going to be one heck of a card, up to 4 times faster than a stock 8800GT. My only gripe is that quad SLI will probably suck donkey balls again, and that 2 GTXs in SLI will end up being the fastest graphics setup again.

Anyone recall the 7900 series? Sound familiar?


PS: I am still shaking my head at the 256-bit bus and 512 MB memory. All this graphics horsepower will probably get destroyed at high resolutions.

The 8800GT is already bottlenecked by the 256-bit bus; that's why the 9600GT is so close, IMO. The 8800GT is about 27% faster on average without AA, and the lead shrinks to only 15% with AA enabled. Which makes these rumored specs even more ridiculous, seeing how it's just an OC'd 8800GTS with an extra SLI connector, if these specs are correct.
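
For anyone who wants to sanity-check the bandwidth argument, here is a quick back-of-envelope comparison in Python; the stock memory clocks used below are assumptions quoted from memory, so treat them as ballpark inputs rather than verified specs:

# Back-of-envelope peak memory bandwidth for GDDR3 (double data rate).
# The 900 MHz stock clocks are assumptions -- double-check your card.
def peak_bandwidth_gbs(bus_bits: int, mem_mhz: int) -> float:
    bytes_per_transfer = bus_bits / 8
    transfers_per_sec = mem_mhz * 2 * 1e6   # DDR: two transfers per clock
    return bytes_per_transfer * transfers_per_sec / 1e9

for name, bus, clk in [("9600GT", 256, 900),
                       ("8800GT", 256, 900),
                       ("8800GTX", 384, 900)]:
    print(f"{name}: {peak_bandwidth_gbs(bus, clk):.1f} GB/s")
# -> 57.6, 57.6, 86.4 GB/s: identical bandwidth on the two 256-bit cards,
#    consistent with the gap closing once AA starts stressing bandwidth.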
 

AzN

Banned
Nov 26, 2001
Why doesn't Nvidia move to GDDR4 already?

That card is 50MHz faster in the shader and nothing else... I hope they have some uber tweaks or some magic drivers.
 

krnmastersgt

Platinum Member
Jan 10, 2008
I'm almost positive they did with the 9800 GTX, but in case they didn't, the GF9 is going to be an incredibly disappointing series.
 

Rusin

Senior member
Jun 25, 2007
Originally posted by: Azn
Why doesn't Nvidia move to GDDR4 already?

That card is 50MHz faster in the shader and nothing else... I hope they have some uber tweaks or some magic drivers.
Because they still have this plan: go from GDDR3 straight to GDDR5. Expect G100 high-end cards to be Nvidia's first GDDR5 products [late Q2 / early Q3].
 

taltamir

Lifer
Mar 21, 2004
About the 512MB being too small: there is no indication of that thus far. Furthermore, the GF9 has already shown it takes a lot less VRAM for AA and AF, which means less VRAM is needed. It would be nice if they had a 1GB version, though, for people who use them on massive displays or want a bit of future proofing (although I am normally against future proofing, the option should be there).
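
As a rough sanity check on the 512MB question, here is some ballpark render-target math; it is a sketch only, ignoring textures, compression, and driver overhead:

# Rough render-target footprint: multisampled color + depth/stencil.
# Ignores textures, compression, and driver overhead -- ballpark only.
def msaa_buffers_mb(width: int, height: int, samples: int) -> float:
    bytes_per_sample = 4 + 4   # 32-bit color + 32-bit depth/stencil
    return width * height * samples * bytes_per_sample / 2**20

for w, h in [(1680, 1050), (1920, 1200), (2560, 1600)]:
    sizes = ", ".join(f"{s}xAA: {msaa_buffers_mb(w, h, s):.0f} MB"
                      for s in (1, 4, 8))
    print(f"{w}x{h} -> {sizes}")
# 2560x1600 with 8xAA needs ~250 MB for the render targets alone,
# leaving well under half of a 512 MB card for textures.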

They are finally releasing a card OTHER than the 8800GTX that has enough connectors for tri-SLI... This should kill the price on those overpriced 8800GTX cards that are still being sold here and there, and finalize the 65nm move.

I expect Jag's prediction of increased performance to be true. There are cases where more shader/core clock = more performance, and cases where the opposite is true; it all depends on the implementation and architecture. They are constantly optimizing the architecture, and that can give great improvements to performance.
The revised 8800 series cards were basically incomplete GF9 cards with the unfinished features disabled and little to no optimization. They were stopgap releases.

That, or nvidia is really screwing with us and it's gonna bomb. We shall see.
 

Sylvanas

Diamond Member
Jan 20, 2004
Originally posted by: jaredpace
I hope all these damn specification rumors & speculation are false/fake. Otherwise the R700 had better kick serious ass, because otherwise there is no competition. This really sucks. The 9xxx series is looking like a launch on the opposite end of the spectrum compared to R300 & G80. It's more like a GeForce FX launch.

There is competition: if you want to spend top $$, then the 3870X2 is the best on the market... if you want to spend a little less, then it's the GT... a little less and it's the 3870/9600GT, a little less and it's the 3850... there's plenty to fill different market segments at the moment.

Secondly, as for the reason the 9600GT does so well, AT hinted at it here:

"Their compression technology has evolved to provide higher effective bandwidth between the GPU and framebuffer. We would love to provide more details on this and the other changes, but NVIDIA is still being a bit tight lipped." from here

Maybe the 9800GTX will perform similarly to two 9600GTs in SLI... maybe not; we don't know.
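
To put "higher effective bandwidth" in concrete terms, here is a tiny sketch; the compression factors are purely hypothetical, since NVIDIA has not published one:

# Effective bandwidth if framebuffer compression moves the same pixels
# with fewer bytes. The factors below are guesses, not NVIDIA numbers.
raw_gbs = 57.6   # 256-bit bus @ 1800 MT/s GDDR3
for factor in (1.0, 1.2, 1.5):
    print(f"compression x{factor}: ~{raw_gbs * factor:.1f} GB/s effective")
# Even a hypothetical 1.5x factor would put a 256-bit card near the
# 8800GTX's 86.4 GB/s of raw bandwidth.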

 

Sylvanas

Diamond Member
Jan 20, 2004
Originally posted by: Rusin
Originally posted by: Azn
Why doesn't Nvidia move to GDDR4 already?

That card is 50MHz faster in the shader and nothing else... I hope they have some uber tweaks or some magic drivers.
Because they still have this plan: go from GDDR3 straight to GDDR5. Expect G100 high-end cards to be Nvidia's first GDDR5 products [late Q2 / early Q3].

Source?
 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: munky
Originally posted by: JAG87
You guys are retarded if you think this card is going to perform like an 8800GT or GTS. The 9600GT has only 64 shaders / 32 TMUs, while the GT has 112 shaders / 56 TMUs, and the GTS has 128 shaders / 64 TMUs.

Yet the 9600GT is practically on par with the 8800GT and not too far behind the GTS. Imagine a card with 128 of the same shaders the 9600GT has, and 64 TMUs. I think we are looking at almost double the performance of an 8800GT. This is not the same core that's in the 8800GT and GTS.

And if this revised G92 core is going into the GX2 as well, boy, that's going to be one heck of a card, up to 4 times faster than a stock 8800GT. My only gripe is that quad SLI will probably suck donkey balls again, and that 2 GTXs in SLI will end up being the fastest graphics setup again.

Anyone recall the 7900 series? Sound familiar?


PS: I am still shaking my head at the 256-bit bus and 512 MB memory. All this graphics horsepower will probably get destroyed at high resolutions.

The 8800GT is already bottlenecked by the 256-bit bus; that's why the 9600GT is so close, IMO. The 8800GT is about 27% faster on average without AA, and the lead shrinks to only 15% with AA enabled. Which makes these rumored specs even more ridiculous, seeing how it's just an OC'd 8800GTS with an extra SLI connector, if these specs are correct.


I am going to have to disagree with that as well. The 8800GT is practically neck and neck with the 8800GTX, and the 8800GTX has a 384-bit bus. The 8800GTX only shows its muscle at 1920x1200 with 8xAA or 2560x1600 with 4xAA; otherwise they are totally even. How is it that the 8800GT is bottlenecked again? At super duper high res, yes, I agree with you, but that's not what most people are concerned about. I am, but not the average Joe.

Reading what Sylvanas pointed out, maybe nvidia found a way to make a 256-bit bus suffice, and all they did was take G92 and optimize its architecture to make it as efficient as G94. I stand by my "twice as fast as 8800GT/GTS" prediction. I just hope this compression-technology thingamajig combined with a piss-poor 256-bit bus doesn't crap out at 2560x1600, because that would be a real shame for a high-end card.

On the good side, the card should be cheaper, because a 256-bit bus is much cheaper to produce than a 384-bit one. Considering the GX2 is rumored to cost $599 US, the GTX should end up around $499-549 US, below the 8800GTX launch price of $599 US.
 

n7

Elite Member
Jan 4, 2004
This is utter bullshit.

256 bit?
512 MB?

WTF kinda downgrading garbage is this?! :|

1.5 yrs. after the 8800 GTX we get a bandwidth crippled piece of crap?

I'm more than unhappy if this is really true.

Sure, for the average Joe with his 22" monitor (no offence to those with 22"ers :p), this is great, but some of us run large displays & like AA.

This is like the biggest joke ever. :frown:

Unbelievable.

 

Tempered81

Diamond Member
Jan 29, 2007
Originally posted by: JAG87
Originally posted by: munky
Originally posted by: JAG87
You guys are retarded if you think this card is going to perform like an 8800GT or GTS. The 9600GT has only 64 shaders / 32 TMUs, while the GT has 112 shaders / 56 TMUs, and the GTS has 128 shaders / 64 TMUs.

Yet the 9600GT is practically on par with the 8800GT and not too far behind the GTS. Imagine a card with 128 of the same shaders the 9600GT has, and 64 TMUs. I think we are looking at almost double the performance of an 8800GT. This is not the same core that's in the 8800GT and GTS.

And if this revised G92 core is going into the GX2 as well, boy, that's going to be one heck of a card, up to 4 times faster than a stock 8800GT. My only gripe is that quad SLI will probably suck donkey balls again, and that 2 GTXs in SLI will end up being the fastest graphics setup again.

Anyone recall the 7900 series? Sound familiar?


PS: I am still shaking my head at the 256-bit bus and 512 MB memory. All this graphics horsepower will probably get destroyed at high resolutions.

The 8800GT is already bottlenecked by the 256-bit bus; that's why the 9600GT is so close, IMO. The 8800GT is about 27% faster on average without AA, and the lead shrinks to only 15% with AA enabled. Which makes these rumored specs even more ridiculous, seeing how it's just an OC'd 8800GTS with an extra SLI connector, if these specs are correct.


I am going to have to disagree with that as well. The 8800GT is practically neck and neck with the 8800GTX, and the 8800GTX has a 384-bit bus. The 8800GTX only shows its muscle at 1920x1200 with 8xAA or 2560x1600 with 4xAA; otherwise they are totally even. How is it that the 8800GT is bottlenecked again? At super duper high res, yes, I agree with you, but that's not what most people are concerned about. I am, but not the average Joe.

Reading what Sylvanas pointed out, maybe nvidia found a way to make a 256-bit bus suffice, and all they did was take G92 and optimize its architecture to make it as efficient as G94. I stand by my "twice as fast as 8800GT/GTS" prediction. I just hope this compression-technology thingamajig combined with a piss-poor 256-bit bus doesn't crap out at 2560x1600, because that would be a real shame for a high-end card.

On the good side, the card should be cheaper, because a 256-bit bus is much cheaper to produce than a 384-bit one. Considering the GX2 is rumored to cost $599 US, the GTX should end up around $499-549 US, below the 8800GTX launch price of $599 US.


That's how it should be. Was there negative speculation at the launch of the G80?
 

HOOfan 1

Platinum Member
Sep 2, 2007
Originally posted by: n7
This is utter bullshit.

256 bit?
512 MB?

WTF kinda downgrading garbage is this?! :|

1.5 yrs. after the 8800 GTX we get a bandwidth crippled piece of crap?

I'm more than unhappy if this is really true.

Sure, for the average Joe with his 22" monitor (no offence to those with 22"ers :p), this is great, but some of us run large displays & like AA.

This is like the biggest joke ever. :frown:

Unbelievable.

A 256-bit memory interface is very, very disappointing.

The biggest disappointment is that this will probably push back the REAL next-gen card by another 6 months or a year.

 

apoppin

Lifer
Mar 9, 2000
Originally posted by: n7
This is utter bullshit.

256 bit?
512 MB?

WTF kinda downgrading garbage is this?! :|

1.5 yrs. after the 8800 GTX we get a bandwidth crippled piece of crap?

I'm more than unhappy if this is really true.

Sure, for the average Joe with his 22" monitor (no offence to those with 22"ers :p), this is great, but some of us run large displays & like AA.

This is like the biggest joke ever. :frown:

Unbelievable.

personally ... and this is just me ... i believe nvidia is attempting MISDIRECTION at the EXACT SAME TIME r700 is taping out ...
... evidently that was over a month ago .. so they would be finalizing 'clocks'
If nvidia's ploy works, AMD will underestimate them and DOWNclock r700

---figure it out :p

... and ...

... AMD already knows ;)
-they are much smarter and more devious than ATi ever was
 

n7

Elite Member
Jan 4, 2004
Eh, i seriously don't believe they're pulling this kinda crap.

I thought AMD was far from ever taking the performance crown back (X2 doesn't count IMO), but wow, if this is the kinda BS nVidia wants to pull, they won't see any of my money.

I realize this card will fly @ lower resolutions, but flagship cards aren't supposed to fly @ low resolutions, they're supposed to dominate for high end setups.

There is simply no possible way for this not to choke @ 2560x1600, since 256 bit + 512 MB is just not enough.
No amount of magical tweaking is going to fix that.

I gotta say i am beyond belief... truly remarkable how retarded this is.
 

JAG87

Diamond Member
Jan 3, 2006
Here is another gem for you guys: you wanna know how the 9800 GTX is going to perform? Look at 9600GT SLI results. 64 shaders and 32 TMUs x 2 = 128 shaders and 64 TMUs.

Keep in mind that SLI results are usually 1.8x the performance of one card, while the GTX will probably be more than 2x the performance of a 9600GT. Consider that all the clock speeds will be a tiny bit higher, and that there are no SLI scaling issues.

Here is a review: keep an eye on the FEAR results, as FEAR is a game that has always scaled nearly perfectly with SLI. Feel free to check out other games as well, but focus on FEAR, because it's a good indication of 9800GTX performance.

http://www.firingsquad.com/har..._performance/page3.asp

After having read that, picture that the 9800GX2 is a card with TWO of these new G92 chips on it. It is clocked slightly lower than the GTX, but I would say we can expect 1.7-1.8x the performance of a 9800GTX (barring SLI/driver issues on the GX2).

I have to be sincere: the GX2 is looking like a damn good card, especially for those with Intel chipsets. And if nvidia got Quad SLI to work this time, all I have to say is ZOMG.



PS: As you can see, my estimate of 2x 8800GT performance is quite on target.
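
JAG87's reasoning above reduces to a couple of multiplications. A minimal sketch of it, where every FPS value is a placeholder rather than a benchmark and the 1.8x SLI factor is the rule of thumb from the post:

# Sketch of the scaling argument: a single card with doubled units skips
# SLI overhead, so it should match or beat two 9600GTs in SLI.
# All numbers here are placeholders, not measurements.
SLI_SCALING = 1.8    # typical SLI gain over one card (from the post)
CLOCK_BUMP = 1.05    # assumed small clock advantage for the 9800GTX

def estimate_sli(fps_one_card: float) -> float:
    return fps_one_card * SLI_SCALING

def estimate_9800gtx(fps_9600gt: float) -> float:
    return fps_9600gt * 2.0 * CLOCK_BUMP   # perfect 2x, no SLI overhead

fps_9600gt = 50.0    # hypothetical single-9600GT result in some game
print(f"9600GT SLI : ~{estimate_sli(fps_9600gt):.0f} fps")
print(f"9800GTX    : ~{estimate_9800gtx(fps_9600gt):.0f} fps")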
 

Tempered81

Diamond Member
Jan 29, 2007
Originally posted by: JAG87
Here is another gem for you guys: you wanna know how the 9800 GTX is going to perform? Look at 9600GT SLI results. 64 shaders and 32 TMUs x 2 = 128 shaders and 64 TMUs.

Keep in mind that SLI results are usually 1.8x the performance of one card, while the GTX will probably be more than 2x the performance of a 9600GT. Consider that all the clock speeds will be a tiny bit higher, and that there are no SLI scaling issues.

Here is a review: keep an eye on the FEAR results, as FEAR is a game that has always scaled nearly perfectly with SLI. Feel free to check out other games as well, but focus on FEAR, because it's a good indication of 9800GTX performance.

http://www.firingsquad.com/har..._performance/page3.asp

After having read that, picture that the 9800GX2 is a card with TWO of these new G92 chips on it. It is clocked slightly lower than the GTX, but I would say we can expect 1.7-1.8x the performance of a 9800GTX (barring SLI/driver issues on the GX2).

I have to be sincere: the GX2 is looking like a damn good card, especially for those with Intel chipsets. And if nvidia got Quad SLI to work this time, all I have to say is ZOMG.



PS: As you can see, my estimate of 2x 8800GT performance is quite on target.



If that's the case, it should perform just like an 8800 Ultra in Crysis at 1920x1200:
http://www.firingsquad.com/har..._performance/page9.asp
 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: n7
Eh, i seriously don't believe they're pulling this kinda crap.

I thought AMD was far from ever taking the performance crown back (X2 doesn't count IMO), but wow, if this is the kinda BS nVidia wants to pull, they won't see any of my money.

I realize this card will fly @ lower resolutions, but flagship cards aren't supposed to fly @ low resolutions, they're supposed to dominate for high end setups.

There is simply no possible way for this not to choke @ 2560x1600, since 256 bit + 512 MB is just not enough.
No amount of magical tweaking is going to fix that.

I gotta say i am beyond belief... truly remarkable how retarded this is.



I've got one leg in your lawn and one leg in mine (read the above post). I truly hope I don't trip and fall into your lawn, 'cause then there will be 2 raging 3007ers in the video section of the AT forums.

:laugh:
 

n7

Elite Member
Jan 4, 2004
Well, i dearly hope you're right, but even if it does perform like 9600GT SLI, i'm still very disappointed.

And I have no interest in Crossfire Combo Cards™ or SLI Sandwiches™.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: n7
Eh, i seriously don't believe they're pulling this kinda crap.

I thought AMD was far from ever taking the performance crown back (X2 doesn't count IMO), but wow, if this is the kinda BS nVidia wants to pull, they won't see any of my money.

I realize this card will fly @ lower resolutions, but flagship cards aren't supposed to fly @ low resolutions, they're supposed to dominate for high end setups.

There is simply no possible way for this not to choke @ 2560x1600, since 256 bit + 512 MB is just not enough.
No amount of magical tweaking is going to fix that.

I gotta say i am beyond belief... truly remarkable how retarded this is.

i don't think we are seeing nvidia's top card in the leaked 'previews'
--that is all ;)

i tend to over-dramatize things anyway
:confused:

 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: jaredpace
Originally posted by: JAG87
Here is another gem for you guys: you wanna know how the 9800 GTX is going to perform? Look at 9600GT SLI results. 64 shaders and 32 TMUs x 2 = 128 shaders and 64 TMUs.

Keep in mind that SLI results are usually 1.8x the performance of one card, while the GTX will probably be more than 2x the performance of a 9600GT. Consider that all the clock speeds will be a tiny bit higher, and that there are no SLI scaling issues.

Here is a review: keep an eye on the FEAR results, as FEAR is a game that has always scaled nearly perfectly with SLI. Feel free to check out other games as well, but focus on FEAR, because it's a good indication of 9800GTX performance.

http://www.firingsquad.com/har..._performance/page3.asp

After having read that, picture that the 9800GX2 is a card with TWO of these new G92 chips on it. It is clocked slightly lower than the GTX, but I would say we can expect 1.7-1.8x the performance of a 9800GTX (barring SLI/driver issues on the GX2).

I have to be sincere: the GX2 is looking like a damn good card, especially for those with Intel chipsets. And if nvidia got Quad SLI to work this time, all I have to say is ZOMG.



PS: As you can see, my estimate of 2x 8800GT performance is quite on target.



If that's the case, it should perform just like an 8800 Ultra in Crysis at 1920x1200:
http://www.firingsquad.com/har..._performance/page9.asp


Yea, but Crysis is a horrible game to benchmark. It doesn't scale well with SLI, and especially in DX10 it's a bit of a mixed bag with results. Look at HL2 as well; that's another horrible game to benchmark, completely CPU limited in this case.
 

Sylvanas

Diamond Member
Jan 20, 2004
Originally posted by: JAG87
Here is another gem for you guys: you wanna know how the 9800 GTX is going to perform? Look at 9600GT SLI results. 64 shaders and 32 TMUs x 2 = 128 shaders and 64 TMUs.

Keep in mind that SLI results are usually 1.8x the performance of one card, while the GTX will probably be more than 2x the performance of a 9600GT. Consider that all the clock speeds will be a tiny bit higher, and that there are no SLI scaling issues.

Here is a review: keep an eye on the FEAR results, as FEAR is a game that has always scaled nearly perfectly with SLI. Feel free to check out other games as well, but focus on FEAR, because it's a good indication of 9800GTX performance.

http://www.firingsquad.com/har..._performance/page3.asp

After having read that, picture that the 9800GX2 is a card with TWO of these new G92 chips on it. It is clocked slightly lower than the GTX, but I would say we can expect 1.7-1.8x the performance of a 9800GTX (barring SLI/driver issues on the GX2).

I have to be sincere: the GX2 is looking like a damn good card, especially for those with Intel chipsets. And if nvidia got Quad SLI to work this time, all I have to say is ZOMG.

That's a really selective bench, as we all know FEAR scales VERY well with multi-GPU setups. Ultimately the performance of a card cannot be predicted by a few games, as each engine is vastly different. Have a look at Crysis (which I believe is the benchmark for the 'next generation') from that same review: the results are pathetic there.

I'm sick of waiting for a 'REAL' next generation... Nvidia/ATI should be looking at games like Crysis (Fallout 3 and other titles this year also look very demanding, so it's not just 'one game').
 

Tempered81

Diamond Member
Jan 29, 2007
Originally posted by: n7


And I have no interest in Crossfire Combo Cards™ or SLI Sandwiches™.

Agree with you on that, for sure!

If only Crossfire & SLI had a 100% success rate.
 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: n7
Well, i dearly hope you're right, but even if it does perform like 9600GT SLI, i'm still very disappointed.

And I have no interest in Crossfire Combo Cards™ or SLI Sandwiches™.

You have a Q6700; that's a Conroe sandwich. But you bought it 'cause it performs like 2x Conroe, right?

So what if the GX2 turns out to perform like that? Let's say nvidia got the drivers right this time. Would you still be against it? I totally agree with you, but I am just telling you that prejudging a card negatively ahead of real benchmarks is pointless.