Originally posted by: Piuc2020
If it's multicore... what the hell is Nvidia's motivation to release a 9800GX2 card, only to follow it with an almost identical dual-9800 card less than 2 weeks later? Come on guys, NV is smarter than this, and you should be too, instead of spreading FUD like it's peanut butter.
Unless of course there is actually a reason why Nvidia would choose to release almost identically performing multi-GPU cards less than 2 weeks apart under different names...
Originally posted by: JAG87
You guys are retarded if you think this card is going to perform like an 8800GT or GTS. The 9600GT has only 64 shaders / 32 TMUs, while the GT has 112 shaders / 56 TMUs, and the GTS has 128 shaders / 64 TMUs.
Yet the 9600GT is practically on par with the 8800GT and not too far behind the GTS. Imagine a card with 128 of the same shaders the 9600GT has, plus 64 TMUs. I think we are looking at almost double the performance of an 8800GT. This is not the same core that's in the 8800GT and GTS.
And if this revised G92 core is going into the GX2 as well, boy, that's going to be one heck of a card: up to 4 times faster than a stock 8800GT. My only gripe is that quad SLI will probably suck donkey balls again, and that 2 GTXs in SLI will end up being the fastest graphics setup again.
Anyone recall the 7900 series? Sound familiar?
PS: I am still shaking my head at the 256-bit bus and 512 MB of memory. All this graphics horsepower will probably get destroyed at high resolutions.
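For reference, a quick back-of-the-envelope sketch (Python) of the numbers this argument rests on. The clock speeds are the commonly cited reference values and the linear "units times clock" model is my own simplification, neither comes from the thread:

```python
# Rough sanity check of the unit-count argument: naive throughput ~ units x clock.
# Clock speeds are the usual reference values (my assumption, not from the thread).
cards = {
    #             shaders, shader clk (MHz), TMUs, core clk (MHz)
    "9600GT":      (64, 1625, 32, 650),
    "8800GT":      (112, 1500, 56, 600),
    "8800GTS 512": (128, 1625, 64, 650),
}

sp0, sclk0, tmu0, cclk0 = cards["8800GT"]
for name, (sp, sclk, tmu, cclk) in cards.items():
    shader = sp * sclk / (sp0 * sclk0)
    tex = tmu * cclk / (tmu0 * cclk0)
    print(f"{name:12s} shader {shader:.2f}x, texture {tex:.2f}x of an 8800GT")

# On paper the 9600GT has only ~0.6x the raw throughput of an 8800GT, yet it
# benchmarks close to it; that gap is what the "more efficient core" claim
# (and the ~2x projection for a 128-shader version) is trying to explain.
```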
Originally posted by: jaredpace
I hope all these damn specification rumors & speculation are false/fake. Otherwise the R700 better kick serious ass, because if not there is no competition. This really sucks. The 9xxx series is looking like a launch on the opposite end of the spectrum compared to R300 & G80. It's more like a GeForce FX launch.
Originally posted by: Rusin
Originally posted by: Azn
Why doesn't Nvidia move to GDDR4 already? That card is 50MHz faster in the shader and nothing else.. I hope they have some uber tweaks or some magic drivers.
Because they still have this plan: GDDR3 -> GDDR5 directly. Expect G100 high-end cards to be the first GDDR5 products from Nvidia [late Q2 / early Q3].
Originally posted by: munky
Originally posted by: JAG87
You guys are retarded if you think this card is going to perform like an 8800GT or GTS. The 9600GT has only 64 shaders / 32 TMUs, while the GT has 112 shaders / 56 TMUs, and the GTS has 128 shaders / 64 TMUs.
Yet the 9600GT is practically on par with the 8800GT and not too far behind the GTS. Imagine a card with 128 of the same shaders the 9600GT has, plus 64 TMUs. I think we are looking at almost double the performance of an 8800GT. This is not the same core that's in the 8800GT and GTS.
And if this revised G92 core is going into the GX2 as well, boy, that's going to be one heck of a card: up to 4 times faster than a stock 8800GT. My only gripe is that quad SLI will probably suck donkey balls again, and that 2 GTXs in SLI will end up being the fastest graphics setup again.
Anyone recall the 7900 series? Sound familiar?
PS: I am still shaking my head at the 256-bit bus and 512 MB of memory. All this graphics horsepower will probably get destroyed at high resolutions.
The 8800GT is already bottlenecked by the 256-bit bus; that's why the 9600GT is so close, IMO. The 8800GT is about 27% faster on average without AA, and the lead shrinks to only 15% with AA enabled. Which makes these rumored specs even more ridiculous, seeing how it's just an OC'd 8800GTS with an extra SLI connector, if these specs are correct.
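For what it's worth, the bandwidth numbers behind that argument are easy to work out. A small sketch follows; the memory clocks are the usual reference values (my assumption, not stated in the thread):

```python
# Memory bandwidth = (bus width in bytes) x (effective GDDR3 data rate).
# 1800 MHz effective (900 MHz actual) is the reference memory clock I'm assuming
# for all three cards; it is not a figure quoted in this thread.
def bandwidth_gb_s(bus_bits, mem_mhz_effective):
    return bus_bits / 8 * mem_mhz_effective * 1e6 / 1e9

print("9600GT  :", bandwidth_gb_s(256, 1800), "GB/s")   # ~57.6
print("8800GT  :", bandwidth_gb_s(256, 1800), "GB/s")   # ~57.6, same as the 9600GT
print("8800GTX :", bandwidth_gb_s(384, 1800), "GB/s")   # ~86.4
```

Identical bandwidth for the 9600GT and 8800GT is at least consistent with the claim that the shrinking lead under AA looks like a bandwidth ceiling rather than a shader one.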
Originally posted by: JAG87
Originally posted by: munky
Originally posted by: JAG87
You guys are retarded if you think this card is going to perform like an 8800GT or GTS. The 9600GT has only 64 shaders / 32 TMUs, while the GT has 112 shaders / 56 TMUs, and the GTS has 128 shaders / 64 TMUs.
Yet the 9600GT is practically on par with the 8800GT and not too far behind the GTS. Imagine a card with 128 of the same shaders the 9600GT has, plus 64 TMUs. I think we are looking at almost double the performance of an 8800GT. This is not the same core that's in the 8800GT and GTS.
And if this revised G92 core is going into the GX2 as well, boy, that's going to be one heck of a card: up to 4 times faster than a stock 8800GT. My only gripe is that quad SLI will probably suck donkey balls again, and that 2 GTXs in SLI will end up being the fastest graphics setup again.
Anyone recall the 7900 series? Sound familiar?
PS: I am still shaking my head at the 256-bit bus and 512 MB of memory. All this graphics horsepower will probably get destroyed at high resolutions.
The 8800GT is already bottlenecked by the 256-bit bus; that's why the 9600GT is so close, IMO. The 8800GT is about 27% faster on average without AA, and the lead shrinks to only 15% with AA enabled. Which makes these rumored specs even more ridiculous, seeing how it's just an OC'd 8800GTS with an extra SLI connector, if these specs are correct.
I am going to have to disagree with that as well. The 8800GT is practically neck and neck with the 8800GTX, and the 8800GTX has a 384-bit bus. The 8800GTX only shows its muscle at 1920x1200 with 8xAA or 2560x1600 with 4xAA; otherwise they are totally even. How is it that the 8800GT is bottlenecked again? At super duper high res, yes, I agree with you, but that's not what most people are concerned about. I am, but not the average Joe.
Reading what Sylvanas pointed out, maybe Nvidia found a way to make a 256-bit bus suffice, and all they did was take G92 and optimize its architecture to make it as efficient as G94. I stand by my "twice as fast as 8800GT/GTS" prediction. I just hope this compression technology thingamajig combined with a piss-poor 256-bit bus doesn't crap out at 2560x1600, because that would be a real shame for a high-end card.
On the good side, the card should be cheaper, because a 256-bit bus is much cheaper to produce than a 384-bit one. Considering the GX2 is rumored to cost $599 US, the GTX should end up around $499-549 US, which is below the 8800GTX launch price of $599 US.
Originally posted by: n7
This is utter bullshit.
256 bit?
512 MB?
WTF kinda downgrading garbage is this?! :|
1.5 yrs. after the 8800 GTX, we get a bandwidth-crippled piece of crap?
I'm more than unhappy if this is really true.
Sure, for the average Joe with his 22" monitor (no offence to those with 22"ers), this is great, but some of us run large displays & like AA.
This is like the biggest joke ever. :frown:
Unbelievable.
Originally posted by: JAG87
Here is another gem for you guys: you wanna know how the 9800 GTX is going to perform? Look at 9600GT SLI results. 64 shaders and 32 TMUs x 2 = 128 shaders and 64 TMUs.
Keep in mind that SLI results are usually 1.8x the performance of one card, while the GTX will probably be more than 2x the performance of a 9600GT. Consider that all the clock speeds will be a tiny bit higher, and that there are no SLI scaling issues.
Here is a review: keep an eye on the FEAR results, as FEAR is a game that has always scaled nearly perfectly with SLI. Feel free to check out other games as well, but focus on FEAR, because it's a good indication of 9800GTX performance.
http://www.firingsquad.com/har..._performance/page3.asp
After having read that, picture that the 9800GX2 is a card with TWO of these new G92 chips on it. It is clocked slightly lower than the GTX, but I would say we can expect 1.7-1.8x the performance of a 9800GTX (barring SLI/driver issues on the GX2).
I have to be sincere, the GX2 is looking like a damn good card, especially for those with Intel chipsets. And if Nvidia got quad SLI to work this time, all I have to say is ZOMG.
PS: As you can see, my estimate of 2x 8800GT performance is quite on target.
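To put rough numbers on that estimate, here is a purely illustrative sketch. The 1.8x SLI scaling figure, the clock bump, and the 1.7-1.8x GX2 multiplier are all assumptions taken from the post above, not measurements:

```python
# Illustrative math behind the "look at 9600GT SLI" prediction.
# Every factor below is an assumption from the post above, not benchmark data.
perf_9600gt = 1.0                        # normalize a single 9600GT to 1.0
perf_9600gt_sli = perf_9600gt * 1.8      # typical SLI scaling claimed above (~1.8x)

# A single 128-shader/64-TMU card avoids SLI overhead and is said to clock a bit
# higher, so call it a full 2x a 9600GT plus a small clock bump (say ~5%).
perf_9800gtx_est = perf_9600gt * 2.0 * 1.05

# The GX2 is two of those chips, clocked a little lower, scaling like SLI.
perf_9800gx2_est = perf_9800gtx_est * 1.75   # midpoint of the 1.7-1.8x guess

print(f"9600GT SLI ~ {perf_9600gt_sli:.2f}x, 9800GTX est ~ {perf_9800gtx_est:.2f}x, "
      f"9800GX2 est ~ {perf_9800gx2_est:.2f}x of a single 9600GT")
```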
Originally posted by: n7
Eh, i seriously don't believe they're pulling this kinda crap.
I thought AMD was far from ever taking the performance crown back (X2 doesn't count IMO), but wow, if this is the kinda BS nVidia wants to pull, they won't see any of my money.
I realize this card will fly @ lower resolutions, but flagship cards aren't supposed to fly @ low resolutions, they're supposed to dominate in high-end setups.
There is simply no possible way for this not to choke @ 2560x1600, since 256-bit + 512 MB is just not enough.
No amount of magical tweaking is going to fix that.
I gotta say i am beyond belief... truly remarkable how retarded this is.
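A rough idea of why 512 MB gets tight at that resolution. This is simplified accounting; real drivers and engines allocate differently, so treat the figures as ballpark only:

```python
# Ballpark render-target memory at 2560x1600 with 4x MSAA (simplified accounting;
# actual allocations vary by driver and engine, so these are rough figures only).
w, h, samples = 2560, 1600, 4
bpp_color, bpp_depth = 4, 4            # 8-bit RGBA and 24/8 depth-stencil, 4 bytes each

mb = 1024 * 1024
color_msaa = w * h * bpp_color * samples / mb     # multisampled color buffer
depth_msaa = w * h * bpp_depth * samples / mb     # multisampled depth/stencil
resolve    = w * h * bpp_color * 2 / mb           # resolved front + back buffers

print(f"MSAA color: {color_msaa:.0f} MB, MSAA depth: {depth_msaa:.0f} MB, "
      f"resolve/front+back: {resolve:.0f} MB")
print(f"Total before any textures or extra render targets: "
      f"{color_msaa + depth_msaa + resolve:.0f} MB of 512 MB")
```

Even before textures, shadow maps, or extra render targets, that's roughly 150 MB gone, which is why the 512 MB figure looks so marginal for 2560x1600 with AA.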
Originally posted by: jaredpace
Originally posted by: JAG87
Here is another gem for you guys: you wanna know how the 9800 GTX is going to perform? Look at 9600GT SLI results. 64 shaders and 32 TMUs x 2 = 128 shaders and 64 TMUs.
Keep in mind that SLI results are usually 1.8x the performance of one card, while the GTX will probably be more than 2x the performance of a 9600GT. Consider that all the clock speeds will be a tiny bit higher, and that there are no SLI scaling issues.
Here is a review: keep an eye on the FEAR results, as FEAR is a game that has always scaled nearly perfectly with SLI. Feel free to check out other games as well, but focus on FEAR, because it's a good indication of 9800GTX performance.
http://www.firingsquad.com/har..._performance/page3.asp
After having read that, picture that the 9800GX2 is a card with TWO of these new G92 chips on it. It is clocked slightly lower than the GTX, but I would say we can expect 1.7-1.8x the performance of a 9800GTX (barring SLI/driver issues on the GX2).
I have to be sincere, the GX2 is looking like a damn good card, especially for those with Intel chipsets. And if Nvidia got quad SLI to work this time, all I have to say is ZOMG.
PS: As you can see, my estimate of 2x 8800GT performance is quite on target.
If that's the case, it should perform just like an 8800 Ultra in Crysis at 1920x1200:
http://www.firingsquad.com/har..._performance/page9.asp
Originally posted by: n7
Originally posted by: JAG87
Here is another gem for you guys: you wanna know how the 9800 GTX is going to perform? Look at 9600GT SLI results. 64 shaders and 32 TMUs x 2 = 128 shaders and 64 TMUs.
Keep in mind that SLI results are usually 1.8x the performance of one card, while the GTX will probably be more than 2x the performance of a 9600GT. Consider that all the clock speeds will be a tiny bit higher, and that there are no SLI scaling issues.
Here is a review: keep an eye on the FEAR results, as FEAR is a game that has always scaled nearly perfectly with SLI. Feel free to check out other games as well, but focus on FEAR, because it's a good indication of 9800GTX performance.
http://www.firingsquad.com/har..._performance/page3.asp
After having read that, picture that the 9800GX2 is a card with TWO of these new G92 chips on it. It is clocked slightly lower than the GTX, but I would say we can expect 1.7-1.8x the performance of a 9800GTX (barring SLI/driver issues on the GX2).
I have to be sincere, the GX2 is looking like a damn good card, especially for those with Intel chipsets. And if Nvidia got quad SLI to work this time, all I have to say is ZOMG.
Well, i dearly hope you're right, but even if it does perform like the 9600GT SLI, i'm still very disappointed.
And I have no interest in Crossfire Combo Cards™ or SLI Sandwiches™.