JackBurton
Lifer
The 9700 Pro should look REALLY good once the GeForce FX benchmarks come out. ATi is going to be selling the 9700 Pro as a package: you get a fast as hell video card PLUS an extra PCI slot. 
			
Anyone else find it odd that this website, which has a forum with a grand total of 3,000 posts, also claims to have tested a prototype Hammer CPU?
I guess this site is getting HAMMERED (as it's the only review up I can find) with some pages not loading.
Originally posted by: GTaudiophile
ATI has certainly raised the stakes with Radeon 9700 Pro, and if GeForceFX was coming into this brawl looking for a knockout, it didn't get one. Actually, what we saw is that the memory sub-system of GeForceFX hits a pretty hard wall when you combine a high resolution and bandwidth-hungry rendering features like FSAA and AF...
For nVidia, GeForceFX represents a return to, at the very least, performance parity with ATI. For ATI, however, Radeon 9700 Pro looks strong versus GeForceFX, and these results show just how much performance ground nVidia had lost to ATI when Radeon 9700 Pro first shipped.
3DMark 2001SE Overall Results
What we generally found is that in the baseline tests, the GeForceFX's 54% engine clock advantage gave it a clear lead in those tests that stressed the GPU more than the graphics memory. On the other hand, when we piled on both FSAA and AF, the GeForceFX's memory interface showed that it has the potential to bottleneck this GPU's many powerful processing engines.
Baseline: GeForceFX ahead by 6%
With FSAA & AF: Radeon 9700 Pro ahead by 30%
Fill Rate Performance:
Baseline Single-Texture: Radeon 9700 Pro ahead by 15%
Single-Texture with FSAA & AF: Radeon 9700 Pro ahead by 33%
Baseline Multi-Texture: GeForceFX ahead by 39%
Multi-Texture with FSAA & AF: Radeon 9700 Pro ahead by 21%
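For context, the "54% engine clock advantage" quoted above lines up with the commonly cited core clocks of roughly 500MHz for the GeForceFX 5800 Ultra versus 325MHz for the Radeon 9700 Pro; those clocks are assumed retail specs, not figures taken from this review. A quick sanity check:

```python
# Rough sanity check on the quoted "54% engine clock advantage".
# Core clocks below are assumed retail specs, not figures from the review.
nv30_core_mhz = 500   # GeForceFX 5800 Ultra core clock (assumed)
r300_core_mhz = 325   # Radeon 9700 Pro core clock (assumed)

advantage_pct = (nv30_core_mhz / r300_core_mhz - 1) * 100
print(f"GeForceFX core clock advantage: {advantage_pct:.0f}%")  # ~54%
```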
WTH were those jackarses thinking by not using a 256-bit interface? 3dLabs, ATI, and even the jabronis at Matrox figured it out. Hell, I think even BitBoys had a 256-bit interface. WTF was Nvidia thinking?
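For what it's worth, the bandwidth math behind that complaint is straightforward. The sketch below assumes the commonly cited memory specs (128-bit bus at 500MHz DDR for the GeForceFX, 256-bit bus at 310MHz DDR for the Radeon 9700 Pro); these are assumptions, not numbers from the review:

```python
# Back-of-the-envelope memory bandwidth comparison.
# Bus widths and memory clocks are assumed retail specs, not review figures.
def bandwidth_gb_per_s(bus_bits: int, ddr_clock_mhz: float) -> float:
    # bytes per transfer * effective transfers per second (DDR = 2x clock)
    return (bus_bits / 8) * (ddr_clock_mhz * 2 * 1e6) / 1e9

geforce_fx = bandwidth_gb_per_s(128, 500)    # ~16.0 GB/s
radeon_9700 = bandwidth_gb_per_s(256, 310)   # ~19.8 GB/s

print(f"GeForceFX (128-bit @ 500MHz DDR):       {geforce_fx:.1f} GB/s")
print(f"Radeon 9700 Pro (256-bit @ 310MHz DDR): {radeon_9700:.1f} GB/s")
print(f"9700 Pro bandwidth advantage: {(radeon_9700 / geforce_fx - 1) * 100:.0f}%")
```

Even at a much higher memory clock, the narrower 128-bit bus leaves the GeForceFX with less raw bandwidth than the 9700 Pro, which lines up with the FSAA/AF results above.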
Heaps of people do, including myself. My standard gaming resolution is 1600 x 1200 x 32 with 16x anisotropic, but I'll go to 1792 x 1344 if the game is fast enough.
How many of you actually run games at 1600x1200x32???
It must really bug you that the NV30 is such a big DISAPPOINTMENT - at least for those expecting a "Radeon Killer".
Originally posted by: chizow
Jumping the gun again folks...read the fine print:
We would normally have put the GeForceFX through our usual battery of tests at the two standard test resolutions, 1024x768x32 and 1600x1200x32. But this being the clash of the titans, we wanted to really sock it to 'em. So, with that in mind, and because of the severely limited time window in which we had to test, we tested at 1600x1200x32, and gathered baseline values without either FSAA or Anisotropic Filtering (AF) enabled. We then added 4X FSAA and 8X AF to both the GeForceFX and the Radeon 9700 Pro to see which card it would hit harder.
How many of you actually run games at 1600x1200x32??? Maybe I should re-phrase and ask how many of you actually have monitors that support 1600x1200 w/out the use of a magnifying glass.
I'll wait till a thorough review is done by a reputable site.
Chiz
Looks like their drivers might not be up to par either.
NHL 2002 gave the GeForceFX fits, and ultimately wouldn't run. The error we repeatedly got is one we've seen testing on ATI hardware as well, so we don't believe it to be specific to the GeForceFX. But, as a result of our inability to get a complete test run in, we dropped NHL 2002 from this round of testing for both cards.
I would tend to agree Chiz, but it really is amazing that a company with the size and reputation of Nvidia cannot decisively beat an aged product 6 months after its release. Nvidia has never been in this position before; it will be interesting to see what they do with the market positioning/pricing of this card, as it surely is not dominant and will not command a premium price over the Radeons.
Originally posted by: chizow
Jumping the gun again folks...read the fine print:
We would normally have put the GeForceFX through our usual battery of tests at the two standard test resolutions, 1024x768x32 and 1600x1200x32. But this being the clash of the titans, we wanted to really sock it to 'em. So, with that in mind, and because of the severely limited time window in which we had to test, we tested at 1600x1200x32, and gathered baseline values without either FSAA or Anisotropic Filtering (AF) enabled. We then added 4X FSAA and 8X AF to both the GeForceFX and the Radeon 9700 Pro to see which card it would hit harder.
How many of you actually run games at 1600x1200x32??? Maybe I should re-phrase and ask how many of you actually have monitors that support 1600x1200 w/out the use of a magnifying glass.
I'll wait till a thorough review is done by a reputable site.
Chiz
Originally posted by: BFG10K
Heaps of people do, including myself. My standard gaming resolution is 1600 x 1200 x 32 with 16x anisotropic, but I'll go to 1792 x 1344 if the game is fast enough.
How many of you actually run games at 1600x1200x32???
It must really bug you that the NV30 is such a big DISAPPOINTMENT - at least for those expecting a "Radeon Killer".
	Originally posted by: Snoop
I would tend to agree Chiz, but it really is amazing that a company with the size and reputation of Nvidia cannot decisively beat an aged product 6 months after its release. Nvidia has never been in this position before; it will be interesting to see what they do with the market positioning/pricing of this card, as it surely is not dominant and will not command a premium price over the Radeons.
IMO, Nvidia has lost its edge (which I do not think they will get back soon).
I'm gonna be selling my stock ASAP.
This is what bothers me: they had PLENTY of time to get this thing right, yet it can't beat a previous-generation product in the majority of cases. Why didn't Nvidia continue to tweak this design while getting the .13 micron process down? SOMEONE at Nvidia needs to explain why they could not come up with a memory sub-system which can match the available bandwidth of all of the other major video card manufacturers.
Snoop, don't forget, NV30's specs were finalized almost a year ago.
Good point. The fact that these cards are able to play games with high framerates in conjunction with all the IQ settings maxed out is a testament to that. The only problem is there is no way to test for future games without early build releases for benchmarking purposes.
Most existing games out on the market don't need half this much graphics processing power to perform decently.
Originally posted by: apoppin
It must really bug you that the NV30 is such a big DISAPPOINTMENT - at least for those expecting a "Radeon Killer".
Posted by: chizow
No, it doesn't bother me at all; I'll just keep buying the highest performing compatible part available. And the rest of the fanboys will continue to buy 2-generation old value parts from their favorite team and then talk about the latest product as if they know something about it.
	