While looking at Anand's review of the new GeForce4, the Unreal 2 demo benchmarks seemed a little off compared to the article from just a little while back. I understand the tests use different builds, but some of the increases seem extreme.
In the older review, using build 848, at 1024x768 32-bit we get:
Radeon 8500 - 51.7 fps
GF3 Ti500 - 48.6 fps
GF3 - 41.0 fps
Found here
With the review today, using build 856, at 1024x768 32-bit we have:
GF4 Ti4600 - 85.6 fps
GF3 Ti500 - 65.3 fps
GF3 - 62.8 fps
Radeon 8500 - 58.7 fps
Found here
So the GF3 goes up 22 fps, over a 50% performance boost, and the Ti500 goes up 16 fps, about a 33% boost. Anand explains that the newer Radeon drivers decrease performance in the benchmark, yet the Radeon still went up 7 fps.
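If anyone wants to double-check those numbers, here's a quick sanity check in Python (nothing assumed beyond the fps figures quoted from the two reviews above):

results = {
    "GF3":         (41.0, 62.8),  # build 848 fps, build 856 fps
    "GF3 Ti500":   (48.6, 65.3),
    "Radeon 8500": (51.7, 58.7),
}

for card, (old, new) in results.items():
    gain = new - old
    print(f"{card}: +{gain:.1f} fps ({gain / old * 100:.0f}% faster)")

That works out to roughly 53% for the GF3, 34% for the Ti500, and 14% for the Radeon 8500.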
I can't imagine the Unreal engine going through such a huge optimization for the GF3 in just a few builds, especially this close to the release date of Unreal 2, and I don't think card-specific optimizations could produce a 50% performance boost unless the older build had absolutely no optimizations at all. And why the Radeon takes a performance "hit" with the new drivers but still posts higher fps, I don't know.
I've also read talk of the GF3 Ti500 showing nearly 100% performance gains over the GF3, as seen in the Unreal 2 Quincunx and Unreal 2 Anisotropic Filtering tests.
Is it the benchmark (the Unreal 2 engine) that's screwy, or is it some other issue?