- Oct 13, 2004
- 24,778
- 4
- 0
It talks the talk but it can't walk the walk...
I compared the Source Stress Test scores I posted here for my 6600GT (banshee overclocked, 583/1140) with runs at the same settings on my 3450 (680/1130):
http://forums.anandtech.com/me...ab=arc&highlight_key=y
Here ya go:
CSS Stress Test (average fps)

1280x960, 2xAA/8xAF, everything else maxed in the game video options:
6600GT (HQ): 64.73
3450 (Optimal Quality, Catalyst AI maxed): 40.83

1280x960, 4xAA/16xAF, everything else maxed:
6600GT (HQ): 51.18
3450 (Optimal Quality, Catalyst AI maxed): 38.93

1600x1200, 2xAA/8xAF, everything else maxed:
6600GT (HQ): 21.35
3450 (Optimal Quality, Catalyst AI maxed): 28.06
Bit dumb: the 3450 is only faster at a resolution you couldn't actually play at.
I also tried Far Cry, and whereas I could play at 1024/2xAA/8xAF/everything maxed with my 6600GT and 9800 Pro, I couldn't with this card... sad face
I was shocked at how bad this card is in the real world given its 3DMark05 score (4k).
It's also interesting how its performance drops off much more slowly (relatively) than the 6600GT's as resolution goes up (pity it starts from such a low point!). Do the architecture changes mean it handles AA/AF more efficiently?
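To put a number on that drop-off, here's a quick sketch (just arithmetic on the fps figures quoted above, at 2xAA/8xAF) showing how much of its frame rate each card loses going from 1280x960 to 1600x1200:

```python
# fps at (1280x960, 1600x1200), both at 2xAA/8xAF, from the results above
results = {
    "6600GT (HQ)": (64.73, 21.35),
    "3450 (Optimal Quality, Catalyst AI maxed)": (40.83, 28.06),
}

for card, (low_res_fps, high_res_fps) in results.items():
    # percentage of frame rate lost moving to the higher resolution
    drop = (low_res_fps - high_res_fps) / low_res_fps * 100
    print(f"{card}: {drop:.1f}% fps lost at 1600x1200")
```

The 6600GT sheds roughly two thirds of its frame rate, while the 3450 loses under a third, which is why it ends up ahead at 1600x1200 despite starting so far behind.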