- Jul 1, 2005
http://www.driverheaven.net/articles/efficiency/
Here is a more "apples to apples" test of the latest cards.
Originally posted by: Acanthus
That article has no bearing on anything. I'm not even going to get into it.
Originally posted by: Cooler
OK, they underclocked that card. Why not just OC the GTX to ATI speeds? Maybe because they can't without extreme cooling. Thus showing the strong point of R520: extreme clock speeds.
Originally posted by: Rage187
Originally posted by: Cooler
Thus showing the strong point of R520: extreme clock speeds.
That's a weakness, genius.
It's about working smarter, not harder; MHz and GHz mean crap. If you have to run at a higher clock rate to keep up, that means your architecture sucks.
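The "smarter, not harder" point can be sketched numerically: total throughput is clock rate times work done per clock, so a wider design with higher per-clock efficiency can match a narrower rival that leans on raw frequency. The pipeline counts and clocks below are made-up illustrations, not real GPU specs.

```python
# Illustrative only: throughput = clock (MHz) * work per clock.
# The numbers are invented to show the tradeoff, not measured specs.

def throughput(clock_mhz, ops_per_clock):
    """Total operations per second, in millions."""
    return clock_mhz * ops_per_clock

# A wider, lower-clocked design...
wide = throughput(430, 24)    # e.g. 24 pipelines at 430 MHz
# ...matches a narrower design that needs a much higher clock.
narrow = throughput(645, 16)  # 16 pipelines at 645 MHz

print(wide, narrow)  # -> 10320 10320
```

Both designs land on the same throughput; the argument in the thread is about which way of getting there is the weaker engineering position.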
The vertex shader figures are also interesting: since both cards have 8 vertex shader units and are clocked at 450/1000, we can see directly which architecture processes vertex shaders more efficiently. With simple vertex shaders the result goes to the R520; with more complex shaders, however, the G70 outperforms the R520.
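Because both cards run the same 450 MHz core clock in that test, per-clock efficiency falls straight out of the raw scores. A minimal sketch of the normalization (the benchmark numbers here are placeholders, not DriverHeaven's actual figures):

```python
# Hypothetical vertex-shader scores at a matched 450 MHz core clock.
# Placeholder numbers; substitute the review's actual results.
CORE_MHZ = 450

scores = {
    "R520 (simple shader)":  120.0,  # million vertices/s, made up
    "G70 (simple shader)":   110.0,
    "R520 (complex shader)":  45.0,
    "G70 (complex shader)":   52.0,
}

for name, mverts in scores.items():
    # Vertices processed per clock cycle = throughput / clock rate.
    per_clock = (mverts * 1e6) / (CORE_MHZ * 1e6)
    print(f"{name}: {per_clock:.3f} vertices/clock")
```

With clocks matched, the vertices-per-clock figure is just a rescaling of the raw score, which is why a same-clock comparison is the cleanest way to isolate architectural efficiency.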
However, bearing this in mind, if reports of the X1800XL's temps are true, the card is smouldering hot. The GF6 and 7 series are infernos themselves, mind you, but reports have the X1800XL and XT running hotter still...
Originally posted by: LTC8K6
So, what will ATI have if they get 20 or 24 "pixel processors" going considering they are doing pretty well with just 16?
They're not doing pretty well with 16.