AnandTech eVGA GeForce 7900GT KO SC benchmarks

Cenarius

Member
Aug 30, 2001
71
0
0
Reading through Derek Wilson's recent article on the Radeon X1950 XTX & X1900 XT 256MB, I couldn't help but notice some extraordinarily high benchmarks for the eVGA GeForce 7900GT KO SC. The card is stated to be "clocked at 580/590", compared to the stated 450/660 of a reference 7900GT. I think the eVGA is actually clocked somewhere around 580/790, which is a 29%/20% increase from stock.

With that in mind, how is it possible for the card to score 50% - 70% higher than the stock card in the following two benchmarks?

HL2: Ep1
78.2 vs. 51 = +53%

Quake 4
85.5 vs. 49.9 = +71.3%

Something looks terribly wrong to me.
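
For anyone who wants to check the numbers, here's the quick arithmetic, just a rough Python sketch using the clocks and frame rates quoted above:

def pct_gain(new, old):
    # percent increase of `new` over `old`
    return (new / old - 1) * 100

# core/memory clocks (MHz): eVGA KO SC vs. reference 7900GT
print(f"core:   +{pct_gain(580, 450):.0f}%")   # ~ +29%
print(f"memory: +{pct_gain(790, 660):.0f}%")   # ~ +20%

# frame rates from the article: eVGA KO SC vs. reference 7900GT
print(f"HL2: Ep1 +{pct_gain(78.2, 51.0):.0f}%")   # ~ +53%
print(f"Quake 4  +{pct_gain(85.5, 49.9):.0f}%")   # ~ +71%

A 29% core and 20% memory bump producing 53-71% more frames per second just doesn't add up.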
 

anandtechrocks

Senior member
Dec 7, 2004
760
0
76
Yea, I keep looking at that page. I don't understand how a slightly overclocked 7900GT is beating an X1900XT 512. I'm thinking of picking up a cheap 7900GT and comparing it to my X1900XT 256.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
Don't even try to compare benchmarks across reviews. The benchmark run changes, and so do the drivers, not to mention the rest of the system.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
I agree that something seems a little off. After looking over the rest of the article and checking some other sites, I still can't quite figure out what's going on.

First off, the eVGA card is basically clocked at reference GTX speeds. So it should bench close to that. From that same article, it does:

Q4 1600x1200 noAA
GTX..........88.3
eVGA GT...85.5

BF2 1600x1200 noAA
GTX..........103.1
eVGA GT.....94.3

The other game benchmarks show the same pattern: the GTX edges out the GT SC by just a small fraction (which it should, given that the clock speeds are almost identical). The scores logically fit when compared to the GTX scores from the same article. So it passes the first test.

FiringSquad also shows the GT SC ranking right up there with the GTX. The difference is that nowhere does FS indicate gains anywhere near the 50-70% range over a reference GT; more like 20-30%, which fits perfectly with the clock speed increase. So AT's numbers fail test 2.

I haven't done the testing personally, but I would think the GT SC should benchmark right near the reference GTX. And, according to AT's own benches from the GTX/GT launch, at 1600x1200, a GTX should be 20-40% higher than a GT. Which would put the SC GT 20-40% higher than a ref GT. Not 70%. But who knows? Driver improvements, game patches, different CPUs... too much deviation to be sure.
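
For the curious, the exact margins work out like this (a rough Python sketch using only the numbers quoted above):

def pct_gain(new, old):
    # percent increase of `new` over `old`
    return (new / old - 1) * 100

# GTX vs. eVGA GT SC margins from the same AT article (1600x1200, no AA)
print(f"Q4:  GTX over GT SC = +{pct_gain(88.3, 85.5):.1f}%")   # ~ +3.3%
print(f"BF2: GTX over GT SC = +{pct_gain(103.1, 94.3):.1f}%")  # ~ +9.3%

# If the SC GT effectively matches a GTX, and AT's own launch numbers put a
# GTX 20-40% ahead of a reference GT at 1600x1200, then the SC GT should also
# land roughly 20-40% ahead of a reference GT, not 70%.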

Just my thoughts.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
Originally posted by: deadseasquirrel
But who knows? Driver improvements, game patches, different CPUs... too much deviation to be sure.

Just my thoughts.

driver improvements should hit the regular 7900GT the same as they hit the superclock. in % terms at least.

still doesn't look like they've done the HQ driver setting tests. between that and this, i'm not sure i can trust AT's videocard reviews anymore.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Couple of things to note.

1) For the AT article, the benches posted were all run with No AA. I could see how it would be possible for an overclocked GT to best an XT with no AA, but I seriously doubt that it could with 4xAA enabled.

2) It's possible that the graphs at FS have a typo; I think the "XFX 7900 GTX XXX" is supposed to be "XFX 7900 GT XXX".

If you look at the rest of the article, the test setup doesn't show a GTX in the bunch.

ASUS EN7900 GT TOP
BFG GeForce 7900 GT OC
EVGA e-GeForce 7900 GT KO Superclocked
EVGA e-GeForce 7900 GT Signature Series
XFX GeForce 7900 GT XXX Edition
NVIDIA GeForce 7900 GT
 

Cenarius

Member
Aug 30, 2001
71
0
0
Looking through Derek's original benchmarks of the GeForce 7 series, the 7900GTX beats the 7900GT by much smaller margins, with a maximum of 51%. This is totally believable in high res/AA/AF situations, where the GTX's 512MB buffer should come into play (you forgot to mention that in your comparison, deadseasquirrel ;)).

I have now found another review pitting the eVGA against a stock 7900GT, and the results are much more in line with expectations given the clock speed difference: the eVGA's lead is well under 30%. I think Derek has screwed up on this one, and will send him an e-mail.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Yeah, typically it's impossible to get more than the core clock increase of 29% in actual performance, much less something outrageous like 70%.
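
Put another way, the clock bump puts a rough ceiling on the frame rate. A minimal sketch of that bound (the function name is mine, and the linear-scaling assumption is the best case, not a guarantee):

# best case: frame rate scales linearly with the limiting (core) clock
def max_expected_fps(stock_fps, stock_clock, oc_clock):
    return stock_fps * (oc_clock / stock_clock)

# Quake 4: reference 7900GT at 49.9 fps, core 450 MHz -> 580 MHz
print(f"ceiling ~ {max_expected_fps(49.9, 450, 580):.1f} fps")  # ~64 fps, nowhere near the 85.5 reported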
 

1Dark1Sharigan1

Golden Member
Oct 5, 2005
1,466
0
0
From my previous experiences with overclocking the 7900GT, percent increases in the core speeds generally corresponded to the same percent increases in overall performance, with the memory not being a major factor until you do serious volt-modding on the core.
 

Cenarius

Member
Aug 30, 2001
71
0
0
Originally posted by: Ryan Smith
Thanks for the input guys. I'll make sure to bring this up with Derek first-thing on Monday.:)
Yay, the staff do frequent the forums! I didn't think they did, so I already sent an e-mail.
 

Cenarius

Member
Aug 30, 2001
71
0
0
Originally posted by: anandtechrocks
Any other information on this? Any word from Derek?
Nada. Though I see the clock speed has been corrected to 580/790. I'm disappointed he didn't notice the anomaly during testing, because if he had, he surely would have done or said something about it.

Especially considering his recent article on the 7900GS, where he overclocked it by 6.7%/6% and said the following about Oblivion (at 1024×768):

With Oblivion, we hit over a 15%* increase. This indicates that the latest Elder Scrolls game is very tough on all aspects of a graphics card. The fact that the game reflects a higher than logical performance increase is explained by the high variance of our FRAPS test with Oblivion.
*This figure was reduced to 11% within days of posting the article, when he also added:
Note, that the maximum theoretical performance increase shouldn't simply be memory clock percent increase added to core clock percent increase: it's a much more complex relationship that we can't explain without intimate knowledge of how NVIDIA handles latency hiding, data moving, and scheduling.
 

DerekWilson

Platinum Member
Feb 10, 2003
2,920
34
81
Hi everyone.

We've been paying close attention to this thread and retesting our hardware. There is an issue with Q4 and HL2EP1 numbers, and we are working with EVGA to get ahold of another card. We should have one by the end of the week and will hopefully be able to post retested numbers this weekend.

The rest of the numbers still seem to be accurate at this point, but when we get our new hardware we will rerun all these numbers to confirm.

In the meantime, Q4 and HL2EP1 overclocked numbers have been removed from the article.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,044
3,524
126
Originally posted by: rnp614
I wonder what I can get out of the EVGA 512MB GT KO.

uhhh, if that's the case, what about my 710MHz XFX 7900GTs? :p They're in SLI, both at 710MHz.

If a 130MHz increase from stock gave them that much of an increase, imagine my almost 300MHz increase. :eek:

 

anandtechrocks

Senior member
Dec 7, 2004
760
0
76
Originally posted by: DerekWilson
Hi everyone.

We've been paying close attention to this thread and retesting our hardware. There is an issue with Q4 and HL2EP1 numbers, and we are working with EVGA to get ahold of another card. We should have one by the end of the week and will hopefully be able to post retested numbers this weekend.

The rest of the numbers still seem to be accurate at this point, but when we get our new hardware we will rerun all these numbers to confirm.

In the meantime, Q4 and HL2EP1 overclocked numbers have been removed from the article.

Any updated progress on this, Derek? Thanks!

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: sum1
Originally posted by: anandtechrocks
Any other information on this? Any word from Derek?
Nada. Though I see the clock speed has been corrected to 580/790. I'm disappointed he didn't notice the anomaly during testing, because if he had, he surely would have done or said something about it.

Especially considering his recent article on the 7900GS, where he overclocked it by 6.7%/6% and said the following about Oblivion (at 1024×768):

With Oblivion, we hit over a 15%* increase. This indicates that the latest Elder Scrolls game is very tough on all aspects of a graphics card. The fact that the game reflects a higher than logical performance increase is explained by the high variance of our FRAPS test with Oblivion.
*This figure was reduced to 11% within days of posting the article, when he also added:
Note, that the maximum theoretical performance increase shouldn't simply be memory clock percent increase added to core clock percent increase: it's a much more complex relationship that we can't explain without intimate knowledge of how NVIDIA handles latency hiding, data moving, and scheduling.

I'm targeting the last paragraph here. It may help to compare with NV40, since the difference is IMO likely to be found in the ROP-to-fragment-pipeline ratio (G7x doesn't have a 1:1 ROP/fragment pipeline ratio; NV40 does).
 

anandtechrocks

Senior member
Dec 7, 2004
760
0
76
Originally posted by: DerekWilson
Hi everyone.

We've been paying close attention to this thread and retesting our hardware. There is an issue with Q4 and HL2EP1 numbers, and we are working with EVGA to get ahold of another card. We should have one by the end of the week and will hopefully be able to post retested numbers this weekend.

The rest of the numbers still seem to be accurate at this point, but when we get our new hardware we will rerun all these numbers to confirm.

In the meantime, Q4 and HL2EP1 overclocked numbers have been removed from the article.

I hate to keep bumping this, but I'm really interested in the updated results. Any news?