Fishy Review...?

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
Okay so this isn't a brand-new chipset with brand-new features and beyond 10k in 3DMark06, but for my needs it works a charm.

I was googling for reviews on this (imho exceptional) little card since I found one for an amazing price. R677 with delivery. Okay so you folks overseas that might sound crazy (around $80-85). For SA it was amazing, anyway. A GeForce 8500GT is in the same price range here and this one had potentially the best performance for the price by quite a long shot that I could find anywhere. It destroys the 8500GT, if that is any clarification...

Anyway, via the mighty Google I happened to find only one review for the card, right here. The reviewer couldn't "overclock one MHz" (I quote that directly). Well, here are my overclocking results so far (I haven't maxed the card out yet):

Core: From 630MHz it is now as I write this 660MHz.
Memory: From 400MHz (800MHz DDR) it is now 450MHz (900MHz DDR).

My 3DMark 2006 v1.1.0 score went up from 4088 to well beyond 4300. Sadly I don't have a picture to prove it (I copied and pasted something else, after pressing Print Screen and before saving the screenshot *blushes*) but I could run it again if anyone doesn't believe me.

NFSMW @ 1024x768 all in-game settings maxed out except AA gave me 30fps minimum, 38fps maximum and 34.950fps average.

I am just wondering, did the reviewer perhaps consider using RivaTuner (CCC couldn't overclock it at all) or did he simply use CCC? And ATiTool locks up once I try selecting the correct device *rolls eyes*, as it did with my IGP.

I raised NFSMW's in-game AA setting one notch (there are two notches; if you played with the settings you should know) and still get a minimum fps above 20. Which is just about the maximum I got with my IGP at the absolute minimum settings available (bar using the performance tuner application). Maxing that AA setting gives me image quality I haven't personally seen in NFSMW before now, but it also drops framerates low enough to be just a tad too sluggish to my liking.

I will have some Crysis benchmarks once I get my game back (lent it to someone) but looking at the performance increases vs my IGP I think I will have some great results. For the price I paid, anyway. I will be able to finish the game at last!

A final note; on a card as budget as this, why benchmark at any resolution higher than 1024x768? Let alone 1600x1200! It just doesn't place the card in the right perspective.
 

WT

Diamond Member
Sep 21, 2000
4,816
60
91
I'm actually using (errr, well, I have one, but I never USE it) the AGP version of this card and found it problematic in some boards that I plopped it into. I swapped it out with an eVGA 6800 card and all of the lockups in CoD2 went away. Trying to find AGP drivers for it on the AMD/ATI site, they don't EVEN SUPPORT THE CARD with drivers! Yeah, there's a driver for the PCIe version, but not AGP. And don't jump on me for not upgrading to PCIe please, this is PC #7 in my hardware lineup .. I gotta keep one AGP rig alive, just for kicks.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: unr3al


I will have some Crysis benchmarks once I get my game back.

That will be something to see and laugh at. :laugh:
 

james1701

Golden Member
Sep 14, 2007
1,791
34
91
Originally posted by: error8
Originally posted by: unr3al


I will have some Crysis benchmarks once I get my game back.

That will be something to see and laugh at. :laugh:


He should be able to run it on medium settings pretty well, depending on his resolution, and in DX9 mode.
 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
Originally posted by: error8
Originally posted by: unr3al


I will have some Crysis benchmarks once I get my game back.

That will be something to see and laugh at. :laugh:

I have already stated that I game at 1024x768 and considering that a 7600GT (not really THAT much faster, especially at my current clocks) runs the game just fine at 1280x1024 medium quality, I think I should have a jolly time at the same IQ settings but at my resolution. I will try running it DX9 maxed though, just to see how it looks.

This isn't a debate about whether my card is great or not (to me it is great, anyway), but I am questioning the thoroughness of the quoted reviewer. I mean, NOTHING would overclock. Why? I've achieved some remarkable results already. Remarkable for a passively cooled card without a single heatpipe, anyway.

I can't get my images to upload; neither photobucket nor tinypic wants to work with me (I guess my connection is at fault) but as soon as I succeed I'll post links.

By the way WT, have you looked on the manufacturer's site for drivers? I wasn't aware that you need different sets of drivers for the AGP and PCI-e versions.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
I understand what you're trying to do, but I laugh because I know that Crysis will work like crap. I've played it on my 7600 GT and I couldn't get anything playable out of it at 1024x768, and the 7600 GT is way faster than this 2600pro.

I remember that the 2600 XT used to be at about the same level as the 7600 GT in older games and a bit faster in recent ones, but that videocard had GDDR4 as opposed to your DDR2. So I don't want to imagine Crysis on this card. ;)
 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
I've personally seen Crysis on a 7600GT; it runs fine on medium settings @ 1280x1024. Yes, it runs like crap if you max it out (the performance dip between medium and high is insane compared to the IQ increase), but why on earth would I want to do that? It's a budget card. I plan on treating it like one. On my IGP I got 10fps when I played with everything set to minimum @ 800x600. This card is 8-10 times as fast. It will run just fine at the settings I plan to use, trust me. I'm getting my game back tomorrow so I'll post some results.
 

WT

Diamond Member
Sep 21, 2000
4,816
60
91
Above link gives me a "support portal session expired" error. I've put the 2600 back on the shelf and I'll putz around with it over late Fall ... too busy with other stuff now.
 

tcsenter

Lifer
Sep 7, 2001
18,922
560
126
Originally posted by: WT
Above link gives me a "support portal session expired" error. I've put the 2600 back on the shelf and I'll putz around with it over late Fall ... too busy with other stuff now.
I hate ATI's session-based bullsh-t support site. Try this:

http://support.ati.com/ics/sup...ledge&questionID=31625

http://support.ati.com/ics/sup...ledge&questionID=31542

Alternatively, download the Catalyst Hotfix drivers for AGP cards from Sapphire Tech. They are exactly the same drivers I'm trying to link you to directly from ATI:

Catalyst 8.8 AGP VISTA 32-bit (44MB)

Catalyst 8.8 AGP VISTA 64-bit (60MB)

Catalyst 8.8 AGP XP 32-bit (38MB)

Catalyst 8.8 AGP XP 64-bit (34MB)
 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
I haven't played any game before that looks as good as this. It's simply amazing lol

64-bit Crysis; Graphics Benchmark - 1024x768 medium quality - 32fps average. Would be higher if I had more RAM. One gig is simply not enough.

CPU benchmark 1 @ 800x600 medium quality got me 37fps average.

CPU benchmark 2 @ 800x600 medium quality; 15fps average lol, but it's a crazy benchmark.

Remember that before now I got 10fps average @ 800x600 minimum quality in the graphics bench . . .

I'm happy :-D
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: unr3al
I've personally seen Crysis on a 7600GT; it runs fine on medium settings @ 1280x1024. Yes, it runs like crap if you max it out (the performance dip between medium and high is insane compared to the IQ increase), but why on earth would I want to do that? It's a budget card. I plan on treating it like one. On my IGP I got 10fps when I played with everything set to minimum @ 800x600. This card is 8-10 times as fast. It will run just fine at the settings I plan to use, trust me. I'm getting my game back tomorrow so I'll post some results.
no way in hell is Crysis "playable" on medium settings at 1280x1024 with a 7600gt. it only got 16fps at 800x600 on medium settings, and that's in a much better system than yours. they didn't even bother to bench at 1024x768, much less 1280x1024, because it wasn't even playable at 800x600. the 2600xt is a much faster card in modern games. http://www.gamespot.com/features/6182806/p-5.html
 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
Originally posted by: toyota no way in hell is Crysis "playable" on medium settings at 1280x1024 with a 7600gt. it only got 16fps at 800x600 on medium settings, and that's in a much better system than yours. they didn't even bother to bench at 1024x768, much less 1280x1024, because it wasn't even playable at 800x600. the 2600xt is a much faster card in modern games. http://www.gamespot.com/features/6182806/p-5.html

A much better system than mine and 16fps, 800x600 medium IQ? LoL. No offense to gamespot or their supporters, but did that system perhaps secretly use a Celeron? If not, there was something seriously wrong with that X6800... I re-ran the Crysis benchmark tool today on a heavily fragmented HDD with 5GB of free space left, and I got 30fps average @ 1024x768 medium IQ! In the GPU benchmark. That they used. The 7600GT is a bit faster than my 2600PRO. This I know for a fact.

Here is a screenshot of the Crysis bench tool and my results, both @ 800x600 and 1024x768 medium IQ.

Here is a screenshot of Assassin's Creed's system analyzer, just to prove that I actually have what I say I have.

No way a 7600GT is going to struggle with Crysis @ 800x600, sorry to disappoint you. It should actually be just below playable if you ran it @ 800x600 High IQ. I happen to know because I've SEEN it running on a system with but a 7600GT and an AMD 5000. And if the game was completed several times by the owner of the system @ medium IQ, 1152x864, that already blows the entire argument out of the water. Good grief he did it twice on Delta! Don't tell me he didn't have playable framerates...

Note though, you CANNOT look at the very first loop of the GPU benchmark, because the HDD is your limiting factor there. My card maxed out only after the third run. The first run specifically had some crazy disk thrashing. The numbers you get after the third run are comparable to in-game framerates (at least on my system).
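His point about warm-up loops generalizes to any looping benchmark: drop the first few disk-bound runs before averaging. A minimal sketch of that idea (the fps numbers below are made up for illustration, not his actual results):

```python
# Sketch: average fps across benchmark loops while discarding warm-up runs,
# since the first loops are disk-bound (texture streaming), not GPU-bound.

def steady_state_avg(loop_fps, warmup=3):
    """Average fps over benchmark loops, ignoring the first `warmup` runs."""
    steady = loop_fps[warmup:]
    if not steady:
        raise ValueError("need more loops than warm-up runs")
    return sum(steady) / len(steady)

# Five loops of a hypothetical GPU benchmark: early runs are dragged down
# by disk thrashing, later runs settle to the card's real framerate.
loops = [18.2, 24.5, 28.9, 30.1, 29.8]
print(f"steady-state average: {steady_state_avg(loops):.1f} fps")
```

Averaging all five loops here would understate the card noticeably, which is exactly the trap he describes with the first run.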

And another note, you have an 8600GT. Which is only very slightly faster than a 7600GT. Fact. Whether you like it or not. Don't tell me you play Crysis @ 800x600 and get 17fps...

Really I don't want to make you look like a fool and I'm not trying to do anything of the kind (I really hope you aren't trying to do that to me), but I have been working with PCs for roughly 9 years now, building my own systems, servicing other systems, overclocking, troubleshooting, gaming, benchmarking etc (completed both an A+ and a Network+ course in the process) and I think my knowledge is sufficient to know how to benchmark a game.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: unr3al
Originally posted by: toyota no way in hell is Crysis "playable" on medium settings at 1280x1024 with a 7600gt. it only got 16fps at 800x600 on medium settings and thats in a much better system than yours. they didnt even bother to bench at 1024x768 much less 1280x1024 because it wasnt even playable at 800x600. the 2600xt is a much faster card in modern games. http://www.gamespot.com/features/6182806/p-5.html

A much better system than mine and 16fps, 800x600 medium IQ? LoL. No offense to gamespot or their supporters, but did that system perhaps secretly use a Celeron? If not, there was something seriously wrong with that X6800... I re-ran the Crysis benchmark tool today on a heavily fragmented HDD with 5GB of free space left, and I got 30fps average @ 1024x768 medium IQ! In the GPU benchmark. That they used. The 7600GT is a bit faster than my 2600PRO. This I know for a fact.

Here is a screenshot of the Crysis bench tool and my results, both @ 800x600 and 1024x768 medium IQ.

Here is a screenshot of Assassin's Creed's system analyzer, just to prove that I actually have what I say I have.

No way a 7600GT is going to struggle with Crysis @ 800x600, sorry to disappoint you. It should actually be just below playable if you ran it @ 800x600 High IQ. I happen to know because I've SEEN it running on a system with but a 7600GT and an AMD 5000. And if the game was completed several times by the owner of the system @ medium IQ, 1152x864, that already blows the entire argument out of the water. Good grief he did it twice on Delta! Don't tell me he didn't have playable framerates...

Note though, you CANNOT look at the very first loop of the GPU benchmark, because the HDD is your limiting factor there. My card maxed out only after the third run. The first run specifically had some crazy disk thrashing. The numbers you get after the third run are comparable to in-game framerates (at least on my system).

And another note, you have an 8600GT. Which is only very slightly faster than a 7600GT. Fact. Whether you like it or not. Don't tell me you play Crysis @ 800x600 and get 17fps...

Really I don't want to make you look like a fool and I'm not trying to do anything of the kind (I really hope you aren't trying to do that to me), but I have been working with PCs for roughly 9 years now, building my own systems, servicing other systems, overclocking, troubleshooting, gaming, benchmarking etc (completed both an A+ and a Network+ course in the process) and I think my knowledge is sufficient to know how to benchmark a game.
nope, I get 30fps just like they got in the benchmark. perhaps you should actually look at benchmarks, because the 8600gt is twice as fast as the 7600gt in modern games.


http://www.gamespot.com/features/6182806/p-5.html
Crysis 1024x768 Medium Quality
8600GT 30fps
7600GT not playable at those settings according to them

http://www.gamespot.com/features/6177688/p-6.html
Bioshock 1024x768 High Quality
8600GT 40fps
7600GT 25fps

http://www.gamespot.com/features/6183967/p-4.html
COD4 1280x1024 Max Quality
8600GT 36fps
7600GT 19fps

http://www.gamespot.com/features/6183499/p-4.html
UT3 1600x1200 Max Quality
8600GT 39fps
7600GT 19fps
 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
Bump

Their benchmarks are completely screwed. Seriously. I can't get the owner of that 7600GT to run a benchmark; he seems to be a bit busy. However, I won't change my opinion. I know how Crysis runs on his system and he DEFINITELY doesn't get 16fps @ 800x600 medium IQ. And unplayable at 1024x768 is just sad. According to them the HD2400XT beats the 7600GT. Now that's a joke. Real funny... And the X1650XT (you do know the 7600GT is the faster card, right?) gets twice the framerate that the 7600GT gets at 800x600 medium quality?

Look at the X1950pro's results. 1024x768 medium IQ and 33fps?? Come on! I get hardly less than that. And nobody is going to tell me that my card is practically just as fast as an X1950pro. How about the 2600XT? 29fps. Cool, less than I get. I must have some magical system... Anyone with half a brain could tell you that there was something seriously wrong with those benchmarks!

EDIT

The X1300XT beating the 7600GT as well??? By 6fps? Or, according to their performance sheet, by nearly 38%. No way in hell.

Just to round off this topic, here are my maxed VPU OC results, done with ATi Tray Tools: Core/Mem - from 630MHz/800MHz to 736MHz/918MHz. Any higher and I start getting artefacts left, right and center. My temps are great as well, staying in the mid-60s (°C) at load. On a passive cooler.
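For reference, those maxed clocks work out to roughly a 17% core and 15% memory overclock; a trivial sketch of the arithmetic, using the stock and maxed figures quoted in the post:

```python
# Percent overclock gain from stock to maxed clocks.
def pct_gain(stock_mhz, oc_mhz):
    return 100.0 * (oc_mhz - stock_mhz) / stock_mhz

print(f"core: +{pct_gain(630, 736):.1f}%")  # 630 MHz -> 736 MHz
print(f"mem:  +{pct_gain(800, 918):.1f}%")  # 800 MHz -> 918 MHz
```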
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: unr3al
Bump

Their benchmarks are completely screwed. Seriously. I can't get the owner of that 7600GT to run a benchmark; he seems to be a bit busy. However, I won't change my opinion. I know how Crysis runs on his system and he DEFINITELY doesn't get 16fps @ 800x600 medium IQ. And unplayable at 1024x768 is just sad. According to them the HD2400XT beats the 7600GT. Now that's a joke. Real funny... And the X1650XT (you do know the 7600GT is the faster card, right?) gets twice the framerate that the 7600GT gets at 800x600 medium quality?

Look at the X1950pro's results. 1024x768 medium IQ and 33fps?? Come on! I get hardly less than that. And nobody is going to tell me that my card is practically just as fast as an X1950pro. How about the 2600XT? 29fps. Cool, less than I get. I must have some magical system... Anyone with half a brain could tell you that there was something seriously wrong with those benchmarks!

EDIT

The X1300XT beating the 7600GT as well??? By 6fps? Or, according to their performance sheet, by nearly 38%. No way in hell.

Just to round off this topic, here are my maxed VPU OC results, done with ATi Tray Tools: Core/Mem - from 630MHz/800MHz to 736MHz/918MHz. Any higher and I start getting artefacts left, right and center. My temps are great as well, staying in the mid-60s (°C) at load. On a passive cooler.
what's so hard to understand? newer shader-intensive games make cards like the 7600gt look like a joke even next to pokey cards like the 8600gt. I had a 7600gt before the 8600gt and I know their benchmarks are right in line with the numbers I got. everybody calls a benchmark bs or fake when it doesn't agree with what they "think" the results should be.

 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
Their benchmarks ARE BS because a 7600GT doesn't get the framerates they got. How hard is THAT to understand? Should I type in bold caps? Sheesh! If I had access to that 7600GT I would have benched it myself just to prove it to you. The owner of the system runs Crysis on a 22" LCD at 1680x1050 (albeit at low IQ, but with gamespot's reported framerates even that should be anything but possible) and he gets completely playable framerates. If your 7600 got the same framerates then you should look into the case...

Did you even READ the reply I posted? The X1300XT (which gets beaten by the 6600GT in most benchmarks) is shown to be substantially faster than the 7600GT. And the X1300XT is by NO means more advanced or from a newer generation than the 7600GT. I have seen more graphics card reviews and benchmark comparisons than I can count, and I can assure you that gamespot's numbers are off. It's not my card I'm defending here, why would I post a load of bull? But it's easy for you to try and justify your purchase of an 8600GT...

Here are some benches just to show how off they really are. Look at those charts, so much for 8600GT=2x7600GT...
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: unr3al
Their benchmarks ARE BS because a 7600GT doesn't get the framerates they got. How hard is THAT to understand? Should I type in bold caps? Sheesh! If I had access to that 7600GT I would have benched it myself just to prove it to you. The owner of the system runs Crysis on a 22" LCD at 1680x1050 (albeit at low IQ, but with gamespot's reported framerates even that should be anything but possible) and he gets completely playable framerates. If your 7600 got the same framerates then you should look into the case...

Did you even READ the reply I posted? The X1300XT (which gets beaten by the 6600GT in most benchmarks) is shown to be substantially faster than the 7600GT. And the X1300XT is by NO means more advanced or from a newer generation than the 7600GT. I have seen more graphics card reviews and benchmark comparisons than I can count, and I can assure you that gamespot's numbers are off. It's not my card I'm defending here, why would I post a load of bull? But it's easy for you to try and justify your purchase of an 8600GT...

Here are some benches just to show how off they really are. Look at those charts, so much for 8600GT=2x7600GT...
I guess we can do this all day. Look at the settings they used for most games. The 7600gt got ZERO while the 8600gt got low single digits. Those kinds of numbers throw off the totals. If you actually look at the charts you will see the 8600gt beating the crap out of the 7600gt in Call of Duty, Mass Effect and Crysis. So yes, at playable settings for each card the 8600gt is "almost" twice as fast in most modern games.

I have owned BOTH cards so I know from experience too. If you "think" that the 7600gt was playable in Crysis at 1024x768 on medium then knock yourself out. There is NOT one benchmark on the web that will back that up though.

 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
Where the 7600GT got "zero", the settings for Crysis or whatever was Very High, aka DX10. So it wasn't as much zero as N/A. I rest my case...
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: unr3al
Where the 7600GT got "zero", the settings for Crysis or whatever was Very High, aka DX10. So it wasn't as much zero as N/A. I rest my case...
you missed the point. I was saying that in a lot of those benchmarks they were testing at settings so high that only single digits separated the cards. those single-digit scores are useless and make the overall fps look closer than they really are. at playable settings the 8600gt usually had a big increase over the 7600gt in most games.

 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
Let's get down to the nitty gritty of it all. The 7600GT and the X1300XT and the 8600GT etc. are absolutely craptacular if you want to play Crysis. GLQuake? Ok. I had 2 7950GTs in SLI that played Crysis like crap so I got me 2 8800GTs and Crysis plays acceptably at 1280x1024 on 'High' without AA or AF.:thumbsup:
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: FalseChristian
Let's get down to the nitty gritty of it all. The 7600GT and the X1300XT and the 8600GT etc. are absolutely craptacular if you want to play Crysis. GLQuake? Ok. I had 2 7950GTs in SLI that played Crysis like crap so I got me 2 8800GTs and Crysis plays acceptably at 1280x1024 on 'High' without AA or AF.:thumbsup:
no, the 8600gt is just fine for medium and a couple of high settings at 1024x768, or all medium at 1280x1024, where the 7600gt isn't. yes, an 8600gt sucks compared to the 8800gt but it tears the 7600gt a new one in most modern games. sadly that doesn't mean much though, as upcoming games will reduce the 8600gt to all low settings. lol.


also you would probably get the exact same framerates with one 8800gt at 1280 as you would with 2 cards. in this review, at the slightly lower res of 1280x800, a single 8800gt is faster than 8800gt sli. http://techreport.com/articles.x/14168/8