
How does the 8800 scale in games on AMD platforms?

CaiNaM

Diamond Member
GeForce 8800 GTX/8800 GTS Performance with Athlon 64

cpu speed obviously does matter:

"In fact if you've got an X2 3800+ running at stock speeds, there's no point in upgrading to a GeForce 8800 GTX, as there were often cases where the 3800+/8800 GTX combination were outrun by the 8800 GTS and a faster CPU like the 4200+ or 4600+."

They also try to simulate the performance difference between 512KB and 1024KB of L2 cache.
 
I don't know why they are testing with the FX-62 when the X2 5600+ is available and identical. Where are the X2 6000+ results?

I agree that for socket 939 though, there is probably no point in more than a GTS unless you have a honking great LCD you need to run at native res.
 
Originally posted by: Gstanfor
I don't know why they are testing with the FX-62 when the X2 5600+ is available and identical. Where are the X2 6000+ results?

I agree that for socket 939 though, there is probably no point in more than a GTS unless you have a honking great LCD you need to run at native res.

Because the article is from November 22nd, 2006, the 6000+ didn't exist at that time. And the FX-62 is identical to a 5600+ in performance anyway, so that's just a minor nitpick.
 
cpu speed obviously does matter:
I've been saying this for years.

While the GPU is certainly more important, too many people look at one benchmark and don't realize the CPU plays an important role when you have a whole bunch of guys or vehicles on the screen.

When I moved to E6600 from my A64 4000 I saw almost a 50% performance gain in Vampire Bloodlines.
 
Jesus..
The 8800GTX doesn't even get GPU-bound in most of the games until 1920x1200.
That thing is out of control. 😉
 
If they were to do that review now, I'd expect to see Supreme Commander & ArmA: Armed Assault on test, as they both stress the CPU and would show CPU scaling better.
 
This (albeit old) article does a good job of showing the importance of many things when it comes to gaming performance: 1) the video card, 2) the CPU, 3) the resolution, 4) the [type of] game.

Because performance can vary with any of those 4 points, I don't see how they can make statements like "In fact if you've got an X2 3800+ running at stock speeds, there's no point in upgrading to a GeForce 8800 GTX". Their own benchmarks, obviously, point to many reasons why one would want to, such as playing at 19x12 resolution, or even playing at 16x12 on everything except for LOMAC, HL2, and DMoMM. I am assuming the paragraph that quote came from in the conclusion is a direct correlation to the previous paragraph, where they specifically talk about those CPU-stressful games.

Other than those 3 games, every other game they tested showed a significant gain going from a stock X2 3800+ with a GTS to a stock X2 3800+ with a GTX at 1600x1200, more than 30% at times, while going from a 3800+ to an FX-62 showed 0%.

Since a lot of gamers use 19" LCDs at 1280x1024, this data is extremely relevant, as they need to carefully match their GPU with the right CPU in order to obtain the best results. Those who game at higher resolutions can see those charts where at 16x12 and 19x12, a stock X2 3800+ provides the same fps as an FX-62 (with an exception here and there). I would definitely spring for a GTX over an FX62 if I play at those resolutions.
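The pattern in those charts (the CPU matters a lot at 1280x1024 and barely at all by 19x12) can be sketched with a toy bottleneck model. Every number below is a made-up illustration, not data from the article: frame time is roughly the larger of a fixed CPU cost per frame and a GPU cost that grows with pixel count.

```python
# Toy bottleneck model of GTS/GTX CPU scaling (all numbers are hypothetical,
# not figures from the article): the CPU costs a fixed time per frame, the
# GPU's cost grows with the number of pixels rendered, and the frame rate is
# limited by whichever of the two is slower.

def fps(cpu_ms_per_frame, gpu_ms_per_megapixel, width, height):
    """Frames per second under a simple max(CPU time, GPU time) model."""
    megapixels = width * height / 1e6
    gpu_ms = gpu_ms_per_megapixel * megapixels
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

# Hypothetical parts: a slow CPU (12 ms/frame), a fast CPU (8 ms/frame),
# and a GPU needing 5 ms per megapixel drawn.
for res in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    print(res, round(fps(12, 5, *res), 1), round(fps(8, 5, *res), 1))
# The fast CPU's lead shrinks from 50% at 1280x1024 (125.0 vs 83.3 fps)
# to about 4% at 1920x1200 (86.8 vs 83.3 fps), where the GPU takes over
# as the bottleneck.
```

With these made-up parameters the slow CPU pins the frame rate at 83.3 fps at every resolution (CPU-bound), while the fast CPU only shows its advantage until the GPU cost overtakes it, which is the same shape as the 3800+ vs FX-62 charts.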
 
It would have been much more helpful if they had tested Oblivion in a town, where most benchmarks have shown it to be the most CPU-limited.
 
Is that article saying that it's best to always have the fastest FPS when gaming? As deadseasquirrel implied, it's cool to have the free eye candy when playing on 19" LCDs.
 
I'm looking at those benchmarks, and I'm seeing the games I care about (FEAR, Oblivion) being entirely GPU-bound. The results between the different CPUs are within the margin of error.

Quake4 is multithreaded, same deal.

Even HL2 engine games seem to be all about the GPU, or at least more so than CPU power.

I'm sorry, but this really doesn't change my perspective at all. Sure, at really low resolutions the CPU comes into play... but this is the age of LCDs; anything below 1280x1024 basically doesn't exist anymore.
 
From what I can see, IMO every CPU is pretty much good enough to provide decent frame rates in all those games, so does it really matter if they're CPU-bound?
 
Would have been a little more useful if they included C2ds also. All that article really told me is whether I will notice it if I OC'ed my K8 or left it at stock. :roll:
 
Would be more useful had they tested higher resolutions... 1680x1050 is extremely popular, and 1920x1200 is going to become even more popular in the next several months when those 24" LCDs drop in price...
 
Good review, but in the newer titles, they basically prove that the CPU doesn't matter much.

Of course @ 1280x1024, things are CPU-bound w/8800s... not surprising.

But aside from older games, GPU is still far more important.

So IMO, the review was kinda ironic, since it pretty much proved what we already know.

 
Originally posted by: aka1nas
Would have been a little more useful if they included C2ds also. All that article really told me is whether I will notice it if I OC'ed my K8 or left it at stock. :roll:

Not just included C2Ds, but also single-core CPUs.
 