Video Card Decision (**Update**)


AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
I am NOT disagreeing with you that the numbers look off compared to the clock speed. All I am saying is that if TechReport got those numbers with a better CPU, then my numbers aren't far-fetched. I gave you the numbers at 3 different CPU speeds from the Crysis benchmark, which I have no control over.

But you've been arguing that your benchmarks aren't flawed.

Your logic is flawed too. You want to compare a different testing method, settings, resolution, etc. against your minimum frame rate just because the CPUs have similar clock speeds on the same architecture. And FYI, the game at TechReport is Crysis Warhead. Weren't you testing Crysis 1.2? You want to compare them because they share the same engine?

What you should be doing is comparing the CPUs within the TechReport article.

Here, I'll give you an example from their results...

E8600 @ 3.33GHz
min: 30fps
avg: 55.1fps

Q6600 @ 2.4GHz (28% lower clock)
min: 23fps (24% lower)
avg: 38.8fps (30% lower)

Now, the Wolfdale core has a different cache and more tweaks, but the results look legit because the drops track the clock-rate reduction closely. There's also a margin of error because they used FRAPS.

Second of all, this is a case where the GPU is less stressed at 1024x768 medium, so the CPU makes quite a bit more difference here than at, say, 1680x1050 high quality.
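To make that comparison concrete, here is a minimal sketch of the arithmetic behind those percentages (the helper function is purely illustrative; the clock and FPS figures are the ones quoted above):

```python
# Minimal sketch: how much of the clock-speed cut shows up in the quoted
# numbers (E8600 @ 3.33GHz vs Q6600 @ 2.4GHz).

def pct_drop(fast: float, slow: float) -> float:
    """Percentage reduction going from the faster value to the slower one."""
    return (fast - slow) / fast * 100

clock_drop = pct_drop(3.33, 2.4)       # ~28% lower clock
min_fps_drop = pct_drop(30, 23)        # ~23% lower minimum fps
avg_fps_drop = pct_drop(55.1, 38.8)    # ~30% lower average fps

print(f"clock -{clock_drop:.0f}%, min fps -{min_fps_drop:.0f}%, avg fps -{avg_fps_drop:.0f}%")
# The fps drops land close to the clock drop, which is the point being made:
# at CPU-bound settings the frame rate scales roughly with clock speed.
```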
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
I am NOT disagreeing with you that the numbers look off compared to the clock speed. All I am saying is that if TechReport got those numbers with a better CPU, then my numbers aren't far-fetched. I gave you the numbers at 3 different CPU speeds from the Crysis benchmark, which I have no control over.

But you've been arguing that your benchmarks aren't flawed.

Your logic is flawed too. You want to compare a different testing method, settings, resolution, etc. against your minimum frame rate just because the CPUs have similar clock speeds on the same architecture. And FYI, the game at TechReport is Crysis Warhead. Weren't you testing Crysis 1.2? You want to compare them because they share the same engine?

What you should be doing is comparing the CPUs within the TechReport article.

Here, I'll give you an example from their results...

E8600 @ 3.33GHz
min: 30fps
avg: 55.1fps

Q6600 @ 2.4GHz (28% lower clock)
min: 23fps (24% lower)
avg: 38.8fps (30% lower)

Now, the Wolfdale core has a different cache and more tweaks, but the results look legit because the drops track the clock-rate reduction closely. There's also a margin of error because they used FRAPS.

Second of all, this is a case where the GPU is less stressed at 1024x768, so the CPU makes quite a bit more difference here than at, say, 1680x1050 high quality.

Well, see the edited post. Now I don't want to trust the built-in benchmark, because that was a lot of trouble for nothing. In the end, though, using the FRAPS benchmark more than proved what I originally said: he would need to OC that CPU with a GTX 260 or 4870 to get the most out of it. Again, whatever minimum you are getting at 1024x768 and medium settings will only stay the same or go down at higher resolutions and settings.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
You might want to read what I said from the very start.

All GPUs need something fast feeding them, but that doesn't mean a Core 2 Duo @ 2.33GHz can't feed a 4870 or 4850 well enough. In reality, your E8500 is only faster when there is less constraint on the GPU, and it isn't much faster than an E6550 as you raise the resolution to the point where the game becomes GPU limited. People do not play at 1280x1024 with no AA. If anything, people play at the highest resolution their monitor supports with all the AA the card can handle. In reality, your 33% higher clock speed is probably only 0-10% faster than an E6550 at the settings you play.

As for your GTX 285 SLI, that kind of setup puts more constraint on the CPU, considering the GPU limitation has been lifted in most of the games out there. So the CPU matters in those situations.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
You might want to read what I said from the very start.

All GPUs need something fast feeding them, but that doesn't mean a Core 2 Duo @ 2.33GHz can't feed a 4870 or 4850 well enough. In reality, your E8500 is only faster when there is less constraint on the GPU, and it isn't much faster than an E6550 as you raise the resolution to the point where the game becomes GPU limited. People do not play at 1280x1024 with no AA. If anything, people play at the highest resolution their monitor supports with all the AA the card can handle. In reality, your 33% higher clock speed is probably only 0-10% faster than an E6550 at the settings you play.

As for your GTX 285 SLI, that kind of setup puts more constraint on the CPU, considering the GPU limitation has been lifted in most of the games out there. So the CPU matters in those situations.

And all I said was that if he went with a high-end card, he wouldn't see the full benefit of it without getting that CPU up to snuff, which was true. As usual, you dragged a thread out over what was nothing more than reasonable advice I gave the OP. Every game and situation is different, but the CPU still remains quite important. That is even more so when upgrading to a high-end GPU and wanting to reap ALL of its performance.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I've been saying it from the beginning of this thread. At the settings a gamer actually plays, the difference from a faster CPU is minimal, as long as you aren't bottlenecked to the point where the CPU isn't strong enough to pull 60fps. This has been proven time and again by dozens of articles available on the web.

The CPU is part of the equation, but it's been getting less and less important in the gaming market.

I've had my CPU for nearly 3 years. I can't say the same about my GPU; it changes every year.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
O.P., if you're still reading: I bought a GTX 260 (factory OC'd at 666MHz) to replace my 8800 GTX. My CPU is an Athlon X2 6400+ and I have a 1680x1050 monitor. My 3DMark06 score went from 11250 to 12350 - nothing earth-shattering. In real-world gaming - WoW, LotRO, Oblivion, Fallout 3, etc. - the FPS difference was negligible. I'm not saying to buy something else - the GTX 260 is a fine card at a good price - but you most likely won't make full use of its power. Compared to what you're using it will be a huge difference, though. For me it was a waste of money.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
I've been saying it from the beginning of this thread. At the settings a gamer actually plays, the difference from a faster CPU is minimal, as long as you aren't bottlenecked to the point where the CPU isn't strong enough to pull 60fps. This has been proven time and again by dozens of articles available on the web.

The CPU is part of the equation, but it's been getting less and less important in the gaming market.

I've had my CPU for nearly 3 years. I can't say the same about my GPU; it changes every year.

When it comes to Crysis, yes, it did make a difference, and I'm sure at more intense and demanding parts of the game it would be even more noticeable. Far Cry 2 had an even bigger impact as well.

Actually, the CPU is getting more important in some games. You need to check out the CPU charts for many games, because in some cases they show a wider spread than the GPUs do. Heck, even though it's playable on average hardware, FEAR 2 shows massive improvements from a faster CPU. This year's high-end GPUs are going to really need some CPU power to keep them fed, as even current GTX 285 SLI setups can be bottlenecked without a fast overclocked quad-core CPU.

Your CPU was very good (especially when overclocked) when it came out, much like the 8800 GTX is still a decent video card today. Those with slower X2 CPUs are sometimes already getting half the performance of a high-end Core 2. There are tons of people with really slow CPUs who keep upgrading their GPU without even realizing that their CPU is part of the problem.

It's all about balance, and the CPU is still a very important part of the equation. Minimum framerates can kill a game, and if the CPU is the culprit then turning down settings isn't going to help.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: josh6079

The only games that made me concerned about the CPU and its number of cores are Quake 4 and ETQW, since they are both multithreaded applications.
You're wasting your time worrying about quad core; the vast majority of games still show little to no performance gain from them. If you're upgrading, it's much better to get a faster-clocked dual core variant.

Quake 4 definitely does not scale past two cores, and I don't think Quake Wars does either.

BFG - your build is similar, albeit with a better CPU, but I wonder if we could exchange time demos or have you run the HoC benchmark utility.
I don't distribute my demos, but I'll post numbers from the HoC benchmark if I have time.

I'm assuming the demo you did your tests in must be about as demanding as the game can get, and therefore the most accurate. Your numbers just seem lower than what I've been reading those cards get in ETQW.
Yes, it has a lot of particle effects. I like designing tough demos because I want to know about the more demanding scenarios, rather than getting lulled into a false sense of security from easy demos.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
Originally posted by: Azn
I've been saying it from the beginning of this thread. At the settings a gamer actually plays, the difference from a faster CPU is minimal, as long as you aren't bottlenecked to the point where the CPU isn't strong enough to pull 60fps. This has been proven time and again by dozens of articles available on the web.

The CPU is part of the equation, but it's been getting less and less important in the gaming market.

I've had my CPU for nearly 3 years. I can't say the same about my GPU; it changes every year.

When it comes to Crysis, yes, it did make a difference, and I'm sure at more intense and demanding parts of the game it would be even more noticeable. Far Cry 2 had an even bigger impact as well.

Actually, the CPU is getting more important in some games. You need to check out the CPU charts for many games, because in some cases they show a wider spread than the GPUs do. Heck, even games like FEAR 2 show massive improvements from having a decent CPU. This year's high-end GPUs are going to really need some CPU power to keep them fed, as even current GTX 285 SLI setups can be bottlenecked without a fast overclocked quad-core CPU.

Your CPU was VERY good when it came out, much like the 8800 GTX is still a decent video card. Those with slower X2 CPUs are sometimes already getting half the performance of a good Core 2.

It's all about balance, and the CPU is still a very important part of the equation. Minimum framerates can kill a game, and if the CPU is the culprit then turning down settings isn't going to help.

WTF are you talking about? My benchmarks showed only 12% better frames from a 33% clock-rate difference when comparing at the desired settings. There are CPU-intensive games out there, mostly strategy, but the majority of games are GPU limited more than anything else. A GPU will always be a better upgrade for gaming than a CPU, as long as the CPU isn't ancient.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Leyawiin
O.P., if you're still reading: I bought a GTX 260 (factory OC'd at 666MHz) to replace my 8800 GTX. My CPU is an Athlon X2 6400+ and I have a 1680x1050 monitor. My 3DMark06 score went from 11250 to 12350 - nothing earth-shattering. In real-world gaming - WoW, LotRO, Oblivion, Fallout 3, etc. - the FPS difference was negligible. I'm not saying to buy something else - the GTX 260 is a fine card at a good price - but you most likely won't make full use of its power. Compared to what you're using it will be a huge difference, though. For me it was a waste of money.

Probably a sidegrade. You gain a resolution step or some AA settings; that's about it. The GTX series isn't as much faster as some people make it out to be, but the OP is upgrading from an X1950 XTX.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: Leyawiin

O.P., if you're still reading: I bought a GTX 260 (factory OC'd at 666MHz) to replace my 8800 GTX. My CPU is an Athlon X2 6400+ and I have a 1680x1050 monitor. My 3DMark06 score went from 11250 to 12350 - nothing earth-shattering. In real-world gaming - WoW, LotRO, Oblivion, Fallout 3, etc. - the FPS difference was negligible.
That's because 1680x1050 is a low resolution, and you were likely gaming without AA.

If you gamed with high levels of AA then you'd notice a much bigger difference. The whole point of high-end cards is to crank the eye candy while maintaining a playable framerate.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
Originally posted by: Azn
I've been saying it from the beginning of this thread. At the settings a gamer actually plays, the difference from a faster CPU is minimal, as long as you aren't bottlenecked to the point where the CPU isn't strong enough to pull 60fps. This has been proven time and again by dozens of articles available on the web.

The CPU is part of the equation, but it's been getting less and less important in the gaming market.

I've had my CPU for nearly 3 years. I can't say the same about my GPU; it changes every year.

When it comes to Crysis, yes, it did make a difference, and I'm sure at more intense and demanding parts of the game it would be even more noticeable. Far Cry 2 had an even bigger impact as well.

Actually, the CPU is getting more important in some games. You need to check out the CPU charts for many games, because in some cases they show a wider spread than the GPUs do. Heck, even games like FEAR 2 show massive improvements from having a decent CPU. This year's high-end GPUs are going to really need some CPU power to keep them fed, as even current GTX 285 SLI setups can be bottlenecked without a fast overclocked quad-core CPU.

Your CPU was VERY good when it came out, much like the 8800 GTX is still a decent video card. Those with slower X2 CPUs are sometimes already getting half the performance of a good Core 2.

It's all about balance, and the CPU is still a very important part of the equation. Minimum framerates can kill a game, and if the CPU is the culprit then turning down settings isn't going to help.

WTF are you talking about? My benchmarks showed only 12% better frames from a 33% clock-rate difference when comparing at the desired settings. There are CPU-intensive games out there, mostly strategy, but the majority of games are GPU limited more than anything else. A GPU will always be a better upgrade for gaming than a CPU, as long as the CPU isn't ancient.

It depends on what you think is ancient. A 3800 X2 is pretty slow but would be fine with a GPU like a 9600 GT, even though it would hold that back in some games, especially in minimum framerates. Upgrading that same 3800 X2 system to a 4870 or GTX 280 would be almost silly, because the minimum framerates in many newer games wouldn't even budge over the 9600 GT. That person would be at the point of needing a better CPU if they wanted the most out of a high-end GPU. Heck, I kept trying different cards with my 5000 X2 and none of them could give me better minimum framerates in many games than the wimpy 8600 GT could. The PC I have now is decent and allows me to actually get the performance that a good video card is capable of.

FEAR 2 is not all that demanding, but even it gets a 50% performance increase with a GTX 280 when going from a 5000 X2 to an E8400 (which is not very high end anymore). Just think how many people have much worse CPUs than even that. http://www.pcgameshardware.com...tings-compared/?page=3
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
Originally posted by: BFG10K
Originally posted by: Leyawiin

O.P., if you're still reading: I bought a GTX 260 (factory OC'd at 666MHz) to replace my 8800 GTX. My CPU is an Athlon X2 6400+ and I have a 1680x1050 monitor. My 3DMark06 score went from 11250 to 12350 - nothing earth-shattering. In real-world gaming - WoW, LotRO, Oblivion, Fallout 3, etc. - the FPS difference was negligible.
That's because 1680x1050 is a low resolution, and you were likely gaming without AA.

If you gamed with high levels of AA then you'd notice a much bigger difference. The whole point of high-end cards is to crank the eye candy while maintaining a playable framerate.

No, I used full quality on almost every game I play and either 4X or 8X AA & AF. Right now I'm pretty immersed in Fallout 3. I use exactly the same settings with the 8800 GTX and GTX 260 (4X AA, 8X AF, all quality sliders and options maxed) and the gameplay is barely smoother with the GTX 260 - to the point of being almost unnoticeable unless I'm watching very closely. I haven't benchmarked it with FRAPS, but they must be within 5 FPS of each other.

CPU bottlenecking.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
It depends on what you think is ancient. A 3800 X2 is pretty slow but would be fine with a decent GPU like a 9600 GT. Upgrading that same system to a 4870 or GTX 280 would be almost silly, because the minimum framerate in most newer games wouldn't even budge. That person would be at the point of needing a better CPU if they wanted the most out of a high-end GPU. Heck, I kept trying different cards with my 5000 X2 and none of them could give me better minimum framerates in many games than the wimpy 8600 GT could. The PC I have now is good and allows me to actually get the performance that a decent video card is capable of.

FEAR 2 is not all that demanding, but even it gets a 50% performance increase with a GTX 280 when going from a 5000 X2 to an E8400 (which is not very high end anymore). Just think how many people have much worse CPUs than even that. http://www.pcgameshardware.com...tings-compared/?page=3


The X2 3800 was the first dual core from AMD. It was great when it released five years ago; now it's showing its age. It can't be any slower than an E2160. Anyway, the FEAR 2 engine is a rather old engine. At 1600x1200 with no AA it's barely stressing the GTX 280. So is the X2 5000 unplayable? Is the E8400? You obviously play FEAR 2 at 1600x1200 with no AA on a GTX 280, right? Would you play those settings with your GTX 260? I'm guessing not. The differences in those benches shrink considerably once you add AA and maybe run higher resolutions, but you seem to have a hard time understanding how that logic works.

I think you have a habit of replying like you know it all, even providing flawed benchmarks, and arguing until you've finally realized how stupid you sound, and then still arguing. Maybe you should look at the picture you like to post every time I say something that threatens your intelligence. Truth hurts.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
It depends on what you think is ancient. A 3800 X2 is pretty slow but would be fine with a decent GPU like a 9600 GT. Upgrading that same system to a 4870 or GTX 280 would be almost silly, because the minimum framerate in most newer games wouldn't even budge. That person would be at the point of needing a better CPU if they wanted the most out of a high-end GPU. Heck, I kept trying different cards with my 5000 X2 and none of them could give me better minimum framerates in many games than the wimpy 8600 GT could. The PC I have now is good and allows me to actually get the performance that a decent video card is capable of.

FEAR 2 is not all that demanding, but even it gets a 50% performance increase with a GTX 280 when going from a 5000 X2 to an E8400 (which is not very high end anymore). Just think how many people have much worse CPUs than even that. http://www.pcgameshardware.com...tings-compared/?page=3


The X2 3800 was the first dual core from AMD. It was great when it released five years ago; now it's showing its age. It can't be any slower than an E2160. Anyway, the FEAR 2 engine is a rather old engine. At 1600x1200 with no AA it's barely stressing the GTX 280. So is the X2 5000 unplayable? Is the E8400? You obviously play FEAR 2 at 1600x1200 with no AA on a GTX 280, right? Would you play those settings with your GTX 260? I'm guessing not. The differences in those benches shrink considerably once you add AA and maybe run higher resolutions, but you seem to have a hard time understanding how it works.

I think you have a habit of replying like you know it all, even providing flawed benchmarks, and arguing until you've finally realized how stupid you sound. Maybe you should look at the picture you like to post every time I say something that threatens your intelligence. Truth hurts.

Well, I already said FEAR 2 was not all that demanding, but I wanted to show that even at 1680 on max settings the differences between the CPUs are quite large. They didn't even include a lot of the much slower CPUs that many people have.


Well, that screencap of the PM you sent me http://img212.imageshack.us/my...?image=11581240sy6.jpg showed just what kind of person you are. You are pathetic and get your kicks out of being argumentative even when the other person has a valid point. You like to try to feel superior by pointing out flaws in people and what they say. Making ignorant and childish comments and insulting me through a PM just goes to show that you have some real issues that need to be worked out. I'm sure you will be back to get in the last word and perhaps twist what I have said to feed your ego, so good luck with that.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: toyota
Well, I already said FEAR 2 was not all that demanding, but I wanted to show that even at 1680 on max settings the differences between the CPUs are quite large. They didn't even include a lot of the much slower CPUs that many people have.
Your analysis is spot-on; it's blatantly obvious CPUs are significant bottlenecks even with fast single GPUs, and it's even more obvious that there's almost no point at all in going SLI or CF without pairing them with the fastest overclocked CPU solutions available. Jensen was clearly wrong here: the CPU isn't diminishing in importance for games; if anything, he needs faster CPUs more than ever to drive his GPUs.

Even in that FEAR 2 example, it's clearly obvious the CPU benefits and scales as much as a faster GPU, to the point where a faster CPU with a slower GPU results in the same or better frame rates than the faster GPU paired with a slower CPU:

55.6 FPS = 1680x0AA GTX 280 + E6420 @ 2.13GHz
92.9 FPS = 1680x0AA GTX 285 + E8400 @ 3.6GHz

66.5 FPS = 1680x0AA 8800 Ultra + E8400 @ 3.6GHz
56.1 FPS = 1680x4AA 8800 Ultra + E8400 @ 3.6GHz

It's blatantly obvious what's going on here: there's heavy CPU bottlenecking in modern games, to the point that you absolutely need a faster CPU to maximize the performance of single-GPU solutions. It's so prevalent that a slower GPU will outperform a faster GPU if the slower GPU is paired with a faster CPU. It's also obvious that CPU bottlenecking is still an issue at higher resolutions with AA, as slow CPUs result in a lower max FPS than the potential FPS with faster CPUs, even with AA enabled.

GTA4 - 13 CPU round-up

COD4 + GRiD - Intel CPU Clock for Clock Comparison @ 2GHz

COD5 - 12 Intel and AMD CPUs

Far Cry 2 - various speeds

Left 4 Dead - various speeds

Fallout 3

Most of those reviews are done at low resolutions without AA; however, they clearly show scaling with faster CPUs on fast single-GPU solutions. The most important thing to take away from these graphs, if you have a slower CPU, is that if you're seeing 30-40 FPS and you think that's low, upgrading your GPU may not increase your FPS at all. You can upgrade your GPU and crank AA up to 4x or 8x and max out all settings with minimal penalty, but you may still see the same low FPS and, as Toyota said, lower minimums, because you are CPU bottlenecked.
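As a rough illustration of that argument, here is a minimal sketch of the usual bottleneck model (the function name and the FPS figures are invented for illustration and are not taken from the linked reviews):

```python
# Minimal sketch of the bottleneck argument above: the frame rate you actually
# see is roughly capped by whichever limit is lower, the CPU's or the GPU's.
# All numbers here are made up purely for illustration.

def effective_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """Crude model: the slower of the two stages sets the pace."""
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_limit = 40.0                     # what a slow CPU can push, regardless of GPU
old_gpu, new_gpu = 45.0, 90.0        # GPU-side limits before and after an upgrade

print(effective_fps(cpu_limit, old_gpu))   # 40.0
print(effective_fps(cpu_limit, new_gpu))   # still 40.0 -> the GPU upgrade buys nothing
```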

It's amazing that some people still refuse to acknowledge these facts when they're readily available in published benchmarks and easily tested and verified. Again, Toyota provided some clear examples in Crysis, which is easily one of the most consistent and easily replicated benchmarks around. CPU speed and multi-threading are going to become increasingly important, as has already been shown by both Nvidia's and ATI's focus on recent performance boosts from multi-threaded drivers in DX10/Vista, benefits that Derek mentioned would be fully realized with DX11.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Leyawiin
No, I used full quality on almost every game I play and either 4X or 8X AA & AF. Right now I'm pretty immersed in Fallout 3. I use exactly the same settings with the 8800 GTX and GTX 260 (4X AA, 8X AF, all quality sliders and options maxed) and the gameplay is barely smoother with the GTX 260 - to the point of being almost unnoticeable unless I'm watching very closely. I haven't benchmarked it with FRAPS, but they must be within 5 FPS of each other.

CPU bottlenecking.
Yep, your results are consistent with published benches from around the world; Fallout 3 is heavily CPU bottlenecked, or the engine just doesn't scale well enough to generate more FPS. You can see this clearly as the fastest overclocked Core i7 systems with 2/3/4-GPU SLI/CF configurations all log-jam around the same FPS; you'll see it's consistently between 70-80 FPS average regardless of the site, as long as they're using the fastest platform available - Core i7.

You'll also see the fastest single GPUs scale well with these faster CPUs until they start dropping off around 1920 with 4xAA or higher. Obviously, spending more on a GPU when you have a slower CPU doesn't make any sense when it won't provide any benefit, especially when you can increase performance more by simply overclocking or upgrading your CPU.

Tech Report Fallout 3
Bit-Tech Fallout 3

You can compare to pretty much any site and you'll see similar FPS ranges with Core i7, with a significant drop in some games on slower architectures like Core 2 or Phenom II.


 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
The biggest benefit I saw in my games' performance recently was getting a hardware sound card (I use Win XP). It took a big load off that 3.2 GHz Windsor CPU (which can't really overclock anyway). The only game I benchmarked was Oblivion after installing it, and my FPS went up anywhere from 5 to 15 depending on whether I was around a crowd of NPCs or out in the woods.

The more I think about it, the O.P. could spend $50-75 less and get the same performance (more or less) if he doesn't want to OC his CPU. 9800 GTX+ or HD 4850.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: chizow
Originally posted by: toyota
Well, I already said FEAR 2 was not all that demanding, but I wanted to show that even at 1680 on max settings the differences between the CPUs are quite large. They didn't even include a lot of the much slower CPUs that many people have.
Your analysis is spot-on; it's blatantly obvious CPUs are significant bottlenecks even with fast single GPUs, and it's even more obvious that there's almost no point at all in going SLI or CF without pairing them with the fastest overclocked CPU solutions available. Jensen was clearly wrong here: the CPU isn't diminishing in importance for games; if anything, he needs faster CPUs more than ever to drive his GPUs.

Even in that FEAR 2 example, it's clearly obvious the CPU benefits and scales as much as a faster GPU, to the point where a faster CPU with a slower GPU results in the same or better frame rates than the faster GPU paired with a slower CPU:

55.6 FPS = 1680x0AA GTX 280 + E6420 @ 2.13GHz
92.9 FPS = 1680x0AA GTX 285 + E8400 @ 3.6GHz

66.5 FPS = 1680x0AA 8800 Ultra + E8400 @ 3.6GHz
56.1 FPS = 1680x4AA 8800 Ultra + E8400 @ 3.6GHz

It's blatantly obvious what's going on here: there's heavy CPU bottlenecking in modern games, to the point that you absolutely need a faster CPU to maximize the performance of single-GPU solutions. It's so prevalent that a slower GPU will outperform a faster GPU if the slower GPU is paired with a faster CPU. It's also obvious that CPU bottlenecking is still an issue at higher resolutions with AA, as slow CPUs result in a lower max FPS than the potential FPS with faster CPUs, even with AA enabled.

GTA4 - 13 CPU round-up

COD4 + GRiD - Intel CPU Clock for Clock Comparison @ 2GHz

COD5 - 12 Intel and AMD CPUs

Far Cry 2 - various speeds

Left 4 Dead - various speeds

Fallout 3

Most of those reviews are done at low resolutions without AA; however, they clearly show scaling with faster CPUs on fast single-GPU solutions. The most important thing to take away from these graphs, if you have a slower CPU, is that if you're seeing 30-40 FPS and you think that's low, upgrading your GPU may not increase your FPS at all. You can upgrade your GPU and crank AA up to 4x or 8x and max out all settings with minimal penalty, but you may still see the same low FPS and, as Toyota said, lower minimums, because you are CPU bottlenecked.

It's amazing that some people still refuse to acknowledge these facts when they're readily available in published benchmarks and easily tested and verified. Again, Toyota provided some clear examples in Crysis, which is easily one of the most consistent and easily replicated benchmarks around. CPU speed and multi-threading are going to become increasingly important, as has already been shown by both Nvidia's and ATI's focus on recent performance boosts from multi-threaded drivers in DX10/Vista, benefits that Derek mentioned would be fully realized with DX11.

ROFL... You even acknowledge they're running lower resolutions with no AA, where the CPU is stressed a lot more. Who plays at these settings? Look at how the results in your own links shrink as you turn up to more reasonable settings with the GPUs tested.

A 4 fps difference in Crysis from 2.13GHz to 3.15GHz @ 1680x1050 high quality, with errors in the minimum frame rates. That's a 10% difference from a 30-some-odd percent difference in clock rate. That's huge. Might as well overclock your GPU instead of spending $200 more on a processor. :laugh:
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Leyawiin
Originally posted by: BFG10K
Originally posted by: Leyawiin

O.P., if you're still reading: I bought a GTX 260 (factory OC'd at 666MHz) to replace my 8800 GTX. My CPU is an Athlon X2 6400+ and I have a 1680x1050 monitor. My 3DMark06 score went from 11250 to 12350 - nothing earth-shattering. In real-world gaming - WoW, LotRO, Oblivion, Fallout 3, etc. - the FPS difference was negligible.
That's because 1680x1050 is a low resolution, and you were likely gaming without AA.

If you gamed with high levels of AA then you'd notice a much bigger difference. The whole point of high-end cards is to crank the eye candy while maintaining a playable framerate.

No, I used full quality on almost every game I play and either 4X or 8X AA & AF. Right now I'm pretty immersed in Fallout 3. I use exactly the same settings with the 8800 GTX and GTX 260 (4X AA, 8X AF, all quality sliders and options maxed) and the gameplay is barely smoother with the GTX 260 - to the point of being almost unnoticeable unless I'm watching very closely. I haven't benchmarked it with FRAPS, but they must be within 5 FPS of each other.

CPU bottlenecking.

Fallout 3 isn't as GPU intensive as it's made out to be. At those settings an 8800 GTX will handle that game fine. Not that the GTX 260 is such a huge upgrade from the 8800 GTX to begin with.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
The more I think about it, the O.P. could spend $50-75 less and get the same performance (more or less) if he doesn't want to OC his CPU. 9800 GTX+ or HD 4850.

I've wanted the GTX 260 as I believed it to be powerful enough to run xS modes in the games I play.

That said, here's a Q4 comparison I found: Link

Looks like even a GTX 260 struggles to keep performance playable. Granted, they didn't show what 8xS could get at those settings.

A 4850 does look attractive considering the weight that OpenGL games have on my library and the fact that it can run Adaptive AA in them. I wanted to compensate for that by running the GTX 260 at xS modes, but even they may not have the power to do so. It would hurt the wallet less, but while it provides Adaptive AA it lacks soft particles and better AF (however, only one of the id games I play offers soft particles).

Would an 8800 Ultra be better than a 9800 GTX+?

I don't distribute my demos, but I'll post numbers from the HoC benchmark if I have time.

Awesome, that'll help.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: josh6079
The more I think about it, the O.P. could spend $50-75 less and get the same performance (more or less) if he doesn't want to OC his CPU. 9800 GTX+ or HD 4850.

I've wanted the GTX 260 as I believed it to be powerful enough to run xS modes in the games I play.

That said, here's a Q4 comparison I found: Link

Looks like even a GTX 260 struggles to keep performance playable. Granted, they didn't show what 8xS could get at those settings.

A 4850 does look attractive considering the weight that OpenGL games have on my library and the fact that it can run Adaptive AA in them. I wanted to compensate for that by running the GTX 260 at xS modes, but even they may not have the power to do so. It would hurt the wallet less, but while it provides Adaptive AA it lacks soft particles and better AF (however, only one of the id games I play offers soft particles).

Would an 8800 Ultra be better than a 9800 GTX+?

I don't distribute my demos, but I'll post numbers from the HoC benchmark if I have time.

Awesome, that'll help.

16xAA in Quake 4? Come on. It's an old game, but it's not that old. And it's an FPS; you don't need that much AA to begin with.

If you like turning up AA, the Ultra is a little better than the 9800 GTX+; otherwise the 9800 GTX+ has a little more oomph than the Ultra.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: josh6079
Would an 8800 Ultra be better than a 9800 GTX+?
Only in memory-bandwidth-limited situations (103.7 GB/s vs 70.4 GB/s) and VRAM-limited cases (768MB vs 512MB).
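Those bandwidth figures follow from memory data rate times bus width. A quick check is sketched below; the card specs (roughly 2160 MT/s on a 384-bit bus for the 8800 Ultra, roughly 2200 MT/s on a 256-bit bus for the 9800 GTX+) are quoted from memory, so treat them as assumptions:

```python
# Quick check of the bandwidth figures quoted above.
# Assumed specs: 8800 Ultra ~2160 MT/s GDDR3 on a 384-bit bus,
# 9800 GTX+ ~2200 MT/s on a 256-bit bus.

def mem_bandwidth_gb_s(effective_mt_s: float, bus_width_bits: int) -> float:
    """Bandwidth in GB/s = transfers per second * bytes moved per transfer."""
    return effective_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

print(round(mem_bandwidth_gb_s(2160, 384), 1))  # ~103.7 GB/s (8800 Ultra)
print(round(mem_bandwidth_gb_s(2200, 256), 1))  # ~70.4 GB/s (9800 GTX+)
```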
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Azn
ROFL... You even acknowledge they're running lower resolutions with no AA, where the CPU is stressed a lot more. Who plays at these settings? Look at how the results in your own links shrink as you turn up to more reasonable settings with the GPUs tested.
Yep, and many of them show higher resolutions with AA as well. The point you continually miss, however, is that if you have 40 FPS at 1280 with a slower CPU, like a C2D at 2.13GHz, and you're not happy with those frame rates, you're certainly not going to see higher FPS as you increase resolution and AA if you're already CPU bottlenecked at 1280. This is CPU scaling/bottlenecking basics, though, and something I'm not going to argue with you about.

A 4 fps difference in Crysis from 2.13GHz to 3.15GHz @ 1680x1050 high quality, with errors in the minimum frame rates. That's a 10% difference from a 30-some-odd percent difference in clock rate. That's huge. Might as well overclock your GPU instead of spending $200 more on a processor. :laugh:
Yep, his results are pretty typical of what you'd expect from a title that is mostly GPU limited, with occasional dips in FPS in scenes where the CPU is limiting frame rate. Again, Crysis is one of the most repeatable, even predictable gaming benchmarks around, so questioning his results is laughable, especially when you're unwilling to post any of your own.

As for overclocking, I'm quite sure there's much better value to be gained from overclocking low and mid-range CPUs than GPUs, so there's really no need to spend $200 more on a processor. If anything, you'd put that money towards a higher-end GPU instead, as there's clearly more differentiation in pricing and actual chip/card differences among GPUs than CPUs.