How bottlenecked will this be?


toyota

Lifer
Apr 15, 2001
12,957
1
0
His CPU won’t bottleneck his 5850 if he runs at high enough detail levels (say 1920x1200 with 4xAA) because his GPU will bottleneck him the most, by far. His CPU is roughly equal to mine @ 2 GHz, and I can say that pairing mine @ 2 GHz with a Fermi/5870 would create a massive performance gain over my current GTX285.

Heck, I might even benchmark Fermi with my CPU @ 2 GHz just to prove my point, yet again.

Where are the benchmark numbers? At what settings?

In the meantime, here’s another article where they compared an i7 920 at 2 GHz and 4 GHz with a 5970, and in every case underclocking the CPU by 50% produced little to no performance change.

http://www.legionhardware.com/document.php?id=869&p=2

Incidentally they started off with Batman, and they didn’t even use any AA when they could’ve easily used 2xAA or 4xAA. Also Far Cry 2, Wolfenstein and Crysis are in there too, and their results back my claims.

first off I do see your point that going with a very high end gpu will almost always provide better average framerate and/or allow you to use higher settings. the line has to be drawn somewhere though and a cpu like the 5600 X2 is well below that line in many games with a gpu like the 5850. see if you can comprehend these numbers because if you can then you will see my point which is that a 5600 X2 would hold a 5850 back by quite a bit.

you dont seem to acknowledge that if the cpu is only capable of providing a certain minimum framerate that having an even higher card wont make much difference in playability in most cases. in other words if all you are getting is 15-20 fps for a minimum framerate then why buy a really high end gpu that wont help in that respect? if a 5600 X2 can hold my gtx260 to these levels then a 5850 would be an even bigger waste for many games.
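To put that in concrete terms, here is a rough toy model (illustration only; the numbers are invented, not from any review): the CPU and GPU each impose their own framerate ceiling, the game runs at whichever ceiling is lower, and a faster card only pays off once the CPU ceiling itself is raised.

# Toy model of a CPU/GPU bottleneck - illustrative only, the numbers are made up.
# Assumes whichever side takes longer per frame caps the framerate.
def effective_fps(cpu_ceiling, gpu_ceiling):
    """The game can only run as fast as the slower of the two ceilings."""
    return min(cpu_ceiling, gpu_ceiling)

cpu_ceiling = 20                      # an old dual core managing ~20 fps in heavy scenes
for gpu_ceiling in (35, 60, 90):      # progressively faster graphics cards
    print(f"card capable of {gpu_ceiling} fps -> game runs at "
          f"{effective_fps(cpu_ceiling, gpu_ceiling)} fps")
# every card prints 20 fps: past a point the faster gpu buys nothing until
# the cpu ceiling itself goes up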


RE 5 1920x1080 4x AA fixed bench

cpu 3.48 gpu 620/1296/2160
51.8 fps

cpu 3.48 gpu 465/971/1620
51.0 fps


cpu 3.16 gpu 620/1296/2160
49.6 fps

cpu 3.16 gpu 465/971/1620
49.4 fps


cpu 1.80 gpu 620/1296/2160
31.5 fps

cpu 1.8 gpu 465/971/1620
31.8 fps
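A quick way to read a grid like that (a sketch over the numbers above; the helper function is just for illustration): compare how far the fps moves when only the GPU clock changes versus when only the CPU clock changes.

# RE5 fixed-benchmark results posted above: (cpu GHz, gpu core MHz) -> fps
results = {
    (3.48, 620): 51.8, (3.48, 465): 51.0,
    (3.16, 620): 49.6, (3.16, 465): 49.4,
    (1.80, 620): 31.5, (1.80, 465): 31.8,
}

def spread(fps_list):
    """Gap between the best and worst fps in a group of runs."""
    return max(fps_list) - min(fps_list)

for cpu in (3.48, 3.16, 1.80):   # hold CPU clock fixed, vary GPU clock
    vals = [fps for (c, g), fps in results.items() if c == cpu]
    print(f"cpu {cpu} GHz: gpu clock moves fps by {spread(vals):.1f}")

for gpu in (620, 465):           # hold GPU clock fixed, vary CPU clock
    vals = [fps for (c, g), fps in results.items() if g == gpu]
    print(f"gpu {gpu} MHz: cpu clock moves fps by {spread(vals):.1f}")
# gpu clock moves the result by under 1 fps, cpu clock by roughly 20 fps -
# the signature of a benchmark that is almost entirely cpu limited here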

I am sure you will come up with some excuse to dispute those results. oh and just for the heck of it I went back and ran RE5 with my cpu at 3.48. I would get an even bigger boost with a nice quad core in this game. for RE5 my much slower gtx260 and any decent Core 2 Duo or better would trounce a 5600 X2 and 5850 setup.

EDIT: reran the RE5 numbers since I had vsync on by accident. I also ran 4x AA just to make it more gpu intensive so maybe I will get less excuses. I doubt it though.



Red Faction Guerrilla 1920x1080 highest settings no AA just running a fraps loop with no enemies and little destruction

cpu 3.16 gpu 666/1392/2200
Min, Max, Avg
21, 40, 30.310

cpu 1.80 gpu 666/1392/2200
Min, Max, Avg
12, 35, 24.862

that was not even doing any real destruction and with zero enemies on the screen to keep runs more consistent. with heavy action the framerates even dipped into the single digits with cpu at 1.8. yes I didnt use any AA because I left the game on the settings I normally use. the point is to see how much of a loss using the 5600 X2 would cause. again turning on AA doesnt mean squat when the game is borderline playable during action because of the cpu anyway.


Batman highest settings, physx high and 2x AA fraps run

cpu 3.16 gpu 666/1392/2200
Min, Max, Avg
23, 70, 43.122

cpu 1.80 gpu 666/1392/2200
Min, Max, Avg
16, 59, 35.975

having physx on high along with 2x AA and the other settings should make this very gpu limited. as you can see though, having the cpu at a speed equivalent to his 5600 X2 really brought the minimum framerate down to very sluggish levels.
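To quantify the hit (plain arithmetic on the rounded figures above, nothing beyond the posted numbers): the percentage lost going from 3.16 to 1.8 GHz is much larger for the minimums than for the averages.

# Percentage drop from the 3.16 GHz run to the 1.8 GHz run, using the
# (rounded) min/avg figures posted above.
posted = {
    "Red Faction Guerrilla":      {"min": (21, 12), "avg": (30.3, 24.9)},
    "Batman (PhysX high, 2x AA)": {"min": (23, 16), "avg": (43.1, 36.0)},
}

for game, stats in posted.items():
    for metric, (at_316, at_18) in stats.items():
        loss = (at_316 - at_18) / at_316 * 100
        print(f"{game}: {metric} fps falls {loss:.0f}% at 1.8 GHz ({at_316} -> {at_18})")
# minimums fall roughly 30-43% while averages fall about 17-18%, which is
# why the playability argument here centers on minimum framerate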

anybody that thinks a 5600 X2 is decent enough to match it with a 5850 obviously doesnt care about getting the most out of their systems. a cpu like that will keep minimums fairly low and 5850 doesnt help you when its the cpu limiting how much playability and usefulness the card can deliver.

the OP asked if a 5600 X2 would bottleneck a 5850 and the honest answer is yes, and IMO quite badly in cpu intensive games. he was smart enough to realize that upgrading to a modern i5 would be the best overall move when running a card like that.

if you want to spend over 300 bucks to turn on more AA while still getting shitty low minimum framerates compared to what you could be getting with a more appropriate cpu then knock yourself out.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
No, not at all, because some of you still don't understand that bottlenecking is application dependent.

I don't see that it does. At higher resolutions the bottleneck shifts to the GPU, at least until GPUs have enough breathing room for the CPU to take over that bottleneck. It just hasn't happened yet, at least as resolutions go higher and higher over time.

It's application specific - bad console ports, MMOs and such. We already went through this.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
its not a theory at all. if I run my cpu at 1.8 its as fast as his 5600 x2. when I do that even at 1920 with AA it results in noticeably lower minimum and lower average framerates compared to using it at 3.16. heck some very cpu dependent games even become almost unplayable at times even with my cpu at 1.8. also SEVERAL games are FASTER with my gpu at 465 and cpu at 3.16 than with my gpu at 666 and cpu at 1.8 and thats a FACT.

common sense will tell you that if I am getting 15 fps for a minimum framerate because of my cpu then using a 5850 isnt going to change that. a 5600 X2 would limit a 5850 especially in the minimum framerate department regardless of what you think.

those i7 results you are looking at dont remotely compare to his 5600 X2. an i7 even at lower speeds can handle high end and multi gpus much better than even the Phenom 2 X4. an old 5600 X2 would pale in comparison if it was included.

Common sense? You were trying to say a faster CPU and 8600gt would let you play Crysis at high settings in your previous endeavors.

Now some of us have been around since the beginning of 3D, have read hundreds if not thousands of reviews over the past decades, and have done some of our own reviews, and what we've seen is that the CPU bottleneck applies at lower resolutions, or in some cases to the likes of bad console ports and some RTS, MMO, etc., but in most situations THAT bottleneck shifts to the GPU when you raise the resolution.

The benches don't lie. From 4ghz to 2ghz CPU bottleneck becomes moot in higher resolutions in a GPU limited situation. That was with a 5970 too. This has been done before to death and proven countless times.
 

SRoode

Senior member
Dec 9, 2004
243
0
0
Toyota,

Is the game that you saw the most difference in (RE5) a console port (I'm not sure)?

It would be interesting for you to run your test again with a 5850 instead of the 260. I wonder if the minimum frame rates would increase. What I mean is, are you sure that your min frame rate is CPU limited? Try the test again with a more powerful video card to see.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
All his 3 examples are console ports.

Batman AA, while it doesn't seem to be a bad console port has physX on high.

This is what I expect for future PC resource scaling on games:

http://www.pcgameshardware.com/aid,...rks-75-percent-boost-for-quad-cores/Practice/

with more cores providing a bigger boost.

But it may as well end like this:

http://www.pcgameshardware.com/aid,...nom-strong-Update-Lynnfield-results/Practice/

with speed of 2 cores and cache size providing the bigger boost.

In the first situation GPU will be of more importance and faster GPUs will always provide the most increase. In the second situation CPU will limit the benefits of faster GPUs.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Common sense? You were trying to say a faster CPU and 8600gt would let you play Crysis at high settings in your previous endeavors.

Now some of us have been around since the beginning of 3D, have read hundreds if not thousands of reviews over the past decades, and have done some of our own reviews, and what we've seen is that the CPU bottleneck applies at lower resolutions, or in some cases to the likes of bad console ports and some RTS, MMO, etc., but in most situations THAT bottleneck shifts to the GPU when you raise the resolution.

The benches don't lie. From 4ghz to 2ghz CPU bottleneck becomes moot in higher resolutions in a GPU limited situation. That was with a 5970 too. This has been done before to death and proven countless times.
I am not talking about the 8600gt. I am talking about the current topic. also your example was a four core i7 and has nothing to do with the ops cpu. that cpu even lowered to 2.0 would run rings around the ops 2.8 5600 X2.

does it look like I am using low resolutions? perhaps you should open your eyes because I just showed you THREE clear examples of how a cpu like his easily holds back my gtx260 even at 1920. that holds true for many other games too but why waste time posting those numbers either.

just look at the RE 5 numbers alone. what does that tell you when a cpu is only letting the game achieve 27 fps?? if you doubt my numbers run the fixed bench for yourself.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
All his 3 examples are console ports.

Batman AA, while it doesn't seem to be a bad console port has physX on high.
those are REAL games run at REAL settings. almost every game out there is a console port so you are going to dismiss 90% of games. I guess I should certainly not add GTA 4, Ghostbusters and Far Cry 2 to the list either? oh and lets leave out all those games like Dragon Age that would show a massive improvement with having a cpu better than a 5600 X2. I guess I should only test STALKER or maybe Crysis since those seem to be the most gpu dependent games.

btw I left PhysX on high for Batman so as to put a huge load on the gpu. if I didnt do that then somebody would bitch that the test was too cpu dependent. I could test almost every popular game out there and all it will be is nothing but excuses from most of the people on here. I really dont feel like wasting any more of my time on this. the op asked if a 5600 X2 would noticeably bottleneck a 5850 and the answer is yes, and quite badly in several games.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Toyota,

Is the game that you saw the most difference in (RE5) a console port (I'm not sure)?

It would be interesting for you to run your test again with a 5850 instead of the 260. I wonder if the minimum frame rates would increase. What I mean is, are you sure that your min frame rate is CPU limited? Try the test again with a more powerful video card to see.
as you can see when I overclocked the cpu to 3.48 I picked up even more performance in RE 5. also while having my cpu at 1.8 I still got the same fps whether my gpu was at 465 or 666 because I was nearly 100% cpu limited at that point in that game. the other games certainly gave up quite a bit of performance too but nothing like RE 5.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Toyota's results seemed a little off to me. Looking at the minimum, max and average it just didn't add up. I benched Batman and RE5 since I have those games.

I'm running my system below, and I've clocked it accordingly - my C2D is the older E6300 with 2MB cache on a first-generation C2D-supported motherboard - to match the OP's CPU performance level. I did multiple runs just to make sure what I'm getting isn't abnormal.

Batman highest settings @ 1920x1080 4xAA physx Normal
315x7=2205mhz

1st Run
Min = 17
Max = 84
avg = 46

2nd run
Min = 24
Max = 61
avg = 45

3rd run
Min = 26
Max = 64
Avg = 45

1st run was an anomaly where min and max seems way off but the avg frame rates stayed around the same.

435x7=3045mhz
1st run
Min = 27
Max = 64
avg = 47

2nd run
Min = 25
Max = 66
avg = 47

3rd run
Min = 27
Max = 63
avg = 47

Consistent results. When it's all said and done we are talking about a 2fps difference from 2205MHz to 3045MHz.

Resident Evil 5 @ 1920x1080 4xAA
Fixed bench ( this is where you have 30-50 zombies on screen )
315x7=2205mhz
28.1fps

435x7=3045mhz
32.1fps

It seems to me this portion of the test is testing your CPU, not your GPU. I lowered my settings to 2xAA @ 3045MHz and still only got 31.6 FPS. This is the worst case scenario in this game. Now I play this game @ 1920x1080 16QAA and I have no problems running it at this setting and still get 45-60fps avg in the game.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I am not talking about the 8600gt. I am talking about the current topic. also your example was a four core i7 and has nothing to do with the ops cpu. that cpu even lowered to 2.0 would run rings around the ops 2.8 5600 X2.

does it look like I am using low resolutions? perhaps you should open your eyes because I just showed you THREE clear examples of how a cpu like his easily holds back my gtx260 even at 1920. that holds true for many other games too but why waste time posting those numbers either.

just look at the RE 5 numbers alone. what does that tell you when a cpu is only letting the game achieve 27 fps?? if you doubt my numbers run the fixed bench for yourself.

You wanted to talk about common sense. I'm just pointing out where your previous common sense led you.

Look, I already did the benchmarks on top of your Batman and RE5. Your benches never add up; you point out an anomaly and that is the basis of your argument.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You wanted to talk about common sense. I'm just pointing out where your previous common sense led you.

Look, I already did the benchmarks on top of your Batman and RE5. Your benches never add up; you point out an anomaly and that is the basis of your argument.
and your RE 5 numbers dont add up at all. you claim to only get a 4 fps boost from 2.2 to 3.0 and that is not accurate. you only get 32fps while I am getting 46fps with just a slightly faster cpu speed than yours. not to mention your video card is faster.

as for Batman I will go back and rerun it with those exact settings you are using. btw if you are using the built in benchmark for Batman it is not very accurate and will show about the same minimums even when running very fast setup. I use fraps and its way more accurate.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
and your RE 5 numbers dont add up at all. you claim to only get a 4 fps boost from 2.2 to 3.0 and that is not accurate. you only get 32fps while I am getting 46fps with just a slightly faster cpu speed than yours. not to mention your video card is faster.

Your results seem to double as you raise clock speeds which is HIGHLY UNLIKELY.


as for Batman I will go back and rerun it with those exact settings you are using. btw if you are using the built in benchmark for Batman it is not very accurate and will show about the same minimums even when running very fast setup. I use fraps and its way more accurate.

Fraps is more accurate than the built-in benchmark, when your hard drive is thrashing all around and it's using CPU cycles to give you the results? I find that hard to believe.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Your results seem to double as you raise clock speeds which is HIGHLY UNLIKELY.




Fraps is more accurate than the built-in benchmark, when your hard drive is thrashing all around and it's using CPU cycles to give you the results? I find that hard to believe.



the cpu makes a massive difference in the fixed benchmark and many cpus are down in the low 30s and 20s in that cpu review on pcgh. yes I know they ran theirs at 1680 but that doesnt mean the cpus getting well into the 50-60fps or faster are going to magically come down to the 20-30fps range by going to 1920. your numbers are the ones that dont make sense for that benchmark. I also have no idea how you get those numbers in Batman but in my experience the built in benchmark can be all over the place even doing back to back runs. anyway here are the numbers I ran for Batman and RE 5 while the forums were down for a few hours.


Batman very high settings normal physx 1920x1080 4x AA

cpu 3.16 gpu 620/1296/2160
Min, Max, Avg
30, 41, 35.120

cpu 1.8 gpu 620/1296/2160
Min, Max, Avg
23, 31, 26.254

I used fraps because the Batman benchmark occasionally screws up for me and will show insane maximum framerates well over 100fps on some runs. btw fraps doesnt really hit the hard drive and uses 1% of the cpu while benchmarking. now recording with fraps does but benchmarking absolutely does NOT. in RE 5 I ran fraps benchmarking in the background and it had zero effect and I got the same benchmark score from the game whether fraps was on or not. again only recording not benchmarking with fraps affects any game performance.

I did 5 fraps runs at each cpu speed with only one enemy on the screen for consistency. I threw out the lowest and the highest runs and averaged each (min, max, avg) score individually. now actual gameplay with several enemies was pretty sluggish at times with the cpu at 1.8 but it was too hard to get consistent enough results on each run.
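For what it's worth, that throw-out-the-extremes averaging is easy to script; here is one reading of the procedure as a small sketch (the run values are placeholders, not the actual logs):

# Rank the runs by average fps, discard the slowest and fastest run, then
# average min/max/avg over what remains. Run values below are placeholders.
def summarize(runs):
    """runs: list of (min, max, avg) tuples from FRAPS benchmark output."""
    if len(runs) < 3:
        raise ValueError("need at least 3 runs to discard both extremes")
    kept = sorted(runs, key=lambda r: r[2])[1:-1]   # drop the two extreme runs
    mins, maxs, avgs = zip(*kept)
    n = len(kept)
    return sum(mins) / n, sum(maxs) / n, sum(avgs) / n

runs = [(29, 40, 34.8), (30, 41, 35.1), (31, 43, 35.6), (27, 38, 33.9), (30, 42, 35.2)]
lo, hi, avg = summarize(runs)
print(f"min {lo:.1f}  max {hi:.1f}  avg {avg:.1f}")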



RE 5 1920x1080 4x AA fixed bench

cpu 3.48 gpu 620/1296/2160
51.8 fps

cpu 3.48 gpu 465/971/1620
51.0 fps


cpu 3.16 gpu 620/1296/2160
49.6 fps

cpu 3.16 gpu 465/971/1620
49.4 fps


cpu 1.80 gpu 620/1296/2160
31.5 fps

cpu 1.8 gpu 465/971/1620
31.8 fps


yes in this benchmark I did directly compare gpu to cpu. not only does this game take huge advantage of a faster cpu the gpu speed on the gtx260 seemed to matter very little beyond 465. and yes I have screenshots if you would like them. I also played the actual game and tbh at 1.8 it wasnt even noticeable until there was a bunch of enemies and even then not really a big problem. makes me wonder what that fixed benchmark is supposed to represent.

btw the numbers are a few fps better than yesterday even with 4x AA. I had vsync on in RE 5 in the other numbers I posted yesterday so I will go edit that. it didnt make any difference though since all the numbers were still consistent. again though i will edit those results just for accuracy.

I also decided to stick with 620/1296/2160 this time since those are my stock factory settings. I lowered to stock just to make sure it wasnt my gpu causing any issues with Batman's built in benchmark and left it there. also was getting tired of overclocking and underclocking everything. lol
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
those are REAL games run at REAL settings. almost every game out there is a console port so you are going to dismiss 90% of games. I guess I should certainly not add GTA 4, Ghostbusters and Far Cry 2 to the list either? oh and lets leave out all those games like Dragon Age that would show a massive improvement with having a cpu better than a 5600 X2. I guess I should only test STALKER or maybe Crysis since those seem to be the most gpu dependent games.

First, Dragon Age - shows increases by number of cores. You don't see 30% increase from a E6600 to an Athlon X2 6000+.

Farcry 2 - http://www.pcgameshardware.com/aid,663817/Far-Cry-2-GPU-and-CPU-benchmarks/Reviews/?page=2 . Again more of the same. Main bottleneck GPU.

I could test almost every popular game at there and all it will be is nothing but excuses from most of the people on here. I really dont feel like wasting any more of my time on this. the op asked if a 5600 X2 would noticeably bottleneck a 5850 and the answer is yes and quite badly in several games.
And in even more games the bottleneck wouldn't be noticeable.

My Resident Evil 5 results, 1920x1080 4xAA all max motion blur on

Athlon x2 6000 @3.0 GHz with 4850 720/1000
30.6 FPS

CPU @2.25GHz 4850 720/1000
22.2 FPS

CPU @3.0 GHz 4850 540/750
26.3 FPS
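A rough sensitivity check on those three results (plain arithmetic, not part of the original post): both clock cuts happen to be 25%, so comparing the fps loss against the clock loss shows which side this benchmark tracks.

# Both the CPU cut (3.0 -> 2.25 GHz) and the GPU cut (720 -> 540 MHz) are 25%.
# Compare how much fps fell relative to that 25% in each case.
baseline_fps = 30.6

cases = {
    "cpu 3.0 -> 2.25 GHz": {"clock_cut": (3000 - 2250) / 3000, "fps": 22.2},
    "gpu 720 -> 540 MHz":  {"clock_cut": (720 - 540) / 720,    "fps": 26.3},
}

for name, c in cases.items():
    fps_cut = (baseline_fps - c["fps"]) / baseline_fps
    print(f"{name}: clock -{c['clock_cut']:.0%}, fps -{fps_cut:.0%}, "
          f"ratio {fps_cut / c['clock_cut']:.2f}")
# fps drops ~27% for the 25% cpu cut but only ~14% for the 25% gpu cut, so
# even on a 4850 this fixed benchmark leans more on the cpu than the gpu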


Don't have Batman AA. Is any benchmark tool available? Or does the demo have the bench tool?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
First, Dragon Age - shows increases by number of cores. You don't see 30% increase from a E6600 to an Athlon X2 6000+.

Farcry 2 - http://www.pcgameshardware.com/aid,663817/Far-Cry-2-GPU-and-CPU-benchmarks/Reviews/?page=2 . Again more of the same. Main bottleneck GPU.


And in even more games the bottleneck wouldn't be noticeable.

My Resident Evil 5 results, 1920x1080 4xAA all max motion blur on

Athlon x2 6000 @3.0 GHz with 4850 720/1000
30.6 FPS

CPU @2.25GHz 4850 720/1000
22.2 FPS

CPU @3.0 GHz 4850 540/750
26.3 FPS


Don't have Batman AA. Any benchmark tool available. Or does the demo has the bench tool?

well my point was not comparing my cpu to a quad. I was just showing that SOME games, like it and GTA 4, will take advantage of more cores so in those cases thats even more performance down the drain. with Far Cry 2 if you throw in a 5850 instead of that 4870 he will still be missing out on at least 25-30% of what the faster cpus can deliver compared to his 5600 X2. now other games will certainly use the faster clock for clock cpu speed of the i5 the op is getting. only a game like STALKER will not change much at 1920 when going from a 5600 X2 to a E8500 or even i7/i5. Crysis also would move very little but would certainly still give up a few to several fps especially for the min framerate. again its NOT just about average framerate. its the minimum framerate and while using a gpu like the 5850 with a 5600 X2 he will be missing out on quite a bit of playability that the 5850 is capable of.


look at your RE numbers. with your X2 cpu at 3.0 and your 4850 overclocked you are almost even with my cpu at 1.8 and my 192sp gtx260 severely underclocked. there is no doubt that your 4850 at 720 would be pretty close to a 192sp gtx260 at 465 and your cpu at 3.0 is pretty close to mine at 1.8. so if anything that backs up what I am saying. now once I increase the gpu speed NOTHING changes. about the level of 465 on my gtx260 is where the cpu starts to become the limiter, at least for the fixed benchmark. when I increase the cpu speed the framerates go way up. this is noticeable in the actual game too especially when many enemies are on the screen.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
again its NOT just about average framerate. its the minimum framerate and while using a gpu like the 5850 with a 5600 X2 he will be missing out on quite a bit of playability that the 5850 is capable of.

I'm collecting data on several games. Now that we have determined that an Athlon X2 @3.0 GHz is at least around an E6400, I'll try to post that data once it's presentable and retested.

But I'll say, the data for this game is quite different. On the other hand the non-fixed benchmark is over 50 FPS, and the minimums aren't bad either.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I'm collecting data on several games. Now that we have determined that an Athlon X2 @3.0 GHz is at least around an E6400, I'll try to post that data once it's presentable and retested.

But I'll say, the data for this game is quite different. On the other hand the non-fixed benchmark is over 50 FPS, and the minimums aren't bad either.
it is a strange benchmark for sure. I tested the real game too, just like I always do. it was perfectly playable with the E8500 at 1.8 but it was a tad sluggish with many enemies. still nowhere near as bad as the benchmark would indicate.

when I test things I see if a game feels different and then I turn on fraps to see what the actual difference is. the funny thing is I try not to test with tons of enemies in games and thats ironically the best time to show the difference. anyway my conclusions are partly based on how much actual performance is being given up. if its more than 20-25% with my gtx260 then that will be even more "potential" performance down the drain using a really high end card. the rest of my conclusion is based on whether having the cpu at a certain speed affects the actual gameplay.

with my cpu at 1.8, which is basically like a 2.6 5600 X2 in modern games, I give up quite a bit in newer games compared to running the cpu at stock. now most games are perfectly playable but several like Red Faction Guerrilla, GTA 4, Ghostbusters and some others are borderline. to me its not smart to spend over 300 bucks on a gpu that will NOT improve those borderline games because its the cpu that is responsible. again the excuse of using more AA seems silly in those cases. I would much rather have much higher minimums and better playability than turn on more AA because my cpu is too slow to even keep up in those borderline games.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
If a person buys HD5850 (with older CPU) then why not consider using Eyefinity instead of the motherboard and CPU swap/hassle?

Maximum value per dollar per frame rate (with single monitor)? Or just settle for having the game play smooth enough with better FOV?
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
it is a strange benchmark for sure. I tested the real game too, just like I always do. it was perfectly playable with the E8500 at 1.8 but it was certainly a tad sluggish with many enemies.

when I test things I see if a game feels different and then I turn on fraps to see what the actual difference is. the funny thing is I try not to test with tons of enemies in games and thats ironically the best time to show the difference. anyway my conclusions are partly based on how much actual performance is being given up. if its more than 20-25% with my gtx260 then that will be even more "potential" performance down the drain using a really high end card. the rest of my conclusion is based on whether having the cpu at a certain speed affects the actual gameplay.

with my cpu at 1.8, which is basically like a 2.6 5600 X2 in modern games, I give up quite a bit in newer games compared to running the cpu at stock. now most games are perfectly playable but several like Red Faction Guerrilla, GTA 4, Ghostbusters and some others are borderline. to me its not smart to spend over 300 bucks on a gpu that will NOT improve those borderline games because its the cpu that is responsible. again the excuse of using more AA seems silly in those cases. I would much rather have much higher minimums and better playability than turn on more AA because my cpu is too slow to even keep up in those borderline games.

Precisely. I don't even think anyone has been trying to argue that the CPU is more important than GPU, just that although the GPU is most critical, ignoring CPU speed is retarded. Expecting a ~4 year old CPU to be competitive with brand new games with just a new video card is not a great idea. Keeping the game from chugging during difficult sections is paramount to maintaining a good time.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
If a person buys HD5850 (with older CPU) then why not consider using Eyefinity instead of the motherboard and CPU swap/hassle?

Maximum value per dollar per frame rate (with single monitor)? Or just settle for having the game play smooth enough with better FOV?
what? a 15-20 fps minimum framerate in some games would be no better with 3 monitors than it would be with 1. makes much more sense to have a fairly balanced platform that can deliver performance without dropping the ball on minimum framerates and overall playability.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Precisely. I don't even think anyone has been trying to argue that the CPU is more important than GPU, just that although the GPU is most critical, ignoring CPU speed is retarded. Expecting a ~4 year old CPU to be competitive with brand new games with just a new video card is not a great idea. Keeping the game from chugging during difficult sections is paramount to maintaining a good time.
thank you :)
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
Precisely. I don't even think anyone has been trying to argue that the CPU is more important than GPU, just that although the GPU is most critical, ignoring CPU speed is retarded. Expecting a ~4 year old CPU to be competitive with brand new games with just a new video card is not a great idea. Keeping the game from chugging during difficult sections is paramount to maintaining a good time.

I think the main question is "What is more representative of games: Resident Evil 5 and GTA IV (where more cores matter less than extra cache http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2 ), or games like Dragon Age (that scales pretty well with number of cores) and FarCry 2?"

Because even considering dual-cores are starting to become obsolete, slapping in an Athlon II X4 and getting great core scaling is very different, cost-wise, from requiring an i5/i7.

EDIT: Actually GTA IV scales well with number of cores and cache.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I think the main question is "What is more representative of games: Resident Evil 5 and GTA IV (where more cores matter less than extra cache http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2 ), or games like Dragon Age (that scales pretty well with number of cores) and FarCry 2?"

Because even considering dual-cores are starting to become obsolete, slapping in an Athlon II X4 and getting great core scaling is very different, cost-wise, from requiring an i5/i7.
well if you are going to upgrade the mobo and cpu then it makes sense to get something that kicks butt now and will for quite a while. for 100-150 bucks more he will not need to upgrade anything but the gpu for the next 3-4 years. saving a little money now to go with an Athlon II that already gives up a little bit compared to running an i5/i7 could be more costly down the road as gpus nearly double their performance every gen. nothing wrong with going with an Athlon II but I think if you're the type of person that likes to not upgrade the mobo and cpu then the i5 platform is better. I almost said futureproof but I hate that word. lol
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I think the main question is "What is more representative of games: Resident Evil 5 and GTA IV (where more cores matter less than extra cache http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2 ), or games like Dragon Age (that scales pretty well with number of cores) and FarCry 2?"

Because even considering dual-cores are starting to become obsolete, slapping in an Athlon II X4 and getting great core scaling is very different, cost-wise, from requiring an i5/i7.

EDIT: Actually GTA IV scales well with number of cores and cache.

Not sure what you're saying here, but if we're talking new builds, then this generally holds true for gamers :

Get a midrange CPU, and overclock it if you are so inclined :)
Get the standard current amount of RAM for 'premium' but not 'extreme' systems.
Get the best GPU you can afford.

So for now, yeah a gamer could do great with say : Athlon II X4, 4GB Ram, and a 5850/5870.

GTA IV was programmed by a team of zombie monkeys, and indeed wants a monster CPU *and* GPU to approach smoothness.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
what? a 15-20 fps minimum framerate in some games would be no better with 3 monitors than it would be with 1. makes much more sense to have a fairly balanced platform that can deliver performance without dropping the ball on minimum framerates and overall playability.

Yeah but most games just don't bottom out that low with older CPUs.

In fact, I can't even find a game I like that needs a quad core for playability myself (but everyone is different)