How bottlenecked will this be?


decadentia84

Junior Member
Nov 22, 2009
Personally I just can't wait to see the difference from

x2 5600+
8800GT

to

i5 750
ATI HD 5850

Dragon Age runs a bit sluggish right now, but with this I should be able to smooth it out completely (among other things).
 

GaiaHunter

Diamond Member
Jul 13, 2008
Well, if you are going to upgrade the mobo and CPU, then it makes sense to get something that kicks butt now and will for quite a while. For $100-150 more he will not need to upgrade anything but the GPU for the next 3-4 years. Saving a little money now to go with an Athlon II that already gives up a little bit compared to an i5/i7 could be more costly down the road, as GPUs nearly double their performance every generation. Nothing wrong with going with an Athlon II, but I think if you're the type of person that likes to not upgrade the mobo and CPU, then the i5 platform is better. I almost said futureproof but I hate that word. lol

Not sure what you're saying here, but if we're talking new builds, then this generally holds true for gamers:

Get a midrange CPU, and overclock it if you are so inclined :)
Get the standard current amount of RAM for 'premium' but not 'extreme' systems.
Get the best GPU you can afford.

So for now, yeah, a gamer could do great with, say: an Athlon II X4, 4GB RAM, and a 5850/5870.

GTA IV was programmed by a team of zombie monkeys, and indeed wants a monster CPU *and* GPU to approach smoothness.

I guess I'm trying to say something like that.

It is true that both the Athlon X2 and Core2Duo E6x00 are starting to show their age, and all the dual-cores are starting to feel the pressure of multi-threaded games, but a Q6600 is still holding up well.

Well, a Q6600 holds up except in a game like Resident Evil 5, while it does fine in a game like Dragon Age.

I'm really not sure Resident Evil 5 isn't just a badly optimized game, and that "runs great on i7" screen doesn't reassure me much :p.

I will be upgrading soon, and those Athlon II X4s are extremely seductive - throw in a $100/€90 CPU + $/€70-80 mobo, and assuming you're buying 4GB of RAM whether you build that system or an i5/Phenom II, you walk away with $/€100 or more to invest in a better GPU.

I don't know about the US, but over here an i5 750 costs 2x more, a Phenom II 955 €70 more, and a Q9550 is €220 (€60 more than the Phenom II) :eek: if you can find it.
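For what it's worth, the savings math works out like this (a rough Python sketch; the i5 board price is my own assumption, since only the budget board's $/€70-80 is quoted above):

```python
# Budget comparison from the post. CPU prices follow the quoted figures
# ("i5 750 costs 2x more" than a ~$100/€90 Athlon II X4); the i5/P55 board
# price of 130 is an assumed placeholder, not from the post.
athlon_build = {"cpu": 100, "mobo": 75}   # Athlon II X4 + budget board
i5_build = {"cpu": 200, "mobo": 130}      # i5 750 + P55 board (assumed price)

# RAM (4GB) is the same for either system, so it cancels out of the comparison.
savings = sum(i5_build.values()) - sum(athlon_build.values())
print(savings)  # cash freed up for a better GPU
```

With those numbers the Athlon route frees up $/€155, consistent with the "$/€100 or more" claim.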
 

GaiaHunter

Diamond Member
Jul 13, 2008
Personally I just can't wait to see the difference from

x2 5600+
8800GT

to

i5 750
ATI HD 5850

Dragon Age runs a bit sluggish right now, but with this I should be able to smooth it out completely (among other things).

What are you doing with your old machine?

If you are keeping it for a while, can you please see if the 5850 makes any difference on that Athlon over the 8800GT?

EDIT: BTW, I just installed that AMD Fusion utility - got me 1 extra FPS on that RE5 bench, and it also seems to have increased the minimum. CPU @ 3.0GHz, GPU @ 720/1000, same as before.
 

Arkaign

Lifer
Oct 27, 2006
Personally I just can't wait to see the difference from

x2 5600+
8800GT

to

i5 750
ATI HD 5850

Dragon Age runs a bit sluggish right now, but with this I should be able to smooth it out completely (among other things).

It's gonna be epically good, man :)
 

happy medium

Lifer
Jun 8, 2003
Personally I just can't wait to see the difference from

x2 5600+
8800GT

to

i5 750
ATI HD 5850

Dragon Age runs a bit sluggish right now, but with this I should be able to smooth it out completely (among other things).

First, do us a favor and run some games with the 5600 and 5850. :D
 

AzN

Banned
Nov 26, 2001
Yeah but most games just don't bottom out that low with older CPUs.

In fact, I can't even find a game I like that needs a quad core for playability myself (but everyone is different)

Indeed. I can enjoy all my games @ 2.2GHz or 3.2GHz. The processor doesn't matter much when I already get 60fps or more, except for a few bad console ports.

RE5 might scale with multiple cores and not with the GPU because the engine is geared to the crappy GPUs in consoles. I still get 45-60fps average in that game with 16xQ AA. In a 3rd-person shooter single-player campaign you don't even need that much.
 

toyota

Lifer
Apr 15, 2001
Indeed. I can enjoy all my games @ 2.2GHz or 3.2GHz. The processor doesn't matter much when I already get 60fps or more, except for a few bad console ports.

RE5 might scale with multiple cores and not with the GPU because the engine is geared to the crappy GPUs in consoles. I still get 45-60fps average in that game with 16xQ AA. In a 3rd-person shooter single-player campaign you don't even need that much.
Yes, MOST games don't. There are still plenty of games that will have low framerates, though, and a 5850 doesn't fix that. You will still dip into the low 20s and teens in several games with a 5600 X2, and even in games that are playable you will still have way lower minimums compared to having a good CPU. Again, it's about playability and getting the most out of your $300 video card. Honestly, if it's my money, I would rather have a 4890 and a competent CPU than a 5850 and a 5600 X2 any day of the week. Who cares how much AA you can use if your CPU is too sluggish to improve the lows with a high-end card? I notice slowdowns a lot more than I notice jaggies.

The OP realized that he was going to need to upgrade his CPU at some point soon anyway, so doing it before plunking down 300 bucks on a GPU was wise. Not only will he have a very powerful system now that can deliver in every way, he will also have a CPU that will be viable for many GPU upgrades. IMO he did the right thing, as using a 5850 with a CPU like a 5600 X2 would not have justified that $300 5850 price tag.
 

AzN

Banned
Nov 26, 2001
Yes, MOST games don't. There are still plenty of games that will have low framerates, though, and a 5850 doesn't fix that. You will still dip into the low 20s and teens in several games with a 5600 X2, and even in games that are playable you will still have way lower minimums compared to having a good CPU. Again, it's about playability and getting the most out of your $300 video card. Honestly, if it's my money, I would rather have a 4890 and a competent CPU than a 5850 and a 5600 X2 any day of the week. Who cares how much AA you can use if your CPU is too sluggish to improve the lows with a high-end card? I notice slowdowns a lot more than I notice jaggies. Using a 5850 with a CPU like a 5600 X2 is a rather foolish waste of 300 bucks, IMO.

The OP realized that he was going to need to upgrade his CPU at some point soon anyway, so doing it before plunking down 300 bucks on a GPU was wise. Not only will he have a very powerful system now that can deliver in every way, he will also have a CPU that will be viable for many GPU upgrades. IMO he did the right thing.

Plenty, like what? Two shitty console ports geared to the 3 cores of the Xbox 360 and 7 cores of the PS3? GTA4 and Red Faction, precisely. And your E8400 has a hard time with GTA4 as well. A 5600 X2 is plenty for Batman, and so is Resident Evil 5. You only got shitty frame rates in Batman because you enabled PhysX, which a 5850 or 4890 doesn't do. Resident Evil runs like butter on any old system.

A $300 5850 is a wiser investment than a $500 i7 combo for games. A 4890 is not what the OP has; it's an 8800GT that isn't competent @ 1920x1200. He was going to upgrade his CPU regardless. Why not enjoy the graphical goodness now? Price cut? Ha. Prices went up within the last few months.

In the end, I would rather get a $100 CPU and the best graphics card I could afford when all you want to do is play games. My $150 CPU has lasted me 3 years and is still going strong, for crying out loud, while I've upgraded through 3 different GPUs in that same time frame.
 

toyota

Lifer
Apr 15, 2001
Plenty, like what? Two shitty console ports geared to the 3 cores of the Xbox 360 and 7 cores of the PS3? GTA4 and Red Faction, precisely. And your E8400 has a hard time with GTA4 as well. A 5600 X2 is plenty for Batman, and so is Resident Evil 5. You only got shitty frame rates in Batman because you enabled PhysX, which a 5850 or 4890 doesn't do. Resident Evil runs like butter on any old system.

A $300 5850 is a wiser investment than a $500 i7 combo for games. A 4890 is not what the OP has; it's an 8800GT that isn't competent @ 1920x1200.

In the end, I would rather get a $100 CPU and the best graphics card I could afford when all you want to do is play games. My $150 CPU has lasted me 3 years and is still going strong, for crying out loud, while I've upgraded through 3 different GPUs in that time frame.
It's more than two games that would be a little too sluggish or borderline, as far as I am concerned. Also, PhysX is supposedly GPU-only in Batman anyway, so I don't see the problem in using it. Yeah, I do agree that RE5 is still playable on an older system, but that goes for the video card too. Even in games that are perfectly playable, it's still way too much GPU power down the drain using a 5850 with a 5600 X2, IMO.

I know he has an 8800GT, and I didn't say it wouldn't be an upgrade. I was basically saying that if it were my money, I wouldn't pair a $300 card with that CPU. BTW, your CPU, while overclocked, is still extremely strong, and his 5600 X2 pales in comparison. You know that, or you wouldn't waste your time overclocking it. Just like an i5 or i7 will last a real long time with its very good performance per clock, its excellent scaling with high-end GPUs, and its ability to OC really well.
 

cbn

Lifer
Mar 27, 2009
In the end, I would rather get a $100 CPU and the best graphics card I could afford when all you want to do is play games. My $150 CPU has lasted me 3 years and is still going strong, for crying out loud, while I've upgraded through 3 different GPUs in that same time frame.

Not only that, but if someone is buying an HD 5850, using Eyefinity sounds a lot higher-yield than swapping out mobos/CPUs/OS just to get a few more frames per second.

It really is hard to find a game that genuinely needs the very best and latest desktop CPU.
 

AzN

Banned
Nov 26, 2001
It's more than two games that would be a little too sluggish or borderline, as far as I am concerned. Also, PhysX is supposedly GPU-only in Batman anyway, so I don't see the problem in using it. Yeah, I do agree that RE5 is still playable on an older system, but that goes for the video card too. Even in games that are perfectly playable, it's still way too much GPU power down the drain using a 5850 with a 5600 X2, IMO.

I know he has an 8800GT, and I didn't say it wouldn't be an upgrade. I was basically saying that if it were my money, I wouldn't pair a $300 card with that CPU. BTW, your CPU, while overclocked, is still extremely strong, and his 5600 X2 pales in comparison. You know that, or you wouldn't waste your time overclocking it. Just like an i5 or i7 will last a real long time with its very good performance per clock, its excellent scaling with high-end GPUs, and its ability to OC really well.

Maybe 5 games total, with hundreds of PC titles out there. Yet you can't even name those other games. More shitty console ports, I suppose.

I ran Batman without PhysX at 1080p 4xAA. I was getting 80fps average with my CPU @ 2.205GHz and 90fps average @ 3.045GHz. Batman not playable on an X2 5600, you say? Whatever you say...

The OP was going to get a mobo and CPU regardless in a few months when he ordered the 5850. It would have been a far bigger improvement over his 8800GT than an i7 with his current video card any day - roughly twice as fast as his 8800GT at the same GPU settings of 1920x1200. It would also have cost him less. Look, most of us are not saying extra performance can't be had with a CPU upgrade, but in most of the games out there the gains are minuscule at best, particularly at the OP's desired resolutions and settings.
 

AzN

Banned
Nov 26, 2001
Batman highest settings, PhysX high and 2x AA, Fraps run

cpu 3.16 gpu 666/1392/2200
Min, Max, Avg
23, 70, 43.122

cpu 1.80 gpu 666/1392/2200
Min, Max, Avg
16, 59, 35.975

Having PhysX on high along with 2x AA and the other settings should make this very GPU-limited. As you can see, though, having the CPU at a speed equivalent to his 5600 X2 really brought the minimum framerate down to very sluggish levels.

Batman highest settings @ 1920x1080 4xAA physx Normal
315x7=2205mhz

1st Run
Min = 17
Max = 84
avg = 46

2nd run
Min = 24
Max = 61
avg = 45

3rd run
Min = 26
Max = 64
Avg = 45

The 1st run was an anomaly where the min and max seem way off, but the avg frame rate stayed around the same.

435x7=3045mhz
1st run
Min = 27
Max = 64
avg = 47

2nd run
Min = 25
Max = 66
avg = 47

3rd run
Min = 27
Max = 63
avg = 47

It seems like your benchmark is off.
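A quick sanity check on the three runs at each clock (numbers copied straight from this post) shows how small the gap between the two CPU speeds really is:

```python
# (min, max, avg) per Fraps run at each CPU clock, from the results above.
runs_2205mhz = [(17, 84, 46), (24, 61, 45), (26, 64, 45)]
runs_3045mhz = [(27, 64, 47), (25, 66, 47), (27, 63, 47)]

def mean_of_avgs(runs):
    """Mean of the per-run average framerates."""
    return sum(avg for _, _, avg in runs) / len(runs)

gap = mean_of_avgs(runs_3045mhz) - mean_of_avgs(runs_2205mhz)
print(round(mean_of_avgs(runs_2205mhz), 1), round(gap, 1))  # roughly a 2fps gap
```

About 45.3fps average at 2.2GHz versus 47fps at 3.0GHz, i.e. under 2fps from a ~38% clock bump in this GPU-limited scene.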
 

toyota

Lifer
Apr 15, 2001
Not only that, but if someone is buying an HD 5850, using Eyefinity sounds a lot higher-yield than swapping out mobos/CPUs/OS just to get a few more frames per second.

It really is hard to find a game that genuinely needs the very best and latest desktop CPU.
It's not about getting a few more fps. It's about getting better overall playability and the most for your $300 video card. I have said it plenty of times now, but I would rather have a much higher minimum framerate than run a ton of AA. I will take a more consistent 40-50 fps over a game that dips down into the low 20s and then back into the 60s.
 

toyota

Lifer
Apr 15, 2001
Batman highest settings @ 1920x1080 4xAA physx Normal
315x7=2205mhz

1st Run
Min = 17
Max = 84
avg = 46

2nd run
Min = 24
Max = 61
avg = 45

3rd run
Min = 26
Max = 64
Avg = 45

The 1st run was an anomaly where the min and max seem way off, but the avg frame rate stayed around the same.

435x7=3045mhz
1st run
Min = 27
Max = 64
avg = 47

2nd run
Min = 25
Max = 66
avg = 47

3rd run
Min = 27
Max = 63
avg = 47

It seems like your benchmark is off.
I already responded to this earlier but here it is again.

Batman very high settings normal physx 1920x1080 4x AA

cpu 3.16 gpu 620/1296/2160
Min, Max, Avg
30, 41, 35.120

cpu 1.8 gpu 620/1296/2160
Min, Max, Avg
23, 31, 26.254

I used Fraps because the Batman benchmark occasionally screws up for me and will show insane maximum framerates, well over 100fps on some runs. BTW, Fraps doesn't really hit the hard drive and uses about 1% of the CPU while benchmarking. Recording with Fraps does have an impact, but benchmarking absolutely does NOT. In RE5 I ran Fraps benchmarking in the background and it had zero effect; I got the same benchmark score from the game whether Fraps was on or not. Again, only recording, not benchmarking, with Fraps affects game performance.

I did 5 Fraps runs at each CPU speed with only one enemy on the screen for consistency. I threw out the lowest and the highest runs and averaged each (min, max, avg) score individually. Actual gameplay with several enemies was pretty sluggish at times with the CPU at 1.8, but it was too hard to get consistent results on each run. Even when just playing the game with my CPU at 1.8, I can most certainly feel it being more sluggish while fighting enemies. I didn't take the time to run it without PhysX, but I guess it's likely that could help out on the low end as well as the max framerate.
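The run-averaging described above can be sketched like this (the sample numbers are made up for illustration; only the throw-out-the-extremes method comes from the post):

```python
# Five Fraps runs per CPU speed; drop the single lowest and highest value of
# each metric, then average the remaining middle samples.
def trimmed_mean(samples):
    """Drop the lowest and highest sample, average the rest."""
    middle = sorted(samples)[1:-1]
    return sum(middle) / len(middle)

# Hypothetical (min, max, avg) results for five runs at one CPU speed:
runs = [(28, 60, 44), (30, 41, 35), (31, 43, 37), (29, 40, 34), (33, 45, 39)]
mins, maxs, avgs = zip(*runs)  # score each metric individually, as in the post
print(trimmed_mean(mins), trimmed_mean(maxs), trimmed_mean(avgs))  # 30.0 43.0 37.0
```

Trimming the extremes this way keeps one outlier run (like the 60fps max above) from skewing the reported numbers.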
 

toyota

Lifer
Apr 15, 2001
Maybe 5 games total, with hundreds of PC titles out there. Yet you can't even name those other games. More shitty console ports, I suppose.

I ran Batman without PhysX at 1080p 4xAA. I was getting 80fps average with my CPU @ 2.205GHz and 90fps average @ 3.045GHz. Batman not playable on an X2 5600, you say? Whatever you say...

The OP was going to get a mobo and CPU regardless in a few months when he ordered the 5850. It would have been a far bigger improvement over his 8800GT than an i7 with his current video card any day - roughly twice as fast as his 8800GT at the same GPU settings of 1920x1200. It would also have cost him less. Look, most of us are not saying extra performance can't be had with a CPU upgrade, but in most of the games out there the gains are minuscule at best, particularly at the OP's desired resolutions and settings.
Yes, it may be fewer than 10 games total that are sluggish or borderline playable, but if you want to play those, then it does matter. Red Faction: Guerrilla was quite painful during intense action with my CPU at 1.8. Call it a crappy console port if you want, but it was an okay game that played WAY better with my CPU at its stock speed. Although still playable, there would be games that would never get minimums out of the 20s and 30s with a 5600 X2, no matter what GPU is used. Also, the OP mentioned Dragon Age being sluggish, which sounds right since it would only be in the teens and 20s with his current CPU. I am assuming he will enjoy the massive jump there with his better CPU.

As far as Batman goes, I have never tested with PhysX off and my CPU at 1.8, so it may not be an issue like you are saying. I wanted PhysX on, and with my CPU at 1.8 I do not consider it acceptable at times, especially since I know how much smoother it plays at the stock 3.16. Anyway, we have both made our points, so there really isn't much else to be said.
 

AzN

Banned
Nov 26, 2001
Yes, it may be fewer than 10 games total that are sluggish or borderline playable, but if you want to play those, then it does matter. Red Faction: Guerrilla was quite painful during intense action with my CPU at 1.8. Call it a crappy console port if you want, but it was an okay game that played WAY better with my CPU at its stock speed. Although still playable, there would be games that would never get minimums out of the 20s and 30s with a 5600 X2, no matter what GPU is used. Also, the OP mentioned Dragon Age being sluggish, which sounds right since it would only be in the teens and 20s with his current CPU. I am assuming he will enjoy the massive jump there with his better CPU.

As far as Batman goes, I have never tested with PhysX off and my CPU at 1.8, so it may not be an issue like you are saying. I wanted PhysX on, and with my CPU at 1.8 I do not consider it acceptable at times, especially since I know how much smoother it plays at the stock 3.16. Anyway, we have both made our points, so there really isn't much else to be said.

And if you want to play HUNDREDS of other PC games @ 1920x1200 with some form of AA, you need a better GPU than an 8800GT. You get more from the GPU than the CPU ever will. That I can still play any game with my 3-year-old CPU without any chugs, except for 2 shitty console ports, is proof enough. A minimum of 20-30fps is acceptable in some games; it really depends on the game. Most people can live with 30-40fps average in an RPG, 40-45fps in 3rd-person games like Batman, and 50-60fps in FPS games like Call of Duty.

It's devious of you to point to a PhysX benchmark of Batman, where you cripple the performance into a more GPU-limited situation, and then point at the CPU, when I benchmarked 3 times only to find a 2fps difference in that GPU-limited situation. When I disabled PhysX, the benchmark was flying at 80fps average with my CPU @ 2.21GHz. As for Resident Evil 5, the fixed benchmark isn't a real gaming situation; it's more like a worst-case scenario showing the lowest fps you might get with a particular CPU, while the variable benchmark actually tests real gameplay.
 

AzN

Banned
Nov 26, 2001
I already responded to this earlier but here it is again.

Batman very high settings normal physx 1920x1080 4x AA

cpu 3.16 gpu 620/1296/2160
Min, Max, Avg
30, 41, 35.120

cpu 1.8 gpu 620/1296/2160
Min, Max, Avg
23, 31, 26.254

I used Fraps because the Batman benchmark occasionally screws up for me and will show insane maximum framerates, well over 100fps on some runs. BTW, Fraps doesn't really hit the hard drive and uses about 1% of the CPU while benchmarking. Recording with Fraps does have an impact, but benchmarking absolutely does NOT. In RE5 I ran Fraps benchmarking in the background and it had zero effect; I got the same benchmark score from the game whether Fraps was on or not. Again, only recording, not benchmarking, with Fraps affects game performance.

I did 5 Fraps runs at each CPU speed with only one enemy on the screen for consistency. I threw out the lowest and the highest runs and averaged each (min, max, avg) score individually. Actual gameplay with several enemies was pretty sluggish at times with the CPU at 1.8, but it was too hard to get consistent results on each run. Even when just playing the game with my CPU at 1.8, I can most certainly feel it being more sluggish while fighting enemies. I didn't take the time to run it without PhysX, but I guess it's likely that could help out on the low end as well as the max framerate.

That's why I benched multiple times. There is definitely some overhead with Fraps, especially when you are recording frame rates. Anyway, my benches don't show that big of a difference. Your benches are off because you are actually testing 2 different scenarios, while I benched with the same fixed bench to test the same scenario.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I guess I'm trying to say something like that.
Well, a Q6600 holds up except in a game like Resident Evil 5, while it does fine in a game like Dragon Age.

I'm really not sure Resident Evil 5 isn't just a badly optimized game, and that "runs great on i7" screen doesn't reassure me much :p.

The Q6600 hangs with the E8400 in RE5. Why do we need 100fps in RE5 again? The fixed benchmark is not actual gameplay. The variable benchmark is what counts in this game, as it shows real gaming situations, in which I'm getting 50-60fps average @ 1920x1080 16xQ with my CPU @ 3.045GHz.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
The Q6600 hangs with the E8400 in RE5. Why do we need 100fps in RE5 again? The fixed benchmark is not actual gameplay. The variable benchmark is what counts in this game, as it shows real gaming situations, in which I'm getting 50-60fps average @ 1920x1080 16xQ with my CPU @ 3.045GHz.

You are probably right - I have no experience with the game. I've only run the benchmarks - and the variable one was indeed showing somewhat higher frame rates, if I recall correctly.

I'm not sure my 4850 512MB could hit those numbers, though, at that resolution/AA level.
 

MrK6

Diamond Member
Aug 9, 2004
I think the main question is "What is more representative of games: Resident Evil 5 and GTA IV (where more cores matter less than extra cache http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2 ), or games like Dragon Age (which scales pretty well with the number of cores) and FarCry 2?"

Because even considering that dual-cores are starting to become obsolete, slapping in an Athlon II X4 and getting great core scaling is very different, cost-wise, from requiring an i5/i7.

EDIT: Actually, GTA IV scales well with both number of cores and cache.
I bolded that last part because many people overlook it when bashing GTA IV (it's not as poorly coded as people make it out to be). I've always made the argument that it's better to get a cheaper quad core over a dual core if you're building a gaming rig, simply because multicore optimization is getting more prevalent.
 

jvroig

Platinum Member
Nov 4, 2009
I've always made the argument that it's better to get a cheaper quad core over a dual core if you're building a gaming rig, simply because multicore optimization is getting more prevalent.
Agreed. It's analogous perhaps to investing in the present vs. investing in the future, and carries the same risks associated with both.

If you care more about the present, or simply cannot tolerate risk, then your choice will most likely be a blazing dual-core.

If you care more about the future, and can tolerate the risk that it might not happen as fast or might not pan out as you expect (i.e., multi-core optimizations to happen across the board within your tolerance of time), then you can be happy with a quad-core, either cheaper or more expensive than the blazing dual-core in question.
 

evolucion8

Platinum Member
Jun 17, 2005
I think the main question is "What is more representative of games: Resident Evil 5 and GTA IV (where more cores matter less than extra cache http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2 ), or games like Dragon Age (which scales pretty well with the number of cores) and FarCry 2?"

Because even considering that dual-cores are starting to become obsolete, slapping in an Athlon II X4 and getting great core scaling is very different, cost-wise, from requiring an i5/i7.

EDIT: Actually, GTA IV scales well with both number of cores and cache.

Cache matters most with the Core 2 architecture; with the K8 architecture it makes little difference because K8 has an IMC and is bottlenecked elsewhere. Even today, much of the Phenom II's performance doesn't depend on its cache; it is still compute-bound.

I don't know about the US, but over here an i5 750 costs 2x more, a Phenom II 955 €70 more, and a Q9550 is €220 (€60 more than the Phenom II) :eek: if you can find it.

Actually, in benchmarks the Phenom II goes toe to toe with the Q9550 and sometimes can even rival the Q9770, so getting a Phenom II now will save you money and the performance will be close to the i5 750 in games (the i7 was never far ahead of the Core 2 Quad in gaming anyway, except in multi-GPU configurations).

PS: The Q6600 is one of the best and most affordable chips ever made... it is still doing great even today at stock speed, with overclocking headroom for tomorrow if more performance is needed.
 

toyota

Lifer
Apr 15, 2001
That's why I benched multiple times. There is definitely some overhead with Fraps, especially when you are recording frame rates. Anyway, my benches don't show that big of a difference. Your benches are off because you are actually testing 2 different scenarios, while I benched with the same fixed bench to test the same scenario.

Fraps basically isn't affecting anything while benchmarking. It's what almost everybody uses to check real in-game results because it is very accurate and consistent; only when recording video does it have an impact on performance. And actually, I am testing a somewhat scripted part towards the beginning of the game for part of the bench; the other part involves just walking around the same path with little going on.

I spent hours testing the demo with different CPU and GPU speeds before I ever got the full game, so I am very familiar with the performance of Batman. My CPU at 1.8 is most certainly sluggish during large fights, and that is reflected in the Fraps benchmarks. I can just turn on Fraps and clearly see the framerate difference between the CPU speeds even without benching with it. The only thing I haven't done is test Batman with PhysX off and my CPU at 1.8.

I hate to break it to you, but the built-in benchmark for Batman is NOT accurate, and that's why I don't use it. I am a member over on the Eidos site, so I know this for a fact; there are people with killer machines that will still get low 20s for minimum framerate. If I personally were getting consistent results, then I would use the benchmark, but I am not. Hell, on some occasions it will show me a max of 120fps and on the very next run be at 80fps with no changes. When I get home today I will go dig up a thread from when the game came out that proves this too.

EDIT: I had time to look, and here is one of the threads on the Eidos forums. As you can see, the built-in benchmark results are all over the place and not reflective of actual gameplay for some. Sometimes people with GTX285 cards and dedicated PhysX cards are getting in the 20s for a minimum. In fact, I think only 2 people using Nvidia got in the 40s for a minimum while everyone else was in the 20s, regardless of CPU and GPU level. Of course the ATI users aren't running GPU PhysX, so I just ignore their results. http://forums.eidosgames.com/showthread.php?t=95824

even the mod on there said: "Remember the minimum is more then likely a dip in the performance recorded during a transistion between scenes. So is a bit of a false minimum since it could be a single frame with that rate but as the system records ALL frames it reports this as the minimum."

"Yeh i know what you mean the minimum is the key figure during gameplay, but this benchmark isnt really representative of that its just a flythrough a level and shows some stuff off."


Anyway, the best way to test Batman accurately and consistently is using Fraps during the actual game. Even if you somehow are getting fairly consistent results with the built-in benchmark, they are NOT accurate and certainly not reflective of in-game results.

EDIT 2: I finally went and tested it with PhysX off, and you were right that it is quite a bit faster, even in the minimums. My CPU at 1.8 still got its butt kicked by my CPU at stock, though. I was basically pegged at the game's framerate cap the whole time with my CPU at 3.16, while it would easily dip into the mid 30s with the CPU at 1.8. It was completely playable with PhysX off, but the CPU is still pretty significant in this game, especially if someone wanted to use vsync. That sounds about right too, because it uses Unreal Engine 3. While using my 5000 X2 in UT3, I got minimums in the low 30s regardless of whether I used an 8600GT, 4670, or 9600GT. Of course the averages were much higher and the max went way up with the faster cards with the framerate cap off, but the minimums were always within 2-4 fps of each other in UT3. That still proves my point that in many games the 5850 isn't going to help the minimum framerate at all when using a 5600 X2.

1920x1080 highest settings 4x AA physx off
cpu 3.16 gpu 620/1296/2160
Min, Max, Avg
59, 63, 61.541

cpu 1.8 gpu 620/1296/2160
Min, Max, Avg
35, 63, 50.241
 

AzN

Banned
Nov 26, 2001
Fraps basically isn't affecting anything while benchmarking. It's what almost everybody uses to check real in-game results because it is very accurate and consistent; only when recording video does it have an impact on performance. And actually, I am testing a somewhat scripted part towards the beginning of the game for part of the bench; the other part involves just walking around the same path with little going on.

I spent hours testing the demo with different CPU and GPU speeds before I ever got the full game, so I am very familiar with the performance of Batman. My CPU at 1.8 is most certainly sluggish during large fights, and that is reflected in the Fraps benchmarks. I can just turn on Fraps and clearly see the framerate difference between the CPU speeds even without benching with it. The only thing I haven't done is test Batman with PhysX off and my CPU at 1.8.

I hate to break it to you, but the built-in benchmark for Batman is NOT accurate, and that's why I don't use it. I am a member over on the Eidos site, so I know this for a fact; there are people with killer machines that will still get low 20s for minimum framerate. If I personally were getting consistent results, then I would use the benchmark, but I am not. Hell, on some occasions it will show me a max of 120fps and on the very next run be at 80fps with no changes. When I get home today I will go dig up a thread from when the game came out that proves this too.

EDIT: I had time to look, and here is one of the threads on the Eidos forums. As you can see, the built-in benchmark results are all over the place and not reflective of actual gameplay for some. Sometimes people with GTX285 cards and dedicated PhysX cards are getting in the 20s for a minimum. In fact, I think only 2 people using Nvidia got in the 40s for a minimum while everyone else was in the 20s, regardless of CPU and GPU level. Of course the ATI users aren't running GPU PhysX, so I just ignore their results. http://forums.eidosgames.com/showthread.php?t=95824

even the mod on there said: "Remember the minimum is more then likely a dip in the performance recorded during a transistion between scenes. So is a bit of a false minimum since it could be a single frame with that rate but as the system records ALL frames it reports this as the minimum."

"Yeh i know what you mean the minimum is the key figure during gameplay, but this benchmark isnt really representative of that its just a flythrough a level and shows some stuff off."


Anyway, the best way to test Batman accurately and consistently is using Fraps during the actual game. Even if you somehow are getting fairly consistent results with the built-in benchmark, they are NOT accurate and certainly not reflective of in-game results.

In your case, you had your CPU @ 1.8GHz turn out faster results one day and slower on another. Seems to me that's inaccurate; you don't get the same consistent results because you are testing different scenarios on different occasions.

Meanwhile, I got consistent results and eliminated the anomaly. You spent hours on a demo and a benchmark? Perhaps you should play the game and stop concentrating on benchmark numbers, as I have no problems playing the game with my CPU @ 2.2GHz or 3.045GHz. Then again, I have a GTX260 216SP @ faster clocks, pulling faster frame rates than you with your E8500 and GTX260 192SP @ lower clocks. :D

If you don't use the built-in benchmark, that's fine, but you used the built-in benchmark in RE5 only to test the CPU portion, where it's not testing actual gameplay. And you have NO evidence of the Batman benchmark being flawed. Then again, reviewers like legionhardware used the exact same benchmark in their reviews. All benches have some margin of error, BUT I've eliminated those anomalies to give you consistent results on mine. :)
 

AzN

Banned
Nov 26, 2001
EDIT 2: I finally went and tested it with PhysX off, and you were right that it is quite a bit faster, even in the minimums. My CPU at 1.8 still got its butt kicked by my CPU at stock, though. I was basically pegged at the game's framerate cap the whole time with my CPU at 3.16, while it would easily dip into the mid 30s with the CPU at 1.8. It was completely playable with PhysX off, but the CPU is still pretty significant in this game, especially if someone wanted to use vsync. That sounds about right too, because it uses Unreal Engine 3. While using my 5000 X2 in UT3, I got minimums in the low 30s regardless of whether I used an 8600GT, 4670, or 9600GT. Of course the averages were much higher and the max went way up with the faster cards with the framerate cap off, but the minimums were always within 2-3 fps of each other in UT3. That still proves my point that in many games the 5850 isn't going to help the minimum framerate at all when using a 5600 X2.

1920x1080 highest settings 4x AA physx off
cpu 3.16 gpu 620/1296/2160
Min, Max, Avg
59, 63, 61.541

cpu 1.8 gpu 620/1296/2160
Min, Max, Avg
35, 63, 50.241

Again, your bench seems to have errors.

It was only 24% slower on minimum frame rates yesterday; now it's 41% slower today. Meanwhile, it was 26% slower in average frame rates yesterday but only 19% slower today. It's not consistent. Whatever it is, I could play just fine with my CPU @ 2.2GHz and my faster GTX260.
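Those percent-slower figures can be rechecked against the minimums and averages from the two benches (this uses one plausible definition of "percent slower"; the post's own rounding comes out a point or so differently, but the inconsistency between days is the same either way):

```python
# Compare toyota's 1.8GHz results against his 3.16GHz results, PhysX-on
# bench first, PhysX-off bench second. Numbers are copied from the posts.
def pct_slower(slow, fast):
    """How much slower the 1.8GHz result is, relative to the 3.16GHz one."""
    return round(100 * (fast - slow) / fast)

print(pct_slower(23, 30), pct_slower(26.254, 35.120))   # min / avg, PhysX on
print(pct_slower(35, 59), pct_slower(50.241, 61.541))   # min / avg, PhysX off
```

The minimum-framerate gap nearly doubles between the two benches while the average-framerate gap shrinks, which is the inconsistency being pointed out.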