Video Card Decision (**Update**)


toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: josh6079
@ toyota - Those are interesting results, but considering that ETQW is the worst my rig is going to see (rather than Crysis or FC2) would overclocking my CPU still be as important?

I just hate to go through the process of overclocking only to have some applications' failures be due to their "sensitivity" to those overclocks. But, if it could double my minimum frame rate...that might be worth looking into. Problem is I haven't done anything of the sort in over two years and when I did it it was on an AMD.

well if you go with a 4850 and play games like Quake 4 and ETQW then overclocking the cpu will not really be needed.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Your results in Crysis show less than a 10% performance difference between 2.13ghz and 3.16ghz. Your minimum frames in Crysis look more like an error than anything else.

Far Cry 2 is one of those games that is quad-optimized and benefits from faster CPU performance. Might as well benchmark GTA 4 while you are at it.

ran benchmarks 4 times each. that is actually about right for Crysis because my 5000 X2 got 12 fps for the minimum with those same settings.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
Originally posted by: Azn
Your results in Crysis show less than a 10% performance difference between 2.13ghz and 3.16ghz. Your minimum frames in Crysis look more like an error than anything else.

Far Cry 2 is one of those games that is quad-optimized and benefits from faster CPU performance. Might as well benchmark GTA 4 while you are at it.

ran benchmarks 4 times each. that is actually about right for Crysis because my 5000 X2 got 12 fps for the minimum with those same settings.

That's funny. I just did some benchmarks of my own, and that's not even remotely the kind of numbers I got.

Your minimum frames in Crysis don't make much sense. At 3.15ghz you got 34.69fps but at 2.13ghz you got 16.56fps. Now that's more than a 50% drop in frame rates from 33% less clock speed. Obvious error or hard drive thrashing.

My results. I used fraps, at 1440x900 dx9 high to put less stress on the GPU compared to your 1680x1050 high on a GTX 260, since my card is only an 8800gts.

My CPU @ 3.22ghz
Average frame rates 46.39fps
minimum 34fps max 57fps

@ 2.22ghz
average frame rates 39.96fps
minimum 28fps max 51fps.
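The scaling in these fraps numbers is easy to sanity-check with a few lines of Python. This is just arithmetic on the figures copied from the post above, not a verdict on either side's benchmarks:

```python
# Figures copied from the post: fps at 3.22GHz vs 2.22GHz on an 8800gts.
clock_hi, clock_lo = 3.22, 2.22
runs = {"average": (46.39, 39.96), "minimum": (34.0, 28.0)}

clock_gain = clock_hi / clock_lo - 1  # roughly +45% clock
for name, (hi, lo) in runs.items():
    fps_gain = hi / lo - 1
    print(f"{name}: +{fps_gain:.0%} fps for +{clock_gain:.0%} clock")
```

Both fps gains (about +16% average, +21% minimum) come in well under the +45% clock gain, i.e. this run is only partly CPU-bound.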
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
Originally posted by: Azn
Your results in Crysis show less than a 10% performance difference between 2.13ghz and 3.16ghz. Your minimum frames in Crysis look more like an error than anything else.

Far Cry 2 is one of those games that is quad-optimized and benefits from faster CPU performance. Might as well benchmark GTA 4 while you are at it.

ran benchmarks 4 times each. that is actually about right for Crysis because my 5000 X2 got 12 fps for the minimum with those same settings.

That's funny. I just did some benchmarks of my own, and that's not even remotely the kind of numbers I got.

Your minimum frames in Crysis don't make much sense. At 3.15ghz you got 34.69fps but at 2.13ghz you got 16.56fps. Now that's more than a 50% drop in frame rates from 33% less clock speed. Obvious error.

My results. I used fraps.

My CPU @ 3.22ghz
Average frame rates 46.39fps
minimum 34fps max 57fps

@ 2.22ghz
average frame rates 39.96fps
minimum 28fps max 51fps.

well look here at their minimums http://techreport.com/articles.x/16382/4

there's a Phenom 9950 and an 8750 getting 17 and 19 fps. even the Q8200 is at 20fps minimum, so my numbers are right on and it's yours that don't make sense compared to what techreport and I got.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
Originally posted by: Azn
Originally posted by: toyota
Originally posted by: Azn
Your results in Crysis show less than a 10% performance difference between 2.13ghz and 3.16ghz. Your minimum frames in Crysis look more like an error than anything else.

Far Cry 2 is one of those games that is quad-optimized and benefits from faster CPU performance. Might as well benchmark GTA 4 while you are at it.

ran benchmarks 4 times each. that is actually about right for Crysis because my 5000 X2 got 12 fps for the minimum with those same settings.

That's funny. I just did some benchmarks of my own, and that's not even remotely the kind of numbers I got.

Your minimum frames in Crysis don't make much sense. At 3.15ghz you got 34.69fps but at 2.13ghz you got 16.56fps. Now that's more than a 50% drop in frame rates from 33% less clock speed. Obvious error.

My results. I used fraps.

My CPU @ 3.22ghz
Average frame rates 46.39fps
minimum 34fps max 57fps

@ 2.22ghz
average frame rates 39.96fps
minimum 28fps max 51fps.

well look here at their minimums http://techreport.com/articles.x/16382/4

there's a Phenom 9950 and an 8750 getting 17 and 19 fps. even the Q8200 is at 20fps minimum, so my numbers are right on and it's yours that don't make sense compared to what techreport and I got.

WTF are you smoking? Give me some!

The techreport link tests were done at 1024x768 medium settings. I don't know why you are trying to compare your 1680x1050 high settings and my 8800gts 1440x900 high settings with those.

Why are you comparing AMD vs Intel platform?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
Originally posted by: Azn
Originally posted by: toyota
Originally posted by: Azn
Your results in Crysis show less than a 10% performance difference between 2.13ghz and 3.16ghz. Your minimum frames in Crysis look more like an error than anything else.

Far Cry 2 is one of those games that is quad-optimized and benefits from faster CPU performance. Might as well benchmark GTA 4 while you are at it.

ran benchmarks 4 times each. that is actually about right for Crysis because my 5000 X2 got 12 fps for the minimum with those same settings.

That's funny. I just did some benchmarks of my own, and that's not even remotely the kind of numbers I got.

Your minimum frames in Crysis don't make much sense. At 3.15ghz you got 34.69fps but at 2.13ghz you got 16.56fps. Now that's more than a 50% drop in frame rates from 33% less clock speed. Obvious error.

My results. I used fraps.

My CPU @ 3.22ghz
Average frame rates 46.39fps
minimum 34fps max 57fps

@ 2.22ghz
average frame rates 39.96fps
minimum 28fps max 51fps.

well look here at their minimums http://techreport.com/articles.x/16382/4

there's a Phenom 9950 and an 8750 getting 17 and 19 fps. even the Q8200 is at 20fps minimum, so my numbers are right on and it's yours that don't make sense compared to what techreport and I got.

WTF are you smoking? Give me some!

The techreport link tests were done at 1024x768 medium settings. I don't know why you are trying to compare your 1680x1050 high settings and my 8800gts 1440x900 high settings with those.

Why are you comparing AMD vs Intel platform?

I was showing you that the minimums can easily get that low even with faster cpus than mine at 2.13. so do you think the minimums would be higher at 1680 and on high settings? if you do, then it's you that's smoking something. 16 seems right in line considering I got 12 fps minimum at the same settings with the 5000 X2, so I don't know what to tell you.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
I was showing you that the minimums can easily get that low. so do you think the minimums would be higher at 1680? if you do, then it's you that's smoking something. 16 seems right in line considering I got 12 fps minimum at the same settings with the 5000 X2, so I don't know what to tell you.

That's quite silly because that was my point from the very start. Nobody plays Crysis on medium settings @ 1024x768 on a 4870.

I don't give a shit how your X2 performed on an 8600gt. Go ahead, do the benchmark again with your E8500 downclocked and measure the minimum fps again. Make sure you don't get hard drive thrashing while you are benchmarking; that could skew the minimum frame rates like your previous results.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
I was showing you that the minimums can easily get that low. so do you think the minimums would be higher at 1680? if you do, then it's you that's smoking something. 16 seems right in line considering I got 12 fps minimum at the same settings with the 5000 X2, so I don't know what to tell you.

That's quite silly because that was my point from the very start. Nobody plays Crysis on medium settings @ 1024x768 on a 4870.

I don't give a shit how your X2 performed on an 8600gt. Go ahead, do the benchmark again with your E8500 downclocked and measure the minimum fps again. Make sure you don't get hard drive thrashing while you are benchmarking; that could skew the minimum frame rates like your previous results.

the POINT was that techreport got about the same minimums even with a faster cpu and at lower settings, so why is it so hard to understand that 16 fps is about right on high settings at 1680? also the 12 fps was with the gtx260 in my 5000 X2 system for one day while I looked at minimum framerates in some games to compare things to my setup.

I ran the benchmarks 4 times but I will go back and do it again.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
Originally posted by: Azn
Originally posted by: toyota
I was showing you that the minimums can easily get that low. so do you think the minimums would be higher at 1680? if you do, then it's you that's smoking something. 16 seems right in line considering I got 12 fps minimum at the same settings with the 5000 X2, so I don't know what to tell you.

That's quite silly because that was my point from the very start. Nobody plays Crysis on medium settings @ 1024x768 on a 4870.

I don't give a shit how your X2 performed on an 8600gt. Go ahead, do the benchmark again with your E8500 downclocked and measure the minimum fps again. Make sure you don't get hard drive thrashing while you are benchmarking; that could skew the minimum frame rates like your previous results.

the POINT was that techreport got about the same minimums even with a faster cpu and at lower settings, so why is it so hard to understand that 16 fps is about right on high settings at 1680? also the 12 fps was with the gtx260 in my 5000 X2 system for one day while I looked at minimum framerates in some games to compare things to my setup.

I ran the benchmarks 4 times but I will go back and do it again.

Do you even read what you type? You just said techreport got about the same minimums even with a faster CPU. So what does that mean for your previous benchmarks, where your E8500 clocked at 2.13ghz got less than half the frame rates it got @ 3.16ghz? Obvious error.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: josh6079
@ toyota - Those are interesting results, but considering that ETQW is the worst my rig is going to see (rather than Crysis or FC2) would overclocking my CPU still be as important?

I just hate to go through the process of overclocking only to have some applications' failures be due to their "sensitivity" to those overclocks. But, if it could double my minimum frame rate...that might be worth looking into. Problem is I haven't done anything of the sort in over two years and when I did it it was on an AMD.

Those benches had errors. Overclocking Intel CPUs is relatively safe; you just have to torture test for stability.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
Originally posted by: Azn
Originally posted by: toyota
I was showing you that the minimums can easily get that low. so do you think the minimums would be higher at 1680? if you do, then it's you that's smoking something. 16 seems right in line considering I got 12 fps minimum at the same settings with the 5000 X2, so I don't know what to tell you.

That's quite silly because that was my point from the very start. Nobody plays Crysis on medium settings @ 1024x768 on a 4870.

I don't give a shit how your X2 performed on an 8600gt. Go ahead, do the benchmark again with your E8500 downclocked and measure the minimum fps again. Make sure you don't get hard drive thrashing while you are benchmarking; that could skew the minimum frame rates like your previous results.

the POINT was that techreport got about the same minimums even with a faster cpu and at lower settings, so why is it so hard to understand that 16 fps is about right on high settings at 1680? also the 12 fps was with the gtx260 in my 5000 X2 system for one day while I looked at minimum framerates in some games to compare things to my setup.

I ran the benchmarks 4 times but I will go back and do it again.

Do you even read what you type? You just said techreport got about the same minimums even with a faster CPU. So what does that mean for your previous benchmarks, where your E8500 clocked at 2.13ghz got less than half the frame rates it got @ 3.16ghz? Obvious error.
are you drunk? yes I said they got about the same minimums. they got 20fps with a faster clocked Q8200 at 2.33 on medium and at 1024x768, and I got 16 fps with my cpu at 2.13 on high settings at 1680. 16 fps sounds perfectly right after looking at what they got with a faster cpu at lower settings.

I changed my cpu multiplier to 6 to get 2.00 and guess what? I got 12.83 fps minimum, which is what I got with the 5000 X2, so there you go.

1680x1050 all high DX9

!TimeDemo Run 0 Finished.
Play Time: 51.45s, Average FPS: 38.87
Min FPS: 12.83 at frame 148, Max FPS: 55.67 at frame 900
Average Tri/Sec: -37819416, Tri/Frame: -972861
Recorded/Played Tris ratio: -0.94
!TimeDemo Run 1 Finished.
Play Time: 43.79s, Average FPS: 45.67
Min FPS: 12.83 at frame 148, Max FPS: 60.53 at frame 71
Average Tri/Sec: -43892008, Tri/Frame: -961026
Recorded/Played Tris ratio: -0.95
TimeDemo Play Ended, (2 Runs Performed)

 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
are you drunk? yes I said they got about the same minimums. they got 20fps with a faster clocked Q8200 at 2.33 on medium and at 1024x768, and I got 16 fps with my cpu at 2.13 on high settings at 1680. 16 fps sounds perfectly right after looking at what they got with a faster cpu at lower settings.

I changed my cpu multiplier to 6 to get 2.00 and guess what? I got 12.83 fps minimum, which is what I got with the 5000 X2, so there you go.

1680x1050 all high DX9

!TimeDemo Run 0 Finished.
Play Time: 51.45s, Average FPS: 38.87
Min FPS: 12.83 at frame 148, Max FPS: 55.67 at frame 900
Average Tri/Sec: -37819416, Tri/Frame: -972861
Recorded/Played Tris ratio: -0.94
!TimeDemo Run 1 Finished.
Play Time: 43.79s, Average FPS: 45.67
Min FPS: 12.83 at frame 148, Max FPS: 60.53 at frame 71
Average Tri/Sec: -43892008, Tri/Frame: -961026
Recorded/Played Tris ratio: -0.95
TimeDemo Play Ended, (2 Runs Performed)

You obviously don't know how to benchmark properly and rely on one of those automatic benchmarking utilities to get results. No wonder you always come to the forums asking how your 8600gt should be able to play at high settings if you had a faster CPU. LOL.

You keep wanting to compare a downclocked E8500 to an X2 5000. I don't know what for. You had a different card back then, so comparing results would be a moot point.

So you ran it at a 6x multiplier, which is 2.0ghz instead of 2.13ghz. Okay, fine. Your minimum frames are exactly 12.83fps in both of your runs? To the exact hundredth of a frame? That's quite an accomplishment, or you are a terrible liar.

Your benchmarks do not make sense.

@ 3.15ghz your minimum frame rate is 34fps, while 30% less clock @ 2.13ghz gives you 16fps. That's a 50% drop in minimum frame rates. Now at 2ghz it gives you 12.83fps. You probably got that 12.83fps when you benchmarked the 8600gt with your X2 5000. Pathetic. You don't learn anything this way when all you want to do is prove yourself by making up numbers. :disgust:
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
are you drunk? yes I said they got about the same minimums. they got 20fps with a faster clocked Q8200 at 2.33 on medium and at 1024x768, and I got 16 fps with my cpu at 2.13 on high settings at 1680. 16 fps sounds perfectly right after looking at what they got with a faster cpu at lower settings.

I changed my cpu multiplier to 6 to get 2.00 and guess what? I got 12.83 fps minimum, which is what I got with the 5000 X2, so there you go.

1680x1050 all high DX9

!TimeDemo Run 0 Finished.
Play Time: 51.45s, Average FPS: 38.87
Min FPS: 12.83 at frame 148, Max FPS: 55.67 at frame 900
Average Tri/Sec: -37819416, Tri/Frame: -972861
Recorded/Played Tris ratio: -0.94
!TimeDemo Run 1 Finished.
Play Time: 43.79s, Average FPS: 45.67
Min FPS: 12.83 at frame 148, Max FPS: 60.53 at frame 71
Average Tri/Sec: -43892008, Tri/Frame: -961026
Recorded/Played Tris ratio: -0.95
TimeDemo Play Ended, (2 Runs Performed)

You obviously don't know how to benchmark properly and rely on one of those automatic benchmarking utilities to get results. No wonder you always come to the forums asking how your 8600gt should be able to play at high settings if you had a faster CPU. LOL.

You keep wanting to compare a downclocked E8500 to an X2 5000. I don't know what for. You had a different card back then, so comparing results would be a moot point.

So you ran it at a 6x multiplier, which is 2.0ghz instead of 2.13ghz. Okay, fine. Your minimum frames are exactly 12.83fps in both of your runs? To the exact hundredth of a frame? That's quite an accomplishment, or you are a terrible liar.

Your benchmarks do not make sense.

@ 3.15ghz your minimum frame rate is 34fps, while 30% less clock @ 2.13ghz gives you 16fps. That's a 50% drop in minimum frame rates. Now at 2ghz it gives you 12.83fps. You probably got that 12.83fps when you benchmarked the 8600gt with your X2 5000. Pathetic. You don't learn anything this way when all you want to do is prove yourself by making up numbers. :disgust:

thats what I got so deal with it. you know that isn't what I said about the 8600gt but you still continue to spread that BS. you are a pathetic and petty individual that likes to start trouble and call people liars. I have no reason to lie about any of this stuff. at first I was surprised to see such low numbers, and then I remembered that the 5000 X2 did not crack 13 fps either. oh and I already told you that the numbers I had on the 5000 X2 were also with the gtx260. I then went to techreport because they always list minimums in their reviews. perhaps you will send me one of those insulting pms like you did last time you didn't get the response you wanted. http://img212.imageshack.us/my...?image=11581240sy6.jpg
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
thats what I got so deal with it. I already told you that the numbers I had on the 5000 X2 were also with the gtx260. you are a pathetic and petty individual that likes to start trouble and call people liars. I have no reason to lie about any of this stuff. at first I was surprised to see such low numbers, and then I remembered that the 5000 X2 did not crack 13 fps either. I then went to techreport because they always list minimums in their reviews. perhaps you will send me one of those insulting pms like you did last time you didn't get the response you wanted.

If it doesn't add up, it doesn't take a genius to see the numbers are off. 30% less clock does not equate to 50% lower minimum frame rates. You must understand simple math: if there was absolutely no stress on the GPU, the best it could do is 30% better minimum frame rates.

Getting the exact same minimum frame rates on both of your runs? Now that puts the icing on the cake.

Pathetic and petty is lying about your benchmarks just to prove yourself on the internet.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I copy and pasted the results so I don't know what to tell you. If I was a liar like you think I am, then I would have calculated it out and just made up the numbers. I guess techreport lied too, getting 20fps with a 2.33 Q8200 even on medium settings.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
I copy and pasted the results so I don't know what to tell you. If I was a liar like you think I am, then I would have calculated it out and just made up the numbers. I guess techreport lied too, getting 20fps with a 2.33 Q8200.

The exact same minimum frame rate of 12.83fps on both of your runs? I have never had that happen in my lifetime. the first run would be all screwy anyway because of hard drive thrashing; the second run should be a lot smoother. if anything your first run should have much worse minimum frame rates than your second run. But on the contrary, you had the exact same results on both of your runs.

You might not be lying but you obviously don't know how to benchmark properly.

30% higher clock rates does not equate to 100% better minimum frame rates. Does that even make sense to you?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
I copy and pasted the results so i dont know what to tell you. If a was a liar like you think I am then i would have calculated it out and just made up the numbers. I guess techreport lied to getting 20fps with a 2.33 Q8200.

The exact same minimum frame rate of 12.83fps on both of your runs? I have never had that happen in my lifetime. the first run would be all screwy anyway because of hard drive thrashing; the second run should be a lot smoother. if anything your first run should have much worse minimum frame rates than your second run. But on the contrary, you had the exact same results on both of your runs.

You might not be lying but you obviously don't know how to benchmark properly.

30% higher clock rates does not equate to 100% better minimum frame rates. Does that even make sense to you?
does techreport getting 20 fps for a 2.33 core 2 quad at medium settings and me getting 16 fps for a 2.13 core 2 duo at high settings not make sense to you? based on their numbers alone my results look logical. also going back to the 5000 X2 where i got 12 fps seems logical too.



here are the runs for everything, copy and pasted. if you would like screenshots of the Crysis benchmark tool with the results listed, let me know.

3.16

!TimeDemo Run 0 Finished.
Play Time: 45.30s, Average FPS: 44.15
Min FPS: 34.69 at frame 138, Max FPS: 58.41 at frame 1003
Average Tri/Sec: -42950616, Tri/Frame: -972784
Recorded/Played Tris ratio: -0.94
!TimeDemo Run 1 Finished.
Play Time: 41.65s, Average FPS: 48.01
Min FPS: 34.69 at frame 138, Max FPS: 61.00 at frame 70
Average Tri/Sec: -46108356, Tri/Frame: -960319
Recorded/Played Tris ratio: -0.95
TimeDemo Play Ended, (2 Runs Performed)

Settings: Demo(Ranch Small), 1680x1050 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(None), VSync(No), Overall Quality(Very High), Vegetation(Very High), Shading(Very High), Terrain(Very High), Geometry(Very High), Post FX(High), Texture(Very High), Shadow(Very High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)

Loop 1
Total Frames: 2882, Total Time: 51.00s
Average Framerate: 56.51
Max. Framerate: 90.05 (Frame:469, 7.13s)
Min. Framerate: 39.96 (Frame:1944, 34.66s)


2.13

!TimeDemo Run 0 Finished.
Play Time: 51.05s, Average FPS: 39.18
Min FPS: 16.56 at frame 153, Max FPS: 55.37 at frame 980
Average Tri/Sec: -38111196, Tri/Frame: -972822
Recorded/Played Tris ratio: -0.94
!TimeDemo Run 1 Finished.
Play Time: 44.62s, Average FPS: 44.82
Min FPS: 16.56 at frame 153, Max FPS: 60.23 at frame 75
Average Tri/Sec: -43070804, Tri/Frame: -961008
Recorded/Played Tris ratio: -0.95
TimeDemo Play Ended, (2 Runs Performed)


Settings: Demo(Ranch Small), 1680x1050 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(None), VSync(No), Overall Quality(Very High), Vegetation(Very High), Shading(Very High), Terrain(Very High), Geometry(Very High), Post FX(High), Texture(Very High), Shadow(Very High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)
Loop 1
Total Frames: 2085, Total Time: 51.01s
Average Framerate: 40.88
Max. Framerate: 66.81 (Frame:384, 7.39s)
Min. Framerate: 23.77 (Frame:1552, 39.26s)

2.00

!TimeDemo Run 0 Finished.
Play Time: 51.45s, Average FPS: 38.87
Min FPS: 12.83 at frame 148, Max FPS: 55.67 at frame 900
Average Tri/Sec: -37819416, Tri/Frame: -972861
Recorded/Played Tris ratio: -0.94
!TimeDemo Run 1 Finished.
Play Time: 43.79s, Average FPS: 45.67
Min FPS: 12.83 at frame 148, Max FPS: 60.53 at frame 71
Average Tri/Sec: -43892008, Tri/Frame: -961026
Recorded/Played Tris ratio: -0.95
TimeDemo Play Ended, (2 Runs Performed)
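The "Min FPS" values being argued over can be pulled out of a pasted timedemo log mechanically instead of by eye. A minimal sketch (the log excerpt is copied from the 2.00ghz runs above; the observation that the tool may carry one worst-frame value across runs is an assumption about the benchmark tool's behavior, not something the log states):

```python
import re

# Extract every "Min FPS" value from a pasted Crysis timedemo log.
log = """\
!TimeDemo Run 0 Finished.
Play Time: 51.45s, Average FPS: 38.87
Min FPS: 12.83 at frame 148, Max FPS: 55.67 at frame 900
!TimeDemo Run 1 Finished.
Play Time: 43.79s, Average FPS: 45.67
Min FPS: 12.83 at frame 148, Max FPS: 60.53 at frame 71
"""

mins = [float(v) for v in re.findall(r"Min FPS: ([\d.]+)", log)]
print(mins)  # [12.83, 12.83]
```

Both runs report 12.83 at the same frame (148), which would be consistent with the tool reporting the single worst frame seen so far rather than a fresh per-run minimum; that reading would explain the identical values without anyone lying.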

 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
does techreport getting 20 fps at 2.33 on a quad at medium settings and me getting 16 fps at 2.13 on a dual at high settings not make sense to you? based on their numbers alone my results look logical. also going back to the 5000 X2 where i got 12 fps seems logical too.

You want to somehow compare your minimum frames with techreport's?

Techreport was using 1024x768 medium settings, which stresses the CPU a lot more than the 1680x1050 high settings you tested.

The whole fucking testing method was different too, and a different setup. Techreport used the fraps method while you ran a timedemo. not even the same benchmark either. ROFL....

 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
does techreport getting 20 fps at 2.33 on a quad at medium settings and me getting 16 fps at 2.13 on a dual at high settings not make sense to you? based on their numbers alone my results look logical. also going back to the 5000 X2 where i got 12 fps seems logical too.

You want to somehow compare your minimum frames with techreport's?

Techreport was using 1024x768 medium settings, which stresses the CPU a lot more than the 1680x1050 high settings you tested.

The whole fucking testing method was different too, and a different setup. Techreport used the fraps method while you ran a timedemo. not even the same benchmark either. ROFL....
well genius, if you get 20fps at 1024x768 then it's only going to stay the same or go down with high settings and a higher res. considering that was a faster clocked quad core at medium settings getting 20 fps, then of course a slower clocked dual core cpu on high settings could easily get just 16 fps.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
Originally posted by: Azn
Originally posted by: toyota
does techreport getting 20 fps at 2.33 on a quad at medium settings and me getting 16 fps at 2.13 on a dual at high settings not make sense to you? based on their numbers alone my results look logical. also going back to the 5000 X2 where i got 12 fps seems logical too.

You want to somehow compare your minimum frames with techreport's?

Techreport was using 1024x768 medium settings, which stresses the CPU a lot more than the 1680x1050 high settings you tested.

The whole fucking testing method was different too, and a different setup. Techreport used the fraps method while you ran a timedemo. not even the same benchmark either. ROFL....
well genius, if you get 20fps at 1024x768 then it's only going to stay the same or go down with high settings and a higher res. considering that was a faster clocked quad at medium settings getting 20 fps, then of course a slower clocked cpu on high settings could get 16 fps.

You must be freaking Einstein to compare different testing methods and settings, even a different CPU altogether. if what you said remotely made sense I wouldn't even be having this conversation with you. But it doesn't, at least not to guys with logic and basic math skills.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
Originally posted by: Azn
Originally posted by: toyota
does techreport getting 20 fps at 2.33 quad for medium settings and me getting 16 fps at 2.13 dual for high settings not make sense to you? based on their number alone my results look logical. also going back to the 5000 X2 where i got 12 fps seems logical too.

You want to some how compare your minimum frames with techreport?

Techreport was using 1024x768 medium settings where it stresses the CPU a lot more than Crysis high 1680x1050 which you tested.

The whole fucking testing method was different too and different setup. Techreport used fraps method while you tested timedemo. not even the same benchmark either. ROFL....
well genius, if you get 20fps at 1024x768 then it's only going to stay the same or go down with high settings and a higher res. considering that was a faster clocked quad at medium settings getting 20 fps, then of course a slower clocked cpu on high settings could get 16 fps.

You must be freaking Einstein to compare different testing methods and settings, even a different CPU altogether. if what you said remotely made sense I wouldn't even be having this conversation with you. But it doesn't, at least not to guys with logic and basic math skills.

well let's try common sense. that quad core is basically the same architecture that my cpu uses, so if they can get a minimum framerate of 20 fps on medium settings with a faster clocked quad core cpu, then why in the hell can't you comprehend getting just 16 fps on a slower clocked dual core cpu on high settings?
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Let's do simple math using your own results, instead of using your common sense to compare different settings, resolution, benchmark, testing method, and CPU against your CPU's minimum frame rate results.

3.15ghz
34.69 fps (baseline)

2.13ghz: 32.4% clock reduction
16.56 fps: 52.3% reduction in minimum frame rates

2.00ghz: 36.5% clock reduction
12.83fps: 63.1% reduction in frame rates.

Your results would be like a GTX 280 getting 2x the frame rate because it was overclocked by 30%, which isn't possible. Your benchmarks are flawed. It's obvious to anyone with elementary math skills.
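The percentages in this post can be re-derived from the posted fps figures in a few lines of Python. Pure arithmetic on numbers copied from the thread; the last figure comes out 63.0% rather than 63.1%, a rounding difference:

```python
# Baseline: toyota's timedemo minimum at stock clock.
base_clock, base_min = 3.15, 34.69

for clock, fps in [(2.13, 16.56), (2.00, 12.83)]:
    clock_cut = 1 - clock / base_clock   # fractional clock reduction
    fps_cut = 1 - fps / base_min         # fractional min-fps reduction
    print(f"{clock:.2f}GHz: -{clock_cut:.1%} clock, -{fps_cut:.1%} min fps")
```

The point being argued is visible in the output: the fps reduction (52.3%, 63.0%) is far larger than the clock reduction (32.4%, 36.5%), which a purely CPU-bound model would not predict.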
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Let's do simple math using your own results, instead of using your common sense to compare different settings, resolution, benchmark, testing method, and CPU against your CPU's minimum frame rate results.

3.15ghz
34.69 fps (baseline)

2.13ghz: 32.4% clock reduction
16.56 fps: 52.3% reduction in minimum frame rates

2.00ghz: 36.5% clock reduction
12.83fps: 63.1% reduction in frame rates.

Your results would be like a GTX 280 getting 2x the frame rate because it was overclocked by 30%, which isn't possible. Your benchmarks are flawed. It's obvious to anyone with elementary math skills.
I am NOT disagreeing with you that the numbers look off compared to the clockspeed. all I am saying is that if techreport got those numbers with a better cpu, then my numbers aren't far fetched. I gave you the numbers at 3 different cpu speeds from the Crysis benchmark, which I have no control over.

EDIT: I just ran some fraps benchmarks at the same settings as in the Crysis benchmark and the lowest I got was 20 fps with a 31 fps average with the cpu still at 2.00, and 27 minimum and 42 average at stock 3.16, trying to be as consistent as possible. this included a small battle with 6 or 7 Koreans right on the beach. that still proves that the cpu is VERY important even at 1680 on high settings.
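For what it's worth, the fraps figures in this EDIT scale much more plausibly with clock than the disputed timedemo minimums did. A quick check, using only the numbers posted above:

```python
# fraps figures from the EDIT: fps at 2.00GHz vs stock 3.16GHz.
clock_gain = 3.16 / 2.00 - 1  # +58% clock
for name, lo, hi in [("minimum", 20, 27), ("average", 31, 42)]:
    print(f"{name}: +{hi / lo - 1:.0%} fps for +{clock_gain:.0%} clock")
```

Both gains come out around +35% fps for +58% clock, i.e. sub-proportional scaling, which is what one would expect when the GPU also carries part of the load.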