"GeForce 8800 Needs the Fastest CPU"

brikis98

Diamond Member
Jul 5, 2005
7,253
8
0

cliffs: the nvidia 8800 series, possibly because of its dx10 streaming processor design, sees tangible benefits from faster CPUs, even at resolutions up to 2048x1536.

discuss. :)
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
Doom 3 numbers look fishy. Apparently the FX60 can barely break 125fps on Doom 3 with any nVidia card, but can get 131fps with the X1950XTX, while Intel gets its highest frame rates with nVidia cards, and the X1950 is nowhere close.
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Originally posted by: Lonyo
Doom 3 numbers look fishy. Apparently the FX60 can barely break 125fps on Doom 3 with any nVidia card, but can get 131fps with the X1950XTX, while Intel gets its highest frame rates with nVidia cards, and the X1950 is nowhere close.

I am not sure about that. The improvements ATI made in OpenGL applied much better in Quake 4 than they did in Doom 3; while Doom 3 saw gains, the Quake 4 gains were the most significant ones.

Also, if you noticed, they are NOT running AA in those benches, and you have to remember ATI excels when running AA and AF together, while GeForce 7 hardware takes a more significant drop when you turn on AA in conjunction with AF. The extra bandwidth of the X1950 XTX is basically going to waste since the workload is too light, and the Radeon architecture doesn't deliver as high a maximum frame rate in lighter workloads.

Once you go down to the 4xAA and 8xAF numbers, things look normal again, except for the 2560x1600 numbers for the X1950 XTX on both the Intel and AMD platforms, which are higher than the 2048x1536 numbers; those are a little strange.
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
I would have loved to have them run the Quad Core against a vanilla C2D under the same conditions, I bet the differences would be minimal to say the least.
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Originally posted by: ayabe
I would have loved to have them run the Quad Core against a vanilla C2D under the same conditions, I bet the differences would be minimal to say the least.

By the same conditions I assume you mean a Core 2 Quad vs. a Core 2 Duo at the same clock speed? Yes, I would agree; a difference shouldn't manifest itself, as the tested suite doesn't include any games that take advantage of quad-core processors. We know 3DMark 2006 will get a nice boost, though.
 

n7

Elite Member
Jan 4, 2004
21,303
4
81
I'm sorry, but i am taking this review with a huge grain of salt.

Results don't really match what i've seen at other sites too well, & this is Tom's Intel's sponsored FUD Guide we're talking about.
 

avi85

Senior member
Apr 24, 2006
988
0
0
This actually makes perfect sense. If you'll notice, at 2560x1600 there is almost no difference between the two procs. The reason for the difference at lower resolutions is that the 8800GTX is not the bottleneck there, just like when reviews compare procs with gaming benchies they lower the res so that the vid card is not the bottleneck. Here the vid card is less of a bottleneck than the proc until you reach the higher resolutions. (No news there; we all know that the 8800GTX is an awesomely powerful card which has yet to realize its full potential.)
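That bottleneck logic can be sketched with a toy model (my own sketch with made-up per-frame costs, not numbers from the article): frame time is whichever is slower, the CPU's roughly resolution-independent work or the GPU's per-pixel work.

```python
def fps(cpu_frame_ms, gpu_ms_per_pixel, width, height):
    """Toy bottleneck model: frame time is the slower of the CPU's
    fixed per-frame cost and the GPU's resolution-dependent cost."""
    gpu_frame_ms = gpu_ms_per_pixel * width * height
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# Hypothetical costs: a fast CPU (4 ms/frame) vs. a slower one (6 ms/frame),
# with a GPU cost of 2e-6 ms per pixel.
for w, h in [(1024, 768), (2560, 1600)]:
    print(f"{w}x{h}: fast CPU {fps(4.0, 2e-6, w, h):.0f} fps, "
          f"slow CPU {fps(6.0, 2e-6, w, h):.0f} fps")
```

At the low res the CPU cost dominates, so the faster CPU shows up in the numbers; at 2560x1600 the GPU cost swamps both CPUs and the gap disappears, which is exactly the pattern in the article.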
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Thanks for the feedback, OP. Now I know I'm not crazy when I keep saying that in my experience, moving from a 4400+ @ 2.8GHz to a 6600 @ 3.6 with an 8800GTX made a considerable difference, even @ 1600x1200.
And all the other reviews except Kyle's never tested a Core 2 Duo vs. an FX, just scaling the clock speed of Core 2 Duo CPUs, which is also BS in my experience... Moving from 2.4 to 3.6 made a difference too, just not as much as moving from the 4400+ to the 6600.
 

Conky

Lifer
May 9, 2001
10,709
0
0
Originally posted by: n7
I'm sorry, but i am taking this review with a huge grain of salt.

Results don't really match what i've seen at other sites too well, & this is Tom's Intel's sponsored FUD Guide we're talking about.
Interesting comment, especially since you have an Intel CPU in your sig.

The results look about right to me.

I have seen Tom's accused of favoring every brand of video card chipset and CPU since forever. Tom's does tend to favor the latest and greatest, which right now is the Intel X6800 and the Nvidia 8800GTX, as even a blind man can see. :p
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Originally posted by: n7
I'm sorry, but i am taking this review with a huge grain of salt.

Results don't really match what i've seen at other sites too well, & this is Tom's Intel's sponsored FUD Guide we're talking about.

I agree. That article was a pile of misleading garbage that you couldn't follow if you decoded secret documents for the CIA.
 

Captante

Lifer
Oct 20, 2003
30,282
10,788
136
I used to really like Tom's years ago, but they've gone steadily downhill, and their recent articles border on disorganized nonsense... too bad.
 

cmrmrc

Senior member
Jun 27, 2005
334
0
0
I know that the C2X is faster than the FX-60 in gaming... but I didn't think the difference would be 130fps vs. 230fps. Something is not right with those numbers; something around 180fps would be more believable.
 

quattro1

Member
Jan 13, 2005
111
0
0
Originally posted by: coldpower27
Once you go down to the 4xAA and 8xAF numbers, things look normal again, except for the 2560x1600 numbers for the X1950 XTX on both the Intel and AMD platforms, which are higher than the 2048x1536 numbers; those are a little strange.

That's because the X1950XTX falls back to 2xAA at 2560x1600 in OpenGL games. How ATI gets away with this "performance boost" is beyond me.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
Wow, that was fairly useless. I say this because they are doing an article on how the 8800GTX performs better with increasing processor power, and yet they only used two processors. That is dumb.

Why not do an article showing the 8800GTX going through multiple benchmarks with various processors? That would really hit home.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: Beachboy
Originally posted by: hemmy
tomshardware is the worst site in existence

look at these numbers: http://www.firingsquad.com/hardware/gef...800_gtx_gts_amd_cpu_scaling/page11.asp

almost no diff from fx-62 down to x2-3800 at high res
And no Intel X6800 or any other Core 2 Duo either which was sorta the point of the THG article. :roll:

AMD cpu's are no longer "the Fastest CPU's".

Guru3D did a similar test and pretty much showed very little to no difference between an X6800 (at 3.47) and an E6300 at stock. I'd say a mid-range A64 is equal to an E6300 at stock.
 

Conky

Lifer
May 9, 2001
10,709
0
0
Originally posted by: deadseasquirrel
Originally posted by: Beachboy
Originally posted by: hemmy
tomshardware is the worst site in existence

look at these numbers: http://www.firingsquad.com/hardware/gef...800_gtx_gts_amd_cpu_scaling/page11.asp

almost no diff from fx-62 down to x2-3800 at high res
And no Intel X6800 or any other Core 2 Duo either which was sorta the point of the THG article. :roll:

AMD cpu's are no longer "the Fastest CPU's".

Guru3D did a similar test and pretty much showed very little to no difference between an X6800 (at 3.47) and an E6300 at stock. I'd say a mid-range A64 is equal to an E6300 at stock.

Here's a direct quote from the Guru3D link you just posted.

So to the question is the GeForce 8800 series CPU bound ? Yes, it most certainly is.
Apparently, the author of that article disagrees with you. :p
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Originally posted by: n7
I'm sorry, but i am taking this review with a huge grain of salt.

Results don't really match what i've seen at other sites too well, & this is Tom's Intel's sponsored FUD Guide we're talking about.

As long as the results jibe well with other benches seen around the internet, I have no problem with them.

Tom's is an alright site, and one of the oldest around; not the absolute best or anything anymore, but semi-decent.
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Originally posted by: quattro1
Originally posted by: coldpower27
Once you go down to the 4xAA and 8xAF numbers, things look normal again, except for the 2560x1600 numbers for the X1950 XTX on both the Intel and AMD platforms, which are higher than the 2048x1536 numbers; those are a little strange.

Thats because X1950XTX falls back to 2xAA at 2560x1600 on OpenGL games. How ATI gets away with this "performance boost" is beyond me.

Oh, that's really interesting, thanks for telling me about that; now that's one mystery solved.
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Originally posted by: deadseasquirrel
Originally posted by: Beachboy
Originally posted by: hemmy
tomshardware is the worst site in existence

look at these numbers: http://www.firingsquad.com/hardware/gef...800_gtx_gts_amd_cpu_scaling/page11.asp

almost no diff from fx-62 down to x2-3800 at high res
And no Intel X6800 or any other Core 2 Duo either which was sorta the point of the THG article. :roll:

AMD cpu's are no longer "the Fastest CPU's".

Guru3D did a similar test and pretty much showed very little to no difference between an X6800 (at 3.47) and an E6300 at stock. I'd say a mid-range A64 is equal to an E6300 at stock.

The first game, Prey, is especially GPU-bound, and as the article says, it obviously doesn't matter much what CPU you use when you are GPU-bound. :p

The second game, Far Cry, shows some nice differences between the E6300 and the X6800 up to at least 16x12, and they correlate with the numbers over at Tom's Hardware.

It's too bad the article didn't use an E6400, as that would line up pretty well against the FX-60 used by Tom's.

At 16x12 with AA and AF applied, the FX-60 and X6800 figures are:

Doom 3
108.4 and 123.3 FPS

F.E.A.R.
79 and 83 FPS

It seems about right.
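Working out the relative gains from those figures (my own arithmetic, just restating the numbers above):

```python
# X6800 vs. FX-60 at 16x12 with AA/AF, using the FPS figures quoted above.
results = {"Doom 3": (108.4, 123.3), "F.E.A.R.": (79.0, 83.0)}
for game, (fx60, x6800) in results.items():
    gain = (x6800 - fx60) / fx60 * 100.0
    print(f"{game}: X6800 is {gain:.1f}% faster")
```

So roughly a 14% gap in Doom 3 and a 5% gap in F.E.A.R., i.e. a real but game-dependent CPU effect even with AA/AF on.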
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
And why not use the FX-62 when you are using the X6800 as the competition? I know the Core 2 Duo CPUs are faster than anything AMD has at the "lower" resolutions, but this article tries to paint a picture that is not the whole picture. They use the FX-60, and you can hardly compare the graphs at all because they're all over the place, while the difference in fps is minimal at the top res (with the FX-60 winning in a spot or two). The conclusion they come up with, and how they state it, makes me think they have some interest other than delivering an honest assessment of CPU importance in gaming.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: Beachboy
Originally posted by: deadseasquirrel
Originally posted by: Beachboy
Originally posted by: hemmy
tomshardware is the worst site in existence

look at these numbers: http://www.firingsquad.com/hardware/gef...800_gtx_gts_amd_cpu_scaling/page11.asp

almost no diff from fx-62 down to x2-3800 at high res
And no Intel X6800 or any other Core 2 Duo either which was sorta the point of the THG article. :roll:

AMD cpu's are no longer "the Fastest CPU's".

Guru3D did a similar test and pretty much showed very little to no difference between an X6800 (at 3.47) and an E6300 at stock. I'd say a mid-range A64 is equal to an E6300 at stock.

Here's a direct quote from the Guru3D link you just posted.

So to the question is the GeForce 8800 series CPU bound ? Yes, it most certainly is.
Apparently, the author of that article disagrees with you. :p

Actually, the author disagrees with his own findings. I, on the other hand, am just looking at the numbers. And, according to those numbers, at 1600x1200, I see no reason to believe a user needs the fastest CPU in order to push a G80. All games are CPU-bound at 1024x768. We really don't need an article to tell us this. If you're gaming at 1024x768 and are looking to buy a G80, my advice is to buy a new display first.