8800GTX bottlenecking

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I noticed that there's no difference in FPS between these two CPUs when running at high resolution with the 8800GTX. So is it worth upgrading to a Core 2 Duo if you have an AMD X2 at 2.6 or 2.8GHz? This could save a lot of people money. Another question: would two 8800GTXs still be the bottleneck with an AMD X2 or a Core 2 Duo?

some benchmarks

http://www.anandtech.com/video/showdoc.aspx?i=2870&p=22

http://www.tomshardware.com/2006/11/08/geforce_8800/page8.html

for F.E.A.R.

Edit: So the real question is: does the AMD X2 push the 8800GTX as far as a Core 2 does?
If so, why upgrade?
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: happy medium
I noticed that there's no difference in FPS between these two CPUs when running at high resolution with the 8800GTX. So is it worth upgrading to a Core 2 Duo if you have an AMD X2 at 2.6 or 2.8GHz? This could save a lot of people money. Another question: would two 8800GTXs still be the bottleneck with an AMD X2 or a Core 2 Duo?

some benchmarks

http://www.anandtech.com/video/showdoc.aspx?i=2870&p=22

http://www.tomshardware.com/2006/11/08/geforce_8800/page8.html

for F.E.A.R.

Edit: So the real question is: does the AMD X2 push the 8800GTX as far as a Core 2 does?
If so, why upgrade?

At pretty much anything above 1280x1024, the GPU starts to become the limiting factor.

The Guru3D review (http://www.guru3d.com/article/Videocards/391/23) pretty much sums up CPU scaling. Performance is the same whether you have an E6400 (mislabeled in the charts) or an X6800. Now, those are all Conroe chips.

You can infer that there wouldn't be any difference with an FX-60 either.

And with SLI it's the same thing: you become GPU-limited at higher resolutions (which would be the only reason to get an 8800GTX/SLI anyway).
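
A rough way to picture it: the frame rate is set by whichever of the CPU or GPU takes longer per frame, and only the GPU's share grows with resolution. Here's a minimal sketch of that idea; every per-frame millisecond figure in it is an invented illustration, not a measurement:

```python
# Toy bottleneck model: frame time = max(CPU time per frame, GPU time per frame).
# All millisecond figures below are invented for illustration only.

RESOLUTIONS = {            # name -> pixel count
    "1024x768":  1024 * 768,
    "1280x1024": 1280 * 1024,
    "1600x1200": 1600 * 1200,
    "1920x1200": 1920 * 1200,
}

CPU_MS = {"A64 X2": 6.0, "Core 2": 4.5}   # hypothetical CPU cost per frame (ms)
GPU_MS_PER_MPIXEL = 5.0                   # hypothetical 8800GTX cost per megapixel (ms)

for res, pixels in RESOLUTIONS.items():
    gpu_ms = GPU_MS_PER_MPIXEL * pixels / 1e6
    for cpu, cpu_ms in CPU_MS.items():
        frame_ms = max(cpu_ms, gpu_ms)        # the slower component sets the pace
        limiter = "CPU" if cpu_ms >= gpu_ms else "GPU"
        print(f"{res:>9}  {cpu:>6}: {1000 / frame_ms:5.0f} fps ({limiter}-limited)")
```

Once the GPU term is larger than either CPU term (at 12x10 and above in this toy model), both CPUs print the same FPS, which is exactly the flat scaling the Guru3D charts show.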
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: happy medium
I noticed that there's no difference in FPS between these two CPUs when running at high resolution with the 8800GTX. So is it worth upgrading to a Core 2 Duo if you have an AMD X2 at 2.6 or 2.8GHz? This could save a lot of people money. Another question: would two 8800GTXs still be the bottleneck with an AMD X2 or a Core 2 Duo?

some benchmarks

http://www.anandtech.com/video/showdoc.aspx?i=2870&p=22

http://www.tomshardware.com/2006/11/08/geforce_8800/page8.html

for F.E.A.R.

Edit: So the real question is: does the AMD X2 push the 8800GTX as far as a Core 2 does?
If so, why upgrade?


Why upgrade? For the tasks that don't require a super-fast GPU, i.e. encoding movies and music, productivity, Photoshop, etc. For gaming, though, the main factor is the GPU, and as long as you have a decently fast CPU, the GPU will always be the bottleneck :)
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
So for gaming right now you really don't need a Core 2 Duo? The AMD X2 is just as good?
So someone with an X2 who's satisfied with their encoding, Photoshop, etc. performance won't really see a benefit? How about people like me: can I save my RAM, buy a cheap AMD X2 board and CPU, and game with the big boys on an 8800GTX? RAM is kinda expensive!
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Old news.

A Celeron D is just as good as an X6800 at anything at or above 12x10 with AA/AF.

Save your cash for video, where it really counts.

During processor tests on games at those crappy 10x7 resolutions, you should look at the FPS: most ANY processor can do above 60, which is good enough for any game out, and that's what it will do at 19x12 too, assuming your graphics card can keep up. Most don't.
 

thegimp03

Diamond Member
Jul 5, 2004
7,420
2
81
Games are just starting to support multiple processors on a wider scale, so as long as you have some kind of dual-core processor you should be absolutely fine going forward.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Zebo
Old news.

A Celeron D is just as good as an X6800 at anything at or above 12x10 with AA/AF.

Save your cash for video, where it really counts.

During processor tests on games at those crappy 10x7 resolutions, you should look at the FPS: most ANY processor can do above 60, which is good enough for any game out, and that's what it will do at 19x12 too, assuming your graphics card can keep up. Most don't.


Well, that is only half true. Anandtech's CrossFire benches back when they reviewed the Core Duo showed that without filters, even at 1600x1200, you see significant differences from A64 dual cores.
But if you use filters, then even at 1280x1024 the GPU becomes the limiting factor.
Now with G80 things are becoming much clearer, since G80 is a step toward what we call GPGPU, meaning tasks that were previously done by the CPU are now done inside the GPU. Which explains why G80 doesn't depend on the CPU to stretch its wings.

Nevertheless, I agree with the majority here. If gaming, especially with G80, is your primary objective, then stay with your A64 X2. You should be more than fine.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
...significant differences from A64 dual cores...

You may indeed see differences: 189 FPS vs. 140 FPS. Both are beyond playable. What matters, and what you can't show, is a benchmark where a better processor makes a game playable versus previously unplayable. I can show you that with the very next video card up in a series.
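
Just to put those two numbers in frame-time terms (simple arithmetic on the figures above, against an assumed 60 FPS playability bar):

```python
# Convert the FPS figures above into per-frame times and compare against
# an assumed 60 FPS (16.7 ms) playability threshold.
for fps in (189, 140, 60):
    print(f"{fps:3d} fps = {1000 / fps:4.1f} ms per frame")
# 189 fps = 5.3 ms and 140 fps = 7.1 ms -- both sit far under the 16.7 ms
# budget, so the 49 fps gap never shows up as a playability difference.
```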
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Zebo
...significant differences from A64 dual cores...

You may indeed see differences: 189 FPS vs. 140 FPS. Both are beyond playable. What matters, and what you can't show, is a benchmark where a better processor makes a game playable versus previously unplayable. I can show you that with the very next video card up in a series.

I know, and I don't doubt that ;)
I was just mentioning it.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
Isn't it known by now?

Any CPU currently available on the market will bottleneck an 8800 GTX.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: Zenoth
Isn't it known by now?

Any CPU currently available on the market will bottleneck an 8800 GTX.

Are you being sarcastic?
Because the 8800GTX is the bottleneck, as long as the resolution is high enough.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
No, it's not sarcasm; it's known by now that even the best of the Core 2 Duo series won't unlock any 8800 GTX's "full potential".

Just compare 8800 GTX scores under AMD FX and Intel dual-core CPUs, from the lesser models up to the extreme ones. They keep going up each time you step up in CPU architecture or speed; no 8800 GTX "performance wall" has been reached so far, and there is still power left to be unlocked by CPUs more powerful than anything we can buy today.

I've seen a lot of benchmarks with CrossFire 1900/1950s, for example, where the scores wouldn't go much higher even with a better CPU, sometimes gaining only 50 or 60 points in the CPU section of the benchmark. But in the 8800 GTX's case, at least at lower resolutions like 1024x768, 1280x960 and similar, the scores in the graphics parts of the benchmark still go up with each CPU change. That means the performance wall for the GTX has yet to be reached.
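
One crude way to read those benchmark tables is to check whether the score keeps climbing when you swap in a faster CPU at a fixed resolution: if every step still adds a meaningful gain, the CPU is the limiter there and the GTX's wall hasn't been hit. A small sketch with invented scores, just to show the test itself:

```python
# Hypothetical benchmark scores at one fixed low resolution, slowest CPU first.
# The scores and the 3% "meaningful gain" cut-off are assumptions for illustration.
scores = [("A64 X2 3800+", 9200), ("A64 X2 4600+", 10100),
          ("Core 2 E6600", 11400), ("Core 2 X6800", 12300)]

for (prev_cpu, prev), (cpu, score) in zip(scores, scores[1:]):
    gain = 100.0 * (score - prev) / prev
    verdict = "still CPU-limited" if gain > 3 else "flattening out (GPU wall)"
    print(f"{prev_cpu} -> {cpu}: +{gain:.1f}% ({verdict})")
```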
 

UltimaBoB

Member
Jul 20, 2006
154
0
0
Does anyone lay out $2000+ for 8800GTX SLI and an X6800 so that they can go from 200 FPS to 300+ FPS gaming at 1024x768?

If I understand correctly, yeah, if you drop the resolution low enough you can make the CPU the bottleneck for an 8800GTX, maybe even for some DX9 cards too. But I don't think that is the point people are making here.
 

cmrmrc

Senior member
Jun 27, 2005
334
0
0
Originally posted by: Zebo
OP, check this out:
http://xbitlabs.com/articles/cpu/display/cpu-games2_4.html


I bet you didn't know a Celeron 326 = FX-57.

Here is where your processor (or just about any CPU) is compared to the X6800

http://enthusiast.hardocp.com/article.html?art=MTEwOCwxLCxoZW50aHVzaWFzdA==

I'm not saying that you're wrong; yes, it's true that at 16x12 with 4xAA/16xAF the 326 = X6800 in certain games, but in games that are a little more CPU-dependent, I doubt the 326 = X6800. Look at BF2 and SS2.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Originally posted by: Zebo
Old news.

A Celeron D is just as good as an X6800 at anything at or above 12x10 with AA/AF.

Save your cash for video, where it really counts.

During processor tests on games at those crappy 10x7 resolutions, you should look at the FPS: most ANY processor can do above 60, which is good enough for any game out, and that's what it will do at 19x12 too, assuming your graphics card can keep up. Most don't.


I think this sums it up!

 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: Zenoth
No, it's not sarcasm; it's known by now that even the best of the Core 2 Duo series won't unlock any 8800 GTX's "full potential".

Just compare 8800 GTX scores under AMD FX and Intel dual-core CPUs, from the lesser models up to the extreme ones. They keep going up each time you step up in CPU architecture or speed; no 8800 GTX "performance wall" has been reached so far, and there is still power left to be unlocked by CPUs more powerful than anything we can buy today.

I've seen a lot of benchmarks with CrossFire 1900/1950s, for example, where the scores wouldn't go much higher even with a better CPU, sometimes gaining only 50 or 60 points in the CPU section of the benchmark. But in the 8800 GTX's case, at least at lower resolutions like 1024x768, 1280x960 and similar, the scores in the graphics parts of the benchmark still go up with each CPU change. That means the performance wall for the GTX has yet to be reached.

Yeah, maybe at 1024x768.
Did you even click the link I posted before?
Here it is:
http://www.guru3d.com/article/Videocards/391/23

Now click it, and what do you see?
If it's that the G80 is the limiting factor at resolutions at or above 1280x960, then you're right.

But who the hell spends all that money on a G80 and doesn't run it at 1280x960 or greater?