How bottlenecked will this be?


cbn

Lifer
Mar 27, 2009
12,968
221
106
It does shift in those games that I listed. Again, my point is that using a 5850 with a 5600 X2 means a crapload of performance will be going down the drain, and that is a fact.

At what point do you consider additional AA a waste of money? I am just curious because I haven't been able to determine this for myself yet.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
At what point do you consider additional AA a waste of money? I am just curious because I haven't been able to determine this for myself yet.
I will take a setup with better minimum framerates and overall playability before I would worry about seeing how much AA I can use because I am CPU bottlenecked. Personally I don't use that much AA anyway unless a game looks really jaggy. There are some games like Dead Space where you can't even use normal AA at all, which is a little frustrating.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I don't think anyone said that a 2.8 X2 was 'much slower than a 1.8 C2D'.

What is true, though, is that looking at the AnandTech review of the E6300 and E6400, the 2.13GHz E6400 w/2MB L2 on a 1066 FSB was a shade faster than the 2.6GHz X2, and just a hair slower than the 2.8GHz X2. The 4MB C2Ds (E6600 and above) just run away with it, let alone the newer ones with 6MB L2 / 1333FSB (the Wolfdale E8xxxs).

Given that the E6300/E6400 came out well over three years ago (and they weren't top of the line then, either), the even-older Athlon X2s are getting pretty long in the tooth for gaming with a new high-end video card.

Toyota is the one who's telling everybody that the X2 at 2.80GHz is slower than a Core 2 Duo at 1.86GHz, which is not true. It's as if the advantage Conroe has over the K8 were the same advantage the K8 had over the NetBurst architecture, which is not true.
If you think his 5600 X2 is remotely as fast as your Q6600 @ 3.2 even in games that don't use more than 2 cores then you are very mistaken. Turn off two of your cores and lower your CPU to 2.0 and you will still equal or in most cases outperform a 2.8 5600 X2.

The X2 running at 2.8GHz is quite competitive with a Conroe Core 2 Duo running between 2.16GHz and 2.2GHz depending on the gaming scenario, with the slight edge going to the Core architecture. The Athlon 64 X2 (K8) architecture is slightly less efficient than the Conroe architecture, but it was much faster than NetBurst, whose dual-core incarnation running at 3.20GHz was only able to match a K8 running at 2.0GHz. So how does the K8, which was able to smoke the NetBurst architecture, suddenly run much slower than Conroe?

http://www.anandtech.com/showdoc.aspx?i=2802&p=9

In HL2: Episode One, the Conroe at 2.40GHz is slightly faster than the Athlon FX running at 2.80GHz, which is slightly faster than the Core 2 at 2.13GHz, and the Core 2 at 1.86GHz is slightly faster than the Athlon X2 at 2.40GHz.

In Battlefield 2, the Core 2 at 2.40GHz is faster than the FX at 2.80GHz, which is slightly faster than the Core 2 at 2.13GHz. The Core 2 at 1.86GHz is slightly faster than the Athlon X2 at 2.60GHz.


"The processor landscape has been changed once more thanks to AMD's extremely aggressive price cuts. The Core 2 Duo E6300 is a better performer than the X2 3800+ but is also more expensive, thankfully for the E6300's sake it is also faster than the 4200+ and the 4600+ in some benchmarks. Overall the E6300 is a better buy, but at stock speeds the advantage isn't nearly as great as the faster Core 2 parts. In many benchmarks the X2 4200+ isn't that far off the E6300's performance, sometimes even outperforming it at virtually the same price. Overclocking changes everything though, as our 2.592GHz E6300 ended up faster than AMD's FX-62 in almost every single benchmark. If you're not an overclocker, then the Athlon 64 X2 4200+ looks to be a competitive alternative to the Core 2 E6300.

The E6400 finds itself in between the X2 4200+ and X2 4600+ in price, but in performance the E6400 generally lands in between the 4600+ and 5000+. Once again, with these 2MB parts the performance advantage isn't nearly as impressive as with the 4MB parts (partly due to the fact that their native clock speed is lower, in addition to the smaller L2 cache), but even with AMD's new price cuts the Core 2 is still very competitive at worst. If you're not opposed to overclocking, then the E6400 can offer you more than you can get from any currently shipping AMD CPU - our chip managed an effortless 2.88GHz overclock which gave us $1000 CPU performance for $224. "

http://www.anandtech.com/showdoc.aspx?i=2795&p=14

In Quake 4, the Core 2 at 2.40GHz is faster than the FX at 2.80GHz by 6fps, and the FX at 2.80GHz is faster than the Core 2 Duo at 1.86GHz by 20fps, which in turn is slightly faster than the Athlon X2 at 2.0GHz.

In Battlefield 2, the same scenario repeats: the Core 2 at 1.86GHz is slightly faster than the Athlon X2 at 2.60GHz, by 1.5fps.

In HL2: Episode One the same story happens, but this time the Athlon X2 at 2.60GHz is slightly faster than the Core 2 Duo at 1.86GHz.

Etc., etc. In the end, the Athlon X2 at 2.80GHz offers the performance of a Conroe 2MB CPU running between 2.16GHz and 2.20GHz.

"We're still waiting to get our hands on the E6400 as it may end up being the best bang for your buck, but even the slower E6300 is quite competitive with AMD's X2 4200+ and X2 3800+. If AMD drops the price on those two parts even more than we're expecting, then it may be able to hold on to the lower end of the performance mainstream market as the E6300 is not nearly as fast as the E6600. "

But in the end, in my opinion, even the Conroe at 2.40GHz can be a serious bottleneck for an HD 5850 regardless. A Wolfdale Core 2 Duo at 3.0GHz or more is more balanced thanks to its improved IPC. But the i5 750 is a perfect match for it.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
If you think his 5600 X2 is remotely as fast as your Q6600 @ 3.2 even in games that don't use more than 2 cores then you are very mistaken. Turn off two of your cores and lower your CPU to 2.0 and you will still equal or in most cases outperform a 2.8 5600 X2.

What he needs to do is underclock his CPU to 2.0GHz and then bench an 8800GT and a 5850. See how much the 5850 loses, and how much he gains from going from an 8800GT to a 5850 with his CPU @ 2.0GHz.

In some games the Core 2 and the extra cores will give huge improvements over the X2. But in cases where the architecture isn't the limiting factor, with max detail, high resolutions and some level of AA (and I think 4x/8x is generally more than enough), I bet the difference won't be that huge, and more is gained from the change from an 8800GT to a 5850 than from going from a 2.0GHz to a 3.2GHz CPU. (Unfortunately I can't test it, and finding that kind of data is a pain; for some reason when people test CPUs they only test at low resolutions, not max quality, and with no AA/AF.)

Which is what I see when I underclock my X2 from 3.0 to 2.0GHz and see next to no difference in benchmark frame rates (both average and minimum) at 1680x1050. At lower resolutions I see a bit more.

Underclocking my 4850 from 720/1000 to 540/750, on the other hand, gives almost exactly 25% fps drops at higher resolutions. BTW, minimum fps are pretty much the same too at 2.0GHz or at 3.0GHz.
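For anyone who wants to sanity-check that kind of experiment, here is a minimal sketch of the arithmetic. The function and all the numbers below are made up for illustration (they are not my benchmark results); plug in your own measurements.

```python
# Rough check of which component is limiting you, based on two underclocking runs.
# A scaling ratio near 1.0 means fps tracked that clock change (you are limited by that part);
# near 0 means the clock change barely mattered.

def scaling_ratio(fps_before, fps_after, clock_before, clock_after):
    fps_change = (fps_before - fps_after) / fps_before
    clock_change = (clock_before - clock_after) / clock_before
    return fps_change / clock_change

# Example: CPU 3.0 -> 2.0 GHz barely moves the frame rate...
print(scaling_ratio(fps_before=60.0, fps_after=58.0, clock_before=3.0, clock_after=2.0))     # ~0.1, not CPU-bound
# ...while GPU core 720 -> 540 MHz drops it by roughly the same 25% as the clock.
print(scaling_ratio(fps_before=60.0, fps_after=45.0, clock_before=720.0, clock_after=540.0))  # ~1.0, GPU-bound
```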
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
Meh, double post, so I will turn it into something else.

I love this quote from this testing of the Athlon II X3 435 http://www.guru3d.com/article/athlon-ii-x3-435-processor-review-test/14:

Yes, the Athlon II X3 lacks the raw horsepower for gaming with high-end graphics cards. Fair enough, it's still plenty fast enough to play the games, don't get me wrong here. But in the lower resolutions you definitely can tell the difference.

The card used was a GTX280 OC. "Yes, MUM, I want a GTX280 OC with an i5 750 to play @ 1024x768 on my 22"-24" screen so it can show a bigger difference over the piss-poor Athlon II X3 435, even if at higher resolutions there is nil difference."
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The card used was a GTX280 OC. "Yes, MUM, I want a GTX280 OC with an i5 750 to play @ 1024x768 on my 22"-24" screen so it can show a bigger difference over the piss-poor Athlon II X3 435, even if at higher resolutions there is nil difference."
Professional gamers do that a lot. Different strokes for different folks.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Professional benchers, you mean?

No, pro gamers do this all the time, particularly in FPS games. The typical approach is to go for a high resolution with absolutely minimal detail, so that smoke/effects aren't as distracting, and it's easier to get headshots and for your brain to process data that way. Detail is mostly superfluous in those types of games; the only thing that is important is seeing enemies/vehicles/etc.

IMHO it's almost cheating, as a player with certain effects/details disabled will be able to see straight through explosions/smoke that others can't see through clearly.

It varies from title to title, obviously. But for twitch PC FPS players, having a stout system still makes sense, as you never_ever_ever want the frames per second to dip below around 60. Besides whatever BS some people want to say about people not being able to see past 30fps, most decent FPS players can tell the improvement that going from 30fps to 60fps gives, and, if using a CRT or a legit high-refresh LCD, going from 60 to 120fps. A proper sustainable 100+fps w/Vsync is gold, and back in the days of Q2/Q3/RTCW/Battlefield 1942/Battlefield 2 I kept an old 21" Sony CRT for a long time just to keep the ability to do 1600x1200 at 85Hz and 1280x1024 at 100Hz.

EDIT: To evolucion8, great post up there ^^ pretty much sums things up well. For all practical purposes, a 2.8GHz X2 is about as fast as an E6400, maybe a shade quicker, but pretty equal. Neither one is going to see a 5850 reach its true potential.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
But in the end, in my opinion, even the Conroe at 2.40GHz can be a serious bottleneck for an HD 5850 regardless. A Wolfdale Core 2 Duo at 3.0GHz or more is more balanced thanks to its improved IPC. But the i5 750 is a perfect match for it.
That's some excellent research, great post :).
No, pro gamers do this all the time, particularly in FPS games. The typical approach is to go for a high resolution with absolutely minimal detail, so that smoke/effects aren't as distracting, and it's easier to get headshots and for your brain to process data that way. Detail is mostly superfluous in those types of games; the only thing that is important is seeing enemies/vehicles/etc.

IMHO it's almost cheating, as a player with certain effects/details disabled will be able to see straight through explosions/smoke that others can't see through clearly.

It varies from title to title, obviously. But for twitch PC FPS players, having a stout system still makes sense, as you never_ever_ever want the frames per second to dip below around 60. Besides whatever BS some people want to say about people not being able to see past 30fps, most decent FPS players can tell the improvement that going from 30fps to 60fps gives, and, if using a CRT or a legit high-refresh LCD, going from 60 to 120fps. A proper sustainable 100+fps w/Vsync is gold, and back in the days of Q2/Q3/RTCW/Battlefield 1942/Battlefield 2 I kept an old 21" Sony CRT for a long time just to keep the ability to do 1600x1200 at 85Hz and 1280x1024 at 100Hz.

EDIT: To evolucion8, great post up there ^^ pretty much sums things up well. For all practical purposes, a 2.8GHz X2 is about as fast as an E6400, maybe a shade quicker, but pretty equal. Neither one is going to see a 5850 reach its true potential.
Exactly. Except I'll add that they never want to see the FPS dip below 100 (and that's where high-end CPUs come in).
 

AzN

Banned
Nov 26, 2001
4,112
2
0
If you think his 5600 X2 is remotely as fast as your Q6600 @ 3.2 even in games that don't use more than 2 cores then you are very mistaken. Turn off two of your cores and lower your CPU to 2.0 and you will still equal or in most cases outperform a 2.8 5600 X2.

You obviously don't see the obvious. Going from an 8800GT to a 5850 he got 2x the performance at the same CPU clocks. Regardless of CPU, it's still going to be 2x faster.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I will take a setup with better minimum framerates and overall playability before I would worry about seeing how much AA I can use because I am CPU bottlenecked. Personally I don't use that much AA anyway unless a game looks really jaggy. There are some games like Dead Space where you can't even use normal AA at all, which is a little frustrating.

Try benchmarking any game @ 800x600 @ 2GHz with your GTX260. See what that does for your frame rates. I'm willing to bet it does wonders. When you eliminate the GPU bottleneck, that is roughly the FPS you will get regardless of resolution. With that in mind, a 2GHz Core 2 Duo is still fast enough to play any game out there except for bad console ports like GTA4.

Again, the GPU makes the biggest difference in games, and at 1920 as the original poster asked. Just like a 5970 can pump out 2x the performance of a 5850 with the same CPU in any of the reviews we have available, so will a 5850 coming from an 8800GT with a 2GHz Core 2 Duo.
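A crude way to picture why the low-resolution trick works: treat each frame as costing some CPU time and some GPU time, and whichever takes longer sets your frame rate. This is only a toy model, and every number in it is invented for illustration, not measured.

```python
# Toy bottleneck model: frame rate is set by the slower of the CPU and GPU per frame.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 12.0  # CPU work per frame barely changes with resolution

# GPU cost grows with resolution; dropping to 800x600 shrinks it until only the CPU matters.
for res, gpu_ms in [("800x600", 4.0), ("1680x1050", 11.0), ("1920x1200", 16.0)]:
    print(res, round(fps(cpu_ms, gpu_ms), 1))
# 800x600    83.3  <- CPU-limited: this is the ceiling you carry to any resolution
# 1680x1050  83.3  <- still CPU-limited
# 1920x1200  62.5  <- the GPU has become the bottleneck
```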
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Meh, double post, so I will turn it into something else.

I love this quote from this testing of the Athlon II X3 435 http://www.guru3d.com/article/athlon-ii-x3-435-processor-review-test/14:



The card used was a GTX280 OC. "Yes, MUM, I want a GTX280 OC with an i5 750 to play @ 1024x768 on my 22"-24" screen so it can show a bigger difference over the piss-poor Athlon II X3 435, even if at higher resolutions there is nil difference."

They are comparing CPUs, though. You test at lower resolutions to eliminate any GPU bottleneck. In this case the i5 is the superior CPU for gaming.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
They are comparing CPUs, though. You test at lower resolutions to eliminate any GPU bottleneck. In this case the i5 is the superior CPU for gaming.

Except I don't frigging play at lower resolutions. Do you?

No, pro gamers do this all the time, particularly in FPS games. The typical approach is to go for a high resolution with absolutely minimal detail, so that smoke/effects aren't as distracting, and it's easier to get headshots and for your brain to process data that way. Detail is mostly superfluous in those types of games; the only thing that is important is seeing enemies/vehicles/etc.

And do they play at 1024x768?
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Except I don't frigging play at lower resolutions. Do you?
+1, funny and true. If you have to artificially lower your resolution just to be able to show a significant CPU bottleneck, because it is otherwise non-existent at higher resolutions, then there's effectively no CPU bottleneck. It only matters if it also happens at 1680x1050 or 1920x1080 or whatever high-res setting you are using.

Perhaps that's the point that should come across: a CPU bottleneck is only significant if it still shows up at the high resolutions you actually play at.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
You can think whatever you want. I personally test all my games at 1920x1080, and regardless of what you claim a CPU like his will bottleneck the crap out of a 5850, especially when it comes to minimum framerates. You can keep comparing a video card at different speeds, where I was mainly focusing on how much actual performance would go to waste using a CPU equivalent to his. Considering just how much I lose even with a GTX260, having a video card twice as fast would be a gigantic waste of its performance.
His CPU won’t bottleneck his 5850 if he runs at high enough detail levels (say 1920x1200 with 4xAA) because his GPU will bottleneck him the most, by far. His CPU is roughly equal to mine @ 2 GHz, and I can say that pairing mine @ 2 GHz with a Fermi/5870 would create a massive performance gain over my current GTX285.

Heck, I might even benchmark Fermi with my CPU @ 2 GHz just to prove my point, yet again.
But even using your type of benching, in RE5, FEAR 2, Ghostbusters, Red Faction: Guerrilla and some other games I get better performance with my CPU at 3.16 and GPU at 465/971/1620 than I do with my CPU at 1.8 (equal to his 5600 X2) and GPU at 666/1392/2200.
Where are the benchmark numbers? At what settings?

In the meantime, here's another article where they compared an i7 920 at 2 GHz and 4 GHz with a 5970, and in all cases they got little to no performance change despite the CPU being underclocked by 50%.

http://www.legionhardware.com/document.php?id=869&p=2

Incidentally they started off with Batman, and they didn’t even use any AA when they could’ve easily used 2xAA or 4xAA. Also Far Cry 2, Wolfenstein and Crysis are in there too, and their results back my claims.
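To put the upgrade question itself in the same terms, here is an illustrative calculation of how much of a faster card survives a fixed CPU ceiling. The fps figures are invented for the example and are not taken from the linked article.

```python
# Illustrative only: gain from a GPU upgrade when the CPU ceiling stays put.

def effective_fps(cpu_ceiling_fps, gpu_capable_fps):
    return min(cpu_ceiling_fps, gpu_capable_fps)

cpu_ceiling = 70.0             # what the CPU alone could feed, roughly constant across resolutions
old_gpu, new_gpu = 35.0, 75.0  # what each card could render if the CPU never held it back

before = effective_fps(cpu_ceiling, old_gpu)  # 35 fps: fully GPU-bound on the old card
after = effective_fps(cpu_ceiling, new_gpu)   # 70 fps: now capped by the CPU
print(f"gain: {after / before:.2f}x, GPU headroom left unused: {1 - after / new_gpu:.0%}")
# gain: 2.00x, GPU headroom left unused: 7%
```

With these particular made-up numbers you get both halves of the argument at once: a big real-world gain from the new card, and a slice of its potential left on the table.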
 

SRoode

Senior member
Dec 9, 2004
243
0
0
You obviously don't see the obvious. Going from an 8800GT to a 5850 he got 2x the performance at the same CPU clocks. Regardless of CPU, it's still going to be 2x faster.

I guess that's what I'm saying here. For me, these are the results I got. Toyota seems to be overly concerned with CPU limitation. Granted, a more powerful CPU will increase the performance of a video card, but in most cases, you will see a greater improvement (at higher resolutions) when you put in a more powerful video card. That being said, I will agree that you may not be using ALL of your video card's potential (read - make the GPU the bottleneck instead of the CPU), but then again, how many of us out there are in that situation now?

Just because you do not push your video card to its "limit" does not mean you will not see a huge increase in performance when you upgrade your video card. That is even more true today when you want to run your games at 1920x1200 (or beyond) with 16x AF, 4x or 8x AA, and high quality textures.


Edit: Also, I'm not saying the OP will get a 2x or 3x increase. He may or may not be CPU limited. He will however see a huge increase in performance, even with his current CPU.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
His CPU won’t bottleneck his 5850 if he runs at high enough detail levels (say 1920x1200 with 4xAA) because his GPU will bottleneck him the most, by far. His CPU is roughly equal to mine @ 2 GHz, and I can say that pairing mine @ 2 GHz with a Fermi/5870 would create a massive performance gain over my current GTX285.

Heck, I might even benchmark Fermi with my CPU @ 2 GHz just to prove my point, yet again.

The thing is that 1920x1200 = 2,304,000 pixels while 1024x768 = 786,432. That is almost 3x the number of pixels (2.93x to be exact).

CPU X needs to be at least 3x faster than CPU Z at 1024x768 for that gap to be of any significant relevance at higher resolutions.
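For reference, the pixel arithmetic itself (what you conclude from it is a separate argument):

```python
# Pixel counts for the two resolutions being compared.
high = 1920 * 1200   # 2,304,000
low = 1024 * 768     # 786,432
print(high, low, round(high / low, 2))  # ratio ~2.93
```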

AI and physics can take a toll on systems, that is for sure. But those cases are still uncommon, and extra cores with reasonable programming should solve them.

And unless you are playing such a game, one processor needs to be 3x or more faster than another for the CPU, rather than the GPU, to be what holds you back at those resolutions.

The future will of course see quad-cores replace dual-cores, as we saw dual-cores replace single cores (mostly because the game thread will run on one core and the GPU driver will run on another core), but we have had quad-cores for 3 years and only a handful of games take advantage of them.

I guess the real answer to the OP's question is: most games won't see a CPU bottleneck at 1920x1200, so a better GPU will significantly improve your gaming experience. In a few other cases the CPU will be the limiting factor, generally by requiring more cores, and once you have those cores, raw CPU power stops mattering and the GPU becomes much more important again; i.e. once you have a tri- or quad-core (depends on the game) it doesn't matter much if it is a Q6600, an Athlon II X4 or an i7.

EDIT: A somewhat old article comparing an Athlon X2 5200+ vs a Core 2 Duo E6400 using an 8800GTX: http://www.legionhardware.com/document.php?id=621&p=4
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Except I don't frigging play at lower resolutions. Do you?

When you do get some more breathing room with your video card at higher resolutions, the i5 is going to be faster. This is the reason why reviewers test at lower resolutions.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Edit: Also, I'm not saying the OP will get a 2x or 3x increase. He may or may not be CPU limited. He will however see a huge increase in performance, even with his current CPU.

That's what I see as well. You get more gains with a 5850 and a 2GHz C2D than an 8800GT with an i7 at 4GHz ever will.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
When you do get some more breathing room with your video card at higher resolutions, the i5 is going to be faster. This is the reason why reviewers test at lower resolutions.

That is true, but it is still going to take time, as GPUs will have to be 6 or 7 times faster than the previous generation before the CPU starts making a difference.

That is why we only see the i7 taking a bigger lead over the Phenom II and the C2Q 9x50 with 2x 4870X2 or 2x GTX295 setups.

I will be looking to see how a single 5970, CrossFire 5870s and 2x 5970 scale with different PC architectures, although the article BFG10K linked shows that even when dropping the i7's speed by 50%, the FPS drops are in the ~10% range at most.

No, not at all, because some of you still don't understand that bottlenecking is application dependent.

It is.

But games as a category aren't really CPU bottlenecked compared with other applications, are they?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
ROFL.. This puts Toyota's theory out the window.
It's not a theory at all. If I run my CPU at 1.8 it's as fast as his 5600 X2. When I do that, even at 1920 with AA it results in noticeably lower minimum and lower average framerates compared to running it at 3.16. Heck, some very CPU-dependent games even become almost unplayable at times with my CPU at 1.8. Also, SEVERAL games are FASTER with my GPU at 465 and CPU at 3.16 than with my GPU at 666 and CPU at 1.8, and that's a FACT.

Common sense will tell you that if I am getting 15 fps for a minimum framerate because of my CPU, then using a 5850 isn't going to change that. A 5600 X2 would limit a 5850, especially in the minimum framerate department, regardless of what you think.

Those i7 results you are looking at don't remotely compare to his 5600 X2. An i7 even at lower speeds can handle high-end and multi-GPU setups much better than even the Phenom II X4. An old 5600 X2 would pale in comparison if it were included.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
It's not a theory at all. If I run my CPU at 1.8 it's as fast as his 5600 X2.

Are you gonna keep repeating the same old tired lie? I posted two links with proof and data that prove you wrong: his X2 at 2.80GHz is competitive with a 2MB-cache Conroe Core 2 Duo running at 2.13-2.20GHz, period. A Core 2 Duo at 1.86 is only competitive with an Athlon running at 2.40-2.60GHz. Read my previous post with the links and educate yourself. In the end, both the Core 2 Duo E6300 and the X2 at 2.80GHz will bottleneck it regardless, but not so badly that we couldn't tell the 5850 apart from his 8800GT in performance.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Are you gonna keep repeating the same old tired lie? I posted two links with proof and data that prove you wrong: his X2 at 2.80GHz is competitive with a 2MB-cache Conroe Core 2 Duo running at 2.13-2.20GHz, period. A Core 2 Duo at 1.86 is only competitive with an Athlon running at 2.40-2.60GHz. Read my previous post with the links and educate yourself. In the end, both the Core 2 Duo E6300 and the X2 at 2.80GHz will bottleneck it regardless, but not so badly that we couldn't tell the 5850 apart from his 8800GT in performance.
You do know my CPU is about 10-15% faster clock for clock than the E6xxx CPUs, don't you? Yes, the 1.86 E6300 could easily keep up with the 2.4-2.6 X2 CPUs, just like you said, so do the math. My CPU at 1.8-1.9 would be an even match in games for a 5600 X2.
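Spelling that math out with the clock-for-clock figures quoted in this thread: both multipliers below are the posters' own claims, not benchmark results, and whether they hold is exactly what is being disputed here.

```python
# Rough clock-for-clock equivalence under the assumptions stated above.

k8_per_conroe = 1.86 / 2.5   # "a 1.86GHz Conroe keeps up with a ~2.4-2.6GHz X2" -> K8 does ~0.74x the work per clock
wolfdale_per_conroe = 1.125  # "my CPU is about 10-15% faster clock for clock than the E6xxx"

wolfdale_clock = 1.85        # GHz, the 1.8-1.9 range mentioned above
x2_equivalent_clock = wolfdale_clock * wolfdale_per_conroe / k8_per_conroe
print(round(x2_equivalent_clock, 2))  # ~2.8 GHz worth of Athlon X2, if those assumptions are right
```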
 