How bottlenecked will this be?

Page 4

cbn

Lifer
Mar 27, 2009
12,968
221
106
I agree with the other poster that the money would be better spent on a 4890 and an X4.

I would argue the 4890 + X4 would beat an X2 + 5850 in most cases, plus you would have a much faster overall system for other tasks as well. Will the system run games well? Sure. But a lot of performance is wasted, hence $$$$ is wasted.

If he can run a Phenom II on that mobo, what you are talking about probably makes more sense from a value standpoint.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Well that is true.

But why would someone buy a new video card if they were satisfied with current graphical settings?

Most likely he is turning down settings on his video card in order to play newer games. If that is happening, then there would be evidence of a GPU bottleneck (otherwise there wouldn't be much difference in FPS).
And why buy a $300 card when a $150 card can give you the same basic playability? To me, if I am looking at video cards and the substantially more expensive one isn't going to deliver better minimums and overall playability because of my CPU, then I would go with the much cheaper card.

We are all just wasting our time, though, because the OP has already said he will be upgrading his CPU.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
And why buy a $300 card when a $150 card can give you the same basic playability?

I need to look more at monitor and AA comparisons, because to me, being able to run higher AA ($300 card vs. $150 card) is just about paper benchmarks unless the image actually looks better to the eyeball.

P.S. The HD 5xxx series does have supersampling AA, and I have read that it produces nicer-looking images.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
World in Conflict is one of the games that benefits from Quad Core, so I’ve looked into it a bit.

Here a dual-core E6850 at 3 GHz (stock) is on par with a quad-core Q6600 overclocked to 2.7 GHz: http://www.bjorn3d.com/read.php?cID=1162&pageID=3901

Now look at the differences in graphics cards: http://www.bjorn3d.com/read.php?cID=1162&pageID=3904

Even at low quality the 8800 GTX is far faster than a 2600XT, and this is without AA. And that gap grows as the detail is increased, like at 1920x1200 @ medium details, where it’s more than twice as fast.

This yet again proves that even in a fringe game like this that takes advantage of quad-core, the GPU makes the biggest difference by far.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
again this isn't a choice of get a CPU or get a video card. the OP asked if he would be bottlenecking a 5850 with his 5600+ X2 and the simple answer is YES. it would be a massive amount of performance down the drain.
Did you see my article? Most tests were run at 1920x1200 with 2xAA, yet the effects of me dropping my processor to 2 GHz were insignificant overall. In contrast, I was getting linear or near linear scaling from doing the same to my GPU.

If he runs 1920x1200 with AA he’ll see little to no performance going down the drain because the effect of his CPU will be trivial compared to the massive bottlenecking he’ll get from his 5850.

Again, the effect of the CPU is vastly overstated, while the effect of the GPU is vastly understated.
 

sandorski

No Lifer
Oct 10, 1999
70,677
6,250
126
Why worry about it? You said you're upgrading the CPU soon, so it doesn't matter.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Like this 4850 http://www.tomshardware.com/reviews/build-balanced-platform,2469.html that, regardless of being paired with a Pentium E6300 or an i7 920, doesn't improve squat?

Again, same article, World in Conflict, http://www.tomshardware.com/reviews/build-balanced-platform,2469-14.html .

Interesting results.

I wish they would release the AMD article already.
Very insightful article indeed, great find. I like it because it shows not only the relationship of bottlenecking but also its application dependence. Again, I have to stress the importance of considering the application when discussing bottlenecking. While one can make generalizations when making a recommendation, it's important to remember the scope of use.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Did you see my article? Most tests were run at 1920x1200 with 2xAA, yet the effects of me dropping my processor to 2 GHz were insignificant overall. In contrast, I was getting linear or near linear scaling from doing the same to my GPU.

If he runs 1920x1200 with AA he’ll see little to no performance going down the drain because the effect of his CPU will be trivial compared to the massive bottlenecking he’ll get from his 5850.

Again, the effect of the CPU is vastly overstated, while the effect of the GPU is vastly understated.
Yes, I saw your article, but I have a completely different experience than you in most games. Your results certainly don't reflect the differences many games really show when using different CPU speeds. My numbers seem right in line with what sites like PCGH get when they do a CPU benchmark for a game, where yours do not.

I also play through different levels at the lower CPU speeds to see what real-world differences are there. Games like Far Cry 2, Red Faction Guerrilla, RE5, Ghostbusters, Batman AA, Crysis and others show a very significant framerate drop. Some of those games become noticeably sluggish at times, and games like Ghostbusters, GTA 4 and Red Faction Guerrilla become almost unplayable during action. Sure, there are plenty of games where it makes very little difference in gameplay, but even in most of those cases I see a noticeable framerate difference.

I still stand by what I have said, since I know what I am seeing in games, and again my results reflect what PCGH gets in those games too. I have probably spent way more time testing all my games at different CPU speeds than actually playing them. Sticking a 5850 with a CPU like a 5600+ X2 is still a very big waste of what that video card can do.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Why worry about it? You said you're upgrading the CPU soon, so it doesn't matter.

I guess this thread has been hijacked. :)

But for a good reason.

The truth is most of us don't have 4-5 systems in our homes with CPUs and GPUs of several different performance levels and architectures.

Most of us read articles about CPUs, and pretty much without exception those articles show CPU game performance at medium/low quality settings and medium/low resolutions with a very strong graphics card.

This is done to make sure you get no GPU influence.

Likewise, most of us read articles about GPUs, and pretty much without exception those articles use the most powerful/fastest CPU available.

So, as you can see, we have one test that doesn't matter, because no one with a powerful GPU will play at low/medium quality and low resolutions, and another test that is incomplete, because only one CPU is used.

So, our minds translate the differences in the CPU game tests (and other non-game tests) into differences at high/very-high quality and medium-high resolutions.

And the "CPU bottleneck for gaming myth"(?) is created.

That myth(?) basically states that you won't see differences in performance between two given cards because the CPU won't allow the faster/superior card to stretch its legs.

Then we arrive at statements like "the X2 5600+ will hamper/bottleneck the performance of any card faster than the xxxx card. Getting that yyyy card that is faster won't change a thing".

Generally those arguments are only valid in certain specific situations - most commonly low resolutions, and even then a high amount of AA and very high image quality can show differences.

A more accurate statement is "Card X already gives you great performance in that game at a given resolution/max image quality, so a faster/more expensive card is a waste, as you won't notice any gameplay difference".

Does that mean the CPU doesn't matter at all?

No, the CPU matters - a single-core CPU and even the dual-core Pentium D are inadequate for current games.

Other cases of CPU bottleneck happen when a game can use more cores than the current CPU possesses (which is pretty much why single cores are out of modern gaming); in multiple-GPU configurations, as the driver that balances the game load runs on the CPU (and current drivers are actually multi-threaded); in cases where the CPU limits the minimum frame rate due to AI/physics, as in MMOs and RTSes; or when the game can take advantage of a larger amount of cache/L3$.

Of these situations, the most common are multiple-GPU configurations and AI/physics being the handicap, but those really can be solved by better multi-threaded programming.

Still, those situations are far from common, and that explains why a 4850 or a 4890 most of the time doesn't care what the CPU is, as long as it is a 2GHz Athlon X2 or better.
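To put the idea in code: here is a toy model (my sketch - the millisecond figures are invented for illustration, not from any benchmark in this thread) where each frame needs both CPU and GPU work, and the slower of the two caps the frame rate.

```python
def frame_rate(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: a frame needs cpu_ms of CPU work and gpu_ms of GPU work;
    the slower stage determines the frames per second."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# At low resolution the GPU finishes quickly, so the CPU sets the cap (CPU-bound):
print(frame_rate(cpu_ms=12.0, gpu_ms=5.0))   # ~83 FPS

# At 1920x1200 with AA the GPU workload dominates (GPU-bound),
# so a faster CPU barely changes the result:
print(frame_rate(cpu_ms=12.0, gpu_ms=30.0))  # ~33 FPS
print(frame_rate(cpu_ms=6.0, gpu_ms=30.0))   # still ~33 FPS
```

This is why the same CPU can look like a huge bottleneck in a low-resolution CPU benchmark and an irrelevant one at 1920x1200 with AA.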
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I guess this thread has been hijacked. :)

But for a good reason.

The truth is most of us don't have 4-5 systems in our homes with CPUs and GPUs of several different performance levels and architectures.

Most of us read articles about CPUs, and pretty much without exception those articles show CPU game performance at medium/low quality settings and medium/low resolutions with a very strong graphics card.

This is done to make sure you get no GPU influence.

Likewise, most of us read articles about GPUs, and pretty much without exception those articles use the most powerful/fastest CPU available.

So, as you can see, we have one test that doesn't matter, because no one with a powerful GPU will play at low/medium quality and low resolutions, and another test that is incomplete, because only one CPU is used.

So, our minds translate the differences in the CPU game tests (and other non-game tests) into differences at high/very-high quality and medium-high resolutions.

And the "CPU bottleneck for gaming myth"(?) is created.

That myth(?) basically states that you won't see differences in performance between two given cards because the CPU won't allow the faster/superior card to stretch its legs.

Then we arrive at statements like "the X2 5600+ will hamper/bottleneck the performance of any card faster than the xxxx card. Getting that yyyy card that is faster won't change a thing".

Generally those arguments are only valid in certain specific situations - most commonly low resolutions, and even then a high amount of AA and very high image quality can show differences.

A more accurate statement is "Card X already gives you great performance in that game at a given resolution/max image quality, so a faster/more expensive card is a waste, as you won't notice any gameplay difference".

Does that mean the CPU doesn't matter at all?

No, the CPU matters - a single-core CPU and even the dual-core Pentium D are inadequate for current games.

Other cases of CPU bottleneck happen when a game can use more cores than the current CPU possesses (which is pretty much why single cores are out of modern gaming); in multiple-GPU configurations, as the driver that balances the game load runs on the CPU (and current drivers are actually multi-threaded); in cases where the CPU limits the minimum frame rate due to AI/physics, as in MMOs and RTSes; or when the game can take advantage of a larger amount of cache/L3$.

Of these situations, the most common are multiple-GPU configurations and AI/physics being the handicap, but those really can be solved by better multi-threaded programming.

Still, those situations are far from common, and that explains why a 4850 or a 4890 most of the time doesn't care what the CPU is, as long as it is a 2GHz Athlon X2 or better.

All that typing, but in the end you come up with some random CPU that you deem sufficient? lol. Many CPU benchmarks, like the ones from pcgameshardware.com, are actually run at realistic game settings and resolutions. It is very clear that when running a higher-end GPU, much of its performance could never be attained with a low-end CPU. Your 2.0GHz Athlon X2 example would bottleneck the crap out of even a decent midrange card and would be laughable with a 4890. That would be like me running my CPU at 1.3GHz, and I guarantee you that even at 1920 I would lose 40-50% of what a 4890 could do in modern games. Hell, even if you don't factor in how much performance would go down the drain, just putting my CPU at 1.3GHz would make many modern games mostly unplayable anyway.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Your 2.0GHz Athlon X2 example would bottleneck the crap out of even a decent midrange card and would be laughable with a 4890.

If I change it to Core2Duo E6400 will you be happier?

Edit: the reality is that tests like these are really hard to come by, but let's look at pcgameshardware.com:
Crysis Warhead: Scalability of CPU clock speed and number of cores

8800GTX, which is roughly equivalent to a 4850 - a decent mid-range GPU.

Q6600 with 2 cores enabled, which basically means it is an E6600.

8800GTX - 2 cores @ 2.6GHz
1024x768 Enthusiast:
min/avg - 29.0/33.2

4870 - 2 cores @ 2.6GHz
1024x768 Enthusiast:
min/avg - 31.0/41.9

8800GTX - 2 cores @ 3.5GHz
1024x768 Enthusiast:
min/avg - 29.0/33.2

4870 - 2 cores @ 3.5GHz
1024x768 Enthusiast:
min/avg - 34.0/43.9
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
If I change it to Core2Duo E6400 will you be happier?
Yes, I would be happier. lol. I personally would say an E6600 is about the lowest CPU I would recommend with a 4890-level card. Even then, overclocking would help quite a bit in several games.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Yes, I would be happier. lol. I personally would say an E6600 is about the lowest CPU I would recommend with a 4890-level card. Even then, overclocking would help quite a bit in several games.

Just don't expect much in, for example, Crysis Warhead:

http://www.pcgameshardware.com/aid,...ock-speed-and-number-of-cores/Reviews/?page=3

4870
Enthusiast - 1680x1050
E6600 @ 2.6 (actually Q6600 with only 2 cores)
min/avg - 23/26.3

E6600 @ 3.5 (actually Q6600 with only 2 cores)
min/avg - 23/27.3

I don't mean to resurface the argument, but at least at stock, from the reviews I've read, the 5600+ and E6600 are relatively close.

http://www.firingsquad.com/hardware/amd_athlon_64_x2_5600/page9.asp

If I were you, I would still upgrade the CPU - a quad-core of a newer architecture would improve your gameplay and non-gameplay experience - but don't feel too bad about having a 5850 in that rig for the time being, especially with you running at 1920x1200.

See this thread for more articles about CPU vs. GPU bottlenecking:

http://forums.anandtech.com/showthread.php?p=28968579#post28968579
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
I don't mean to resurface the argument, but at least at stock, from the reviews I've read, the 5600+ and E6600 are relatively close.

http://www.firingsquad.com/hardware/amd_athlon_64_x2_5600/page9.asp
Look at newer game reviews, because the E6600 is usually quite a bit better. In newer games an E6600 usually matches or beats a 3.2GHz Athlon 64 X2. Plus, I said that would be the minimum I would recommend for a 4890. A 5850 is a bit better, and I would certainly recommend a little faster CPU. A modern quad like a Phenom II, or especially an i5/i7, would deliver the best overall experience with a 5850.


EDIT: see here, where the E6600 smokes even the 3.2GHz 6400+ X2 by over 30%. http://www.pcgameshardware.com/aid,...nom-strong-Update-Lynnfield-results/Practice/
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Look at newer game reviews, because the E6600 is usually a bit better. Plus, I said that would be the minimum I would recommend for a 4890. A 5850 is a bit better, and I would certainly recommend a little faster CPU. A modern quad like a Phenom II, or especially an i5/i7, would deliver the best overall experience with a 5850.

Curiously, I've found this - still trying to see what their test method is.

http://translate.google.com/transla...fikchipliste/Leistung_Graka.htm&sl=auto&tl=en
 

SOWK

Junior Member
Nov 25, 2009
5
0
0
Currently have

AMD X2 5600+ 2.8ghz
8800GT
4GB DDR2 Gskill

I just ordered my Gigabyte ATI 5850 and am patiently waiting for it to be available. I'm thinking of upgrading with the Boxing Day sales to an AMD Phenom X4 965 or i5 750. Regardless, I'm just wondering how big of a bottleneck I will be placing on this video card with my current CPU?

Games run at 1920x1200

Very simple to answer!!!!


Just run the games at 640x480. Is there a performance increase?

If not, a new GPU is not going to help.
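That test can be written down as a simple rule of thumb (a sketch; the 10% threshold is an arbitrary choice for illustration, not from this thread):

```python
def resolution_test(fps_native: float, fps_640x480: float,
                    threshold: float = 0.10) -> str:
    """Compare FPS at native resolution vs. 640x480.
    If dropping the resolution barely helps, the CPU is the limit."""
    gain = (fps_640x480 - fps_native) / fps_native
    if gain < threshold:
        return "CPU-bound: a faster GPU won't help much"
    return "GPU-bound: a faster GPU should help"

print(resolution_test(45.0, 48.0))  # CPU-bound: a faster GPU won't help much
print(resolution_test(45.0, 90.0))  # GPU-bound: a faster GPU should help
```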
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Very simple to answer!!!!


Just run the games at 640x480. Is there a performance increase?

If not, a new GPU is not going to help.

Underclocking his CPU and GPU will show the effect better at his target resolution.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
A case of a game that likes cores and cache - far from the norm, though, but something to consider for the future.

It would be interesting to understand exactly what makes this game so CPU-bound - physics?
Well, my point was that in newer, more CPU-intensive games the E6600 puts even more distance between itself and the old Athlon 64 X2 CPUs.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Well, my point was that in newer, more CPU-intensive games the E6600 puts even more distance between itself and the old Athlon 64 X2 CPUs.

No doubt - the Core 2 architecture is superior to K8.

Most of the time though, games tend to be more GPU intensive than CPU intensive.

Even an E8400 is 30% slower than a Q9550.
 
Last edited:

SOWK

Junior Member
Nov 25, 2009
5
0
0
Underclocking his CPU and GPU will show the effect better at his target resolution.

How would underclocking his system show him the benefits of getting a new video card?


To see if his performance falls?

Example A:

Underclock CPU: Game frame rates stay the same, Get Better GPU?

Example B:

Underclock GPU: Game Frame rates stay the same, Get Better CPU?

is this the formula you are trying to get across?
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126


How would underclocking his system show him the benefits of getting a new video card?


To see if his performance falls?

Example A:

Underclock CPU: Game frame rates stay the same, Get Better GPU?

Example B:

Underclock GPU: Game Frame rates stay the same, Get Better CPU?

is this the formula you are trying to get across?

Well, if you underclock or overclock your GPU by a certain percentage and your frame rate drops/increases by the same percentage, it shows the game is using all the GPU power it can find.

Likewise for the CPU.

Of course, sometimes it's about CPU/GPU architecture.
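In code, that scaling check looks something like this (a sketch; the clock speeds and frame rates in the example are invented for illustration):

```python
def scaling_factor(fps_stock: float, fps_underclocked: float,
                   clock_stock: float, clock_underclocked: float) -> float:
    """Fraction of a clock-speed change that shows up in the frame rate.
    ~1.0 means fully bound by this component; ~0.0 means it barely matters."""
    fps_change = (fps_stock - fps_underclocked) / fps_stock
    clock_change = (clock_stock - clock_underclocked) / clock_stock
    return fps_change / clock_change

# Underclock the GPU core from 850 MHz to 600 MHz (a ~29% drop).
# FPS falling from 60 to 43 (~28%) gives a factor near 1.0 -> GPU-bound.
print(scaling_factor(60.0, 43.0, 850.0, 600.0))  # ~0.96

# FPS barely moving (60 -> 58) gives a factor near 0 -> not GPU-bound.
print(scaling_factor(60.0, 58.0, 850.0, 600.0))  # ~0.11
```

The same function works for CPU clocks, which is exactly the test BFG10K and toyota were running from opposite directions.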
 

AzN

Banned
Nov 26, 2001
4,112
2
0
and why buy a $300 card when a $150 card can give you the same basic playability? to me if I am looking at video cards and the substantially more expensive one isnt going to deliver better minimums and overall playability because of my cpu then I would go with the much cheaper card.

we are all just wasting our time though because the op has already said he will be upgrading his cpu.

Because an 8800GT won't let you play at 1920x1200 comfortably.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Because an 8800GT won't let you play at 1920x1200 comfortably.
I don't think you read my comment closely, because your reply doesn't really make sense. I never said anything about his current 8800GT.
 
Last edited: