[Toms] CPU bottlenecking in games - the <$200 CPUs

Page 2 - AnandTech forums

blckgrffn

Diamond Member
May 1, 2003
9,287
3,427
136
www.teamjuchems.com
The Phenom II 955 has been discontinued just when it got to the $100 price range. A lot of places still have it listed around $144, such as Amazon. Newegg shows the 955 as discontinued, and they are out of stock.

It seems to me that AMD is keeping the prices of its processors artificially high. The 955 should be in the $75 range by now, but instead the prices are still high.

Judging by the prices of older processors, it appears to me that AMD is trying to steer the market toward their newer CPUs.

I was really hoping to grab a 955 when the price hit $60 or $70, but I guess that ain't gonna happen.

Second hand, that shouldn't be hard....
 

scannall

Golden Member
Jan 1, 2012
1,960
1,678
136
Using a discrete GPU, and a 7970 at that, of course moves the ball to Intel's court. What I was hoping for in that article was bang for the buck. For instance, I am thinking of building my kids an inexpensive desktop that can do some gaming. It would seem to me an A8-3870K for $140 would be great. But what kind of performance would I get spending that same $140 on an Intel CPU and a discrete GPU?

If you're spending big bucks on a GPU, then a sub $200 processor makes no sense anyway.
 

blckgrffn

Diamond Member
May 1, 2003
9,287
3,427
136
www.teamjuchems.com
Using a discrete GPU, and a 7970 at that, of course moves the ball to Intel's court. What I was hoping for in that article was bang for the buck. For instance, I am thinking of building my kids an inexpensive desktop that can do some gaming. It would seem to me an A8-3870K for $140 would be great. But what kind of performance would I get spending that same $140 on an Intel CPU and a discrete GPU?

If you're spending big bucks on a GPU, then a sub $200 processor makes no sense anyway.

Sadly, even there a G530 or a G620 w/ a 6770 or GTS 450 will absolutely blow that A8 away. There is no comparison in gaming. The Ax series is not a good value... they have a smaller die than the old Athlon II X4s and should be priced that way, IMHO.
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
You can buy a G530 w/ an HD 6570 and have similar total power consumption for less. I guess the multithreaded performance is the only pro on the A8 at this point. :(
 

RavenSEAL

Diamond Member
Jan 4, 2010
8,661
3
0
Ahh, now it makes sense. And by 'correctly optimized games' you mean ones where it doesn't absolutely suck performance-wise?

So you can buy an AMD CPU and only play 'correctly-optimized' games and constantly whine when games are not, or just buy Intel? I know which option I would choose.
By correctly optimized games, I mean games that use more than two ****ing cores. :rolleyes:
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
By correctly optimized games, I mean games that use more than two ****ing cores. :rolleyes:

I agree 100% that that is the ideal, but sadly it is not always the practice. I want my CPU to be fast in a game, regardless of whether the dev was good or lazy. :)
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Well, so do I, but we can't all afford 3930Ks. :D

RavenSeal: I'm fortunate enough to own 2 i5-2500k rigs and an 1100T rig. Though the Intel rigs are faster, the AMD 1100T is hardly a slouch and gives me a TRUE 6 core rig for a reasonable price.
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
Someone needs to devise a reliable way to test BF3 in multiplayer. Maybe joining a 64 player Op Metro server and sitting and throwing grenades by the rightmost stairwell for 10 mins per CPU would give pretty consistent results.
 

Ieat

Senior member
Jan 18, 2012
260
0
76
It really is kind of sad that the $80 G630 beats all the Bulldozer chips...
 

OS

Lifer
Oct 11, 1999
15,581
1
76
If AMD would stop discontinuing their older product lines, they would still have a loyal following.

It seems that once the older CPUs drop below a certain price point, the model is phased out.

I am running an OLD AMD Athlon II X4 620. As prices came down on the older CPUs I was hoping to do a CPU swap and upgrade to a Phenom, but I don't guess that is going to happen.

Instead of buying an older AMD CPU at a discounted rate, my only real option now is to switch to Intel.

As old as the 955 is, it should cost way less than $100.

http://www.newegg.com/Product/Product.aspx?Item=N82E16819103808

Instead of buying a 955, I can spend a few dollars more and get an i3.


AMD probably wasn't making money on those old chips anyway.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Someone needs to devise a reliable way to test BF3 in multiplayer. Maybe joining a 64 player Op Metro server and sitting and throwing grenades by the rightmost stairwell for 10 mins per CPU would give pretty consistent results.

I found a great way to test, between my E8200 and my i3 2100.

Run MSI Afterburner with GPU usage displayed, and roll out of the Russian base on Caspian Border in a tank, out the mouth of the base on the main dirt road. If you've got a CPU bottleneck, it will show as you pull out of there.

My old E8200 @ 3.2GHz would drop as low as 80% GPU usage and into the low 30s fps-wise sometimes, while the i3 2100 will go to about 95% usage; in both cases the framerate would be low. I also found gunning a chopper to be a good second test of how reliable the CPU is.

64 player Metro is another example, and I found my i3 2100 to be pretty lackluster there on some occasions, as that map is loaded with plenty of explosions.
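The heuristic in that test (GPU usage dipping well below 100% while the framerate is also low = CPU bottleneck) can be checked mechanically if you log per-second samples, e.g. via Afterburner's logging. A rough sketch, with arbitrary thresholds and made-up sample data:

```python
def cpu_bottleneck_ratio(gpu_usage, fps, gpu_thresh=90, fps_thresh=40):
    """Fraction of samples where the GPU sits below gpu_thresh% usage
    while the framerate is also below fps_thresh -- the classic sign
    that the CPU, not the GPU, is the limiter."""
    flagged = sum(1 for g, f in zip(gpu_usage, fps)
                  if g < gpu_thresh and f < fps_thresh)
    return flagged / len(fps)

# E8200-style trace: usage dips into the 80s while fps drops to the 30s
print(cpu_bottleneck_ratio([95, 82, 80, 85, 96], [55, 33, 31, 36, 58]))  # 0.6
```

The thresholds and sample values are illustrative only; on a real log you'd tune them per game and resolution.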
 

janas19

Platinum Member
Nov 10, 2011
2,313
1
0
I found a great way to test, between my E8200 and my i3 2100.

Run MSI Afterburner with GPU usage displayed, and roll out of the Russian base on Caspian Border in a tank, out the mouth of the base on the main dirt road. If you've got a CPU bottleneck, it will show as you pull out of there.

My old E8200 @ 3.2GHz would drop as low as 80% GPU usage and into the low 30s fps-wise sometimes, while the i3 2100 will go to about 95% usage; in both cases the framerate would be low. I also found gunning a chopper to be a good second test of how reliable the CPU is.

64 player Metro is another example, and I found my i3 2100 to be pretty lackluster there on some occasions, as that map is loaded with plenty of explosions.

Not sure I'm understanding this correctly. You're saying the E8200 at 3.2GHz outperforms the i3-2100 in this BF3 bench?
 

ThatsABigOne

Diamond Member
Nov 8, 2010
4,422
23
81
Not sure I'm understanding this correctly. You're saying the E8200 at 3.2GHz outperforms the i3-2100 in this BF3 bench?

From what I read, his GPU usage with the i3 only drops to about 95%, so he is getting better performance from the i3.
 

infoiltrator

Senior member
Feb 9, 2011
704
0
0
So far as I can tell, the E8200 is still more expensive used than a new i3 2100.
Let's keep that reality in mind.
The i5 2400 appears to be the sweet spot, the i3 2100/2120 the budget kings.
I've been looking to keep AMD CPU purchases under $80, not enough
 

justin4pack

Senior member
Jan 21, 2012
521
6
81
I'm using an E8500 and it runs BF3 like a champ. I run around 50 fps steady with a 5870. A little lag at the start of a match, but that's it. That is on high, 64p maps.
 

beginner99

Diamond Member
Jun 2, 2009
5,228
1,603
136
I'm glad to see the SC2 benchmarks. I thought that my graphics card (5770) was the reason for the inconsistent frame rate, but the results for the AMD Phenom II X6 1090T pretty much bear out what I'm seeing here, though I have to check whether I'm using the "extreme" setting.

The odd thing though is that I'm pretty sure I've checked CPU usage during a game of SC2, and the core usage was low enough that a second game of SC2 might be possible at the same time :)

- edit - just checked my SC2 settings, pretty much 'Extreme' except I disabled "indirect shadows" or something like that, which apparently only affects cut scenes.

This is why AMD's "more corez" approach is misguided. SC2, AFAIK, is limited to 2 threads. 4 of the cores in your X6 will be useless, but if the 2 used ones can't keep up -> bad frame rate.

Honestly, these results are as expected. An 8-core AMD FX loses to a dual core $80 Pentium budget CPU... It is just sad. On the desktop side there is just not a single AMD CPU that can be recommended for any task. APUs have the downside that you can't individually upgrade one part (CPU or GPU).
A G620 + 6570 costs about the same as an A8 APU but with better gaming performance (note: the only reason to get an APU is gaming, right? Otherwise Intel graphics would be enough). And later you can upgrade your GPU if desired.

Of course this is a bit different on the notebook side, and AMD might have a niche there, but a small one IMHO.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
My hat's off to Tom and the gang. AMD's pricing structure on BD chips is horrendous, and it's time this gets addressed; they're insane for charging what they are right now. From process tech to performance across the board, BD is a value CPU, and I think it was lame of AMD to price these things without considering the lacking process quality and single-threaded performance they're so behind in. The PSU requirement for running BD in an SLI/CrossFire config is ridiculous if you plan to OC, just absolutely off the charts ridiculous. You'd be foolish to run it with anything less than a 1000 watt unit.

AMD needs to pull a rabbit out of a hat if it wants to charge these kinds of prices, as none of their top end CPUs should be over $150/$160 right now if you take current market standards into consideration. There's just no compelling reason to own these chips.
 

monkeydelmagico

Diamond Member
Nov 16, 2011
3,961
145
106
So far as I can tell, the E8200 is still more expensive used than a new i3 2100.
Let's keep that reality in mind.
The i5 2400 appears to be the sweet spot, the i3 2100/2120 the budget kings.
I've been looking to keep AMD CPU purchases under $80, not enough

Absolutely agree with everything you just said.

An FX-4100 for $60-$70 is the right price and might bring the low budget gaming chip crown back into the AMD camp.
 

ed29a

Senior member
Mar 15, 2011
212
0
0
I agree 100% that is the ideal, but sadly is not always the practice. I want my CPU to be fast in a game, regardless if the dev was good or lazy. :)

Yup, people who work 14 hours a day, often 6 days a week, are 'lazy'. People whining about the lack of truly multithreaded games need to realize a few things:

(1) It's extremely hard to write good MT code.
(2) Most games are console ports; no need for MT given current processor power.
(3) When your publisher is breathing down your neck, you don't have time to fiddle with multithreading, you need to ship ... yesterday.
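To illustrate point (1): even the textbook shared-counter case silently loses data without synchronization. A minimal Python sketch, purely illustrative and not taken from any game engine:

```python
import threading

counter = 0
lock = threading.Lock()

def work(n):
    global counter
    for _ in range(n):
        # Without the lock, "counter += 1" is a read-modify-write that
        # two threads can interleave, silently dropping increments --
        # exactly the kind of data race that makes MT code hard.
        with lock:
            counter += 1

threads = [threading.Thread(target=work, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 200000, guaranteed only because of the lock
```

Scale that problem up to a renderer, physics, AI, and audio all touching shared state, and the per-title cost of "just use more cores" becomes clearer.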
 
Last edited:
Aug 11, 2008
10,451
642
126
No need to beat up on RavenSEAL. I have 2 i5-2500k rigs and an AMD 1100T rig. No doubt the SB chips are faster but the 1100T is hardly a slouch.

Well, when someone presents an argument that uses only a very hand-picked situation to try to prove a point which goes against the generally accepted facts, uses an invalid comparison to an old CPU which is bested by a newer CPU at 1/4 the price, and blames AMD's failures on the game programmers, I think it is fair to criticize them.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
My hat's off to Tom and the gang. AMD's pricing structure on BD chips is horrendous, and it's time this gets addressed; they're insane for charging what they are right now. From process tech to performance across the board, BD is a value CPU, and I think it was lame of AMD to price these things without considering the lacking process quality and single-threaded performance they're so behind in. The PSU requirement for running BD in an SLI/CrossFire config is ridiculous if you plan to OC, just absolutely off the charts ridiculous. You'd be foolish to run it with anything less than a 1000 watt unit.

AMD needs to pull a rabbit out of a hat if it wants to charge these kinds of prices, as none of their top end CPUs should be over $150/$160 right now if you take current market standards into consideration. There's just no compelling reason to own these chips.

Agreed!
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
It really is kind of sad that the $80 G630 beats all the Bulldozer chips...

You think this is bad? Wait till you see the 2.4GHz BD Phenom X8s. It's almost as if AMD feels the FX line wasn't humiliated enough already, so they plan to do the same to their Phenom line.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
The PSU requirement for running BD in an SLI/CrossFire config is ridiculous if you plan to OC, just absolutely off the charts ridiculous. You'd be foolish to run it with anything less than a 1000 watt unit.

I have a FX8150 @ 4.7GHz with CrossFire 2x HD6950 @ 889MHz with a ThermalTake 730W 80+ PSU.

Exaggerating much, are we?

[attached screenshot: projectfxcf1.jpg]
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
It really is kind of sad that the $80 G630 beats all the Bulldozer chips...


OK, this is incorrect. I just ran into a big issue with my Celeron G530 last night while I was backing up my Steam games. It crawled almost to a stop while I was running the backup. If I opened another program, it caused Steam to freeze because both cores were stressed to 100%. Single-threaded performance may be better, but as soon as you start running multiple applications on the SB Pentiums/Celerons, they will start crawling! And I'm even using an SSD for my OS, too.


EDIT: As soon as it becomes standard for all games to be written for quad cores, we will be watching all the dual cores choke.