Considering a CPU upgrade... is it worth it?

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
I think the cheaper upgrade route, if you don't play too many games and do more multitasking, would be a PII X4 or X6; you can pick up a used X6 on eBay for $125 or so. That's a decent option if you don't game much.

But if you're going to move to Intel in the near future and just want something to hold you over, I would get a used PII X4 for about 80 bucks on eBay; that will boost your daily performance immensely, and probably help a bit in games too.

The 2nd option of dumping a used x4 into the system for 80 bux would be my personal choice in your situation. hope this helps.
 

chucky2

Lifer
Dec 9, 1999
10,018
37
91
OP, can your BIOS unlock a Phenom II X2 into an X4? If it can I'll sell you a Phenom II X2 550 cheap. That should hold you till Haswell (which should be what you're trying to hold out for).

Chuck
 

Hatisherrif

Senior member
May 10, 2009
226
0
0
Gotcha. Because raising the resolution will improve your framerate. Makes perfect sense.


EDIT: A PII 965 is about as fast per clock in games as a Core2 Quad Q6600.

Core 2 Quad was released in the fall of 2006 and was two Core 2 Duo dies on the same package. I have one in my 2nd rig, so I know exactly how limiting it is in games.

It certainly won't change it on the Phenom II, making all further discussion pointless. You are mixing things up here: I never said that any system will perform better at a higher resolution, but given that the CPU is the bottleneck in this situation, at a higher resolution that will no longer be the case.

"Per-clock" is an invalid argument when talking about Intel Core2 Quads, because Anand has proven long ago that even its highest end members have problems with "microstutter" and weird FPS dips for some unknown reason. The article I am referring to is this one, and it is especially true for Crossfire (but I have noticed it on single cards in my own experience):

We are back today with a quick update to an article we did a few weeks ago. That article addressed readership questions about how well the Phenom II X4 940 performed against a similar Intel Core 2 Quad (Q9550 in this case) with a multi-GPU setup. It was an interesting request and one that we enjoyed answering. Without repeating the entire article, we discovered the X4 940 was every bit a match for the Q9550 in the majority of our multi-GPU game tests. The one exception was Far Cry 2, but that title just favors Intel’s processors, especially the i7 series.

- http://www.anandtech.com/show/2740

I cannot really find the other review at the moment, the one with the 940, but you get the idea.


Plus, it is very disrespectful that people these days do not appreciate the Phenom II 955's success when it came out. Anand concludes:

"Compared to the Core 2 Quad Q9550 the new X4 955 generally comes out ahead. From a longevity standpoint, the AM3 platform is much wiser to invest in than LGA-775."

The Phenom II certainly was a better all-round CPU in its time, and comparing it to some very old Intel chips is absurd (and mostly done by people who have never owned one). Yes, let's face it, Bulldozer was not a blissful creation by any means, but what everybody is trying to do today is make it look like AMD never had competitive products after 2006. Admittedly, the Phenom II could not match Intel at the enthusiast end, but had it come out a little bit sooner, it would have bested the Core 2 Quads. If only the i7s hadn't shown up...

So, even if you have a Core2 Quad and it is limiting you, you cannot know that the Phenom II would be the same because there are just so many factors to consider. In fact, I believe it would do even better.

Glad I could clear some things up.
 

chucky2

Lifer
Dec 9, 1999
10,018
37
91
AM2 actually has even longer legs. You can drop AM3 chips in there that aren't too shabby. As a platform, AM2 gave its users a really long run courtesy of AMD equipping the AM2+ and AM3 CPUs with a DDR2 memory controller. Pretty nice of them...

Chuck
 

jcwagers

Golden Member
Dec 25, 2000
1,150
14
81
OP, can your BIOS unlock a Phenom II X2 into an X4? If it can I'll sell you a Phenom II X2 550 cheap. That should hold you till Haswell (which should be what you're trying to hold out for).

Chuck

I'm afraid it won't but I appreciate the offer. :) To all of you who have replied, I just wanted to say thanks for taking the time to check out this thread and for giving your opinions. :)

-jc
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
It certainly won't change it on the Phenom II, making all further discussion pointless. You are mixing things up here: I never said that any system will perform better at a higher resolution, but given that the CPU is the bottleneck in this situation, at a higher resolution that will no longer be the case.

"Per-clock" is an invalid argument when talking about Intel Core2 Quads, because Anand has proven long ago that even its highest end members have problems with "microstutter" and weird FPS dips for some unknown reason. The article I am referring to is this one, and it is especially true for Crossfire (but I have noticed it on single cards in my own experience):

We are back today with a quick update to an article we did a few weeks ago. That article addressed readership questions about how well the Phenom II X4 940 performed against a similar Intel Core 2 Quad (Q9550 in this case) with a multi-GPU setup. It was an interesting request and one that we enjoyed answering. Without repeating the entire article, we discovered the X4 940 was every bit a match for the Q9550 in the majority of our multi-GPU game tests. The one exception was Far Cry 2, but that title just favors Intel’s processors, especially the i7 series.

- http://www.anandtech.com/show/2740

I cannot really find the other review at the moment, the one with the 940, but you get the idea.


Plus, it is very disrespectful that people these days do not appreciate the Phenom II 955's success when it came out. Anand concludes:

"Compared to the Core 2 Quad Q9550 the new X4 955 generally comes out ahead. From a longevity standpoint, the AM3 platform is much wiser to invest in than LGA-775."

The Phenom II certainly was a better all-round CPU in its time, and comparing it to some very old Intel chips is absurd (and mostly done by people who have never owned one). Yes, let's face it, Bulldozer was not a blissful creation by any means, but what everybody is trying to do today is make it look like AMD never had competitive products after 2006. Admittedly, the Phenom II could not match Intel at the enthusiast end, but had it come out a little bit sooner, it would have bested the Core 2 Quads. If only the i7s hadn't shown up...

So, even if you have a Core2 Quad and it is limiting you, you cannot know that the Phenom II would be the same because there are just so many factors to consider. In fact, I believe it would do even better.

Glad I could clear some things up.

I didn't mean to give the impression that I thought the Ph II was inferior to a Core2Quad, but rather that it was comparable. I feel they're similar enough chips to warrant comparison. If I were given a choice between the two today I'd opt for the AMD chip without a second thought. Rather, what I was trying to suggest was that I do have basis for comparison as I have an Ivy Bridge CPU in my main rig and an older Core2Quad in a PC about 6ft away, with the same video card and amount of RAM.

When you made this statement...

Yeah, if you want to play on 1024x768, a Phenom II is terrible.

... what I thought you were implying was that 33fps minimum / 45 average from a CPU was fine, and they should just test at high enough resolutions and graphical settings to bring every system down to 33fps (which I find unacceptable).

Since we're trying to isolate CPU performance, resolution has zero relevance, as we're looking for a CPU that can deliver smooth gameplay. Regardless of what graphical settings you test at, a Phenom II simply isn't going to deliver higher frames, so testing at 1024 puts the video card out of the equation entirely.

They did in fact test at higher resolutions (if you check out the article on GameGPU), but in those tests they normalized the CPU across video cards (much like they normalized the video card across CPUs for the CPU test), so we could see relative video card performance without an uncontrolled variable making comparison meaningless.
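To put the "normalize one variable" idea in concrete terms, here's a toy model. All numbers below are made up for illustration; nothing here comes from the actual GameGPU benchmark:

```python
# Toy bottleneck model: the frame rate you actually see is capped by
# whichever component is slower. Numbers are hypothetical.

def delivered_fps(cpu_fps, gpu_fps):
    """Delivered fps is the minimum of what the CPU and the GPU
    could each sustain on their own."""
    return min(cpu_fps, gpu_fps)

# Hypothetical GPU throughput at two resolutions (fps the card could
# push if the CPU were infinitely fast):
gpu_caps = {"1024x768": 200, "1920x1080": 70}

cpu_cap = 45  # hypothetical CPU-limited fps; resolution doesn't change it

for res, gpu_cap in gpu_caps.items():
    print(res, delivered_fps(cpu_cap, gpu_cap))
# At 1024x768 the GPU cap (200) is far above the CPU cap, so the
# measured number is pure CPU performance; that is exactly why the
# CPU test drops the resolution.
```

In this sketch the 1024x768 result reveals the CPU ceiling, which is the whole point of testing there.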
 
Last edited:

Hatisherrif

Senior member
May 10, 2009
226
0
0
Sure, but you should then also test if the CPU even matters at higher resolutions. I do believe that the i5 would be much better than most CPUs in terms of minimum frame rates. But you must acknowledge that nobody today plays at 1024x768, so testing the game at that resolution hoping to get relevant data is not a good idea. It can serve to show that if provided with a better graphics card which can eliminate the "GPU bottleneck", the i5/i7 would be miles ahead of the Phenom, which would always lock at 33/45 or whatever it was.

My point was that when you want to get a graphics card which isn't that grand itself, lifting the CPU bottleneck doesn't mean much, as you will be limited by that card's performance on higher resolutions. That is a thing to think about. If you want to isolate CPU performance, it is clear what is what. But if you want to get a graphics card in the package with a CPU, you should only consider getting the CPU which will not bottleneck the specific graphics card you want to get, anything above that is overkill. Except for future proofing, of course, but that is another factor and issue to discuss.
 
Last edited:

Hubb1e

Senior member
Aug 25, 2011
396
0
71
Sure, but you should then also test if the CPU even matters at higher resolutions. I do believe that the i5 would be much better than most CPUs in terms of minimum frame rates. But you must acknowledge that nobody today plays at 1024x768, so testing the game at that resolution hoping to get relevant data is not a good idea. It can serve to show that if provided with a better graphics card which can eliminate the "GPU bottleneck", the i5/i7 would be miles ahead of the Phenom, which would always lock at 33/45 or whatever it was.

My point was that when you want to get a graphics card which isn't that grand itself, lifting the CPU bottleneck doesn't mean much, as you will be limited by that card's performance on higher resolutions. That is a thing to think about. If you want to isolate CPU performance, it is clear what is what. But if you want to get a graphics card in the package with a CPU, you should only consider getting the CPU which will not bottleneck the specific graphics card you want to get, anything above that is overkill. Except for future proofing, of course, but that is another factor and issue to discuss.

I'm sorry but your entire statement is completely wrong. Lowering the resolution removes the GPU so you can test the baseline framerate the CPU can do on its own. This is why they tested the CPUs the way they did.

With a GPU you can always lower graphics settings and resolution, so you are actually never GPU limited unless your GPU is completely incapable of playing at the lowest settings possible. You can create a GPU-limited scenario by increasing resolution and quality, but that was not the point of this particular graph. Everyone should always try for at least a 45fps average in a first person shooter; 60fps is preferred. And since you can adjust the graphics quality and resolution, you can always tailor the graphics to your graphics card's performance so you can reach at least 45fps, as long as you aren't already on low.

What the graph shows is that even when there is NO GPU bottleneck at all, the PhII can only muster 45fps average in this particular test. You could put in Quad Crossfire with 7970 ghz editions and you would never get above 45 fps average with the PhII. Still, compared to the OP's Athlon X2, the PhII is a major step up and worth the $100 or so in my opinion. I would suggest you find one used on eBay. They are selling a lot of them. Going Intel i5 at this point would also make sense, but only if the OP can swing the serious difference in cash that would require.
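A rough way to see why no amount of GPU power can lift that ceiling is frame-time arithmetic. This is just a sketch; the 45fps figure is the example from this discussion, not a measured number:

```python
# Frame-time arithmetic: fps is 1000 ms divided by the time it takes
# to produce one frame.

def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(45))  # ~22.2 ms of CPU work per frame
print(frame_time_ms(60))  # ~16.7 ms budget needed for 60 fps

# If the CPU alone needs ~22 ms per frame, a faster card only shrinks
# the GPU's share of each frame; the total can never drop below the
# CPU's ~22 ms, hence the ~45 fps ceiling no matter the video card.
```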
 
Last edited:

Hubb1e

Senior member
Aug 25, 2011
396
0
71
You must be joking
Of course you would get above 45fps.

On that map and under the exact conditions of the test in that graph, you would not average over 45fps even with quad CrossFire, since the game is CPU limited under a PhII.
 

Hatisherrif

Senior member
May 10, 2009
226
0
0
I'm sorry but your entire statement is completely wrong. Lowering the resolution removes the GPU so you can test the baseline framerate the CPU can do on its own. This is why they tested the CPUs the way they did.

With a GPU you can always lower graphics settings and resolution, so you are actually never GPU limited unless your GPU is completely incapable of playing at the lowest settings possible. You can create a GPU-limited scenario by increasing resolution and quality, but that was not the point of this particular graph. Everyone should always try for at least a 45fps average in a first person shooter; 60fps is preferred. And since you can adjust the graphics quality and resolution, you can always tailor the graphics to your graphics card's performance so you can reach at least 45fps, as long as you aren't already on low.

What the graph shows is that even when there is NO GPU bottleneck at all, the PhII can only muster 45fps average in this particular test. You could put in Quad Crossfire with 7970 ghz editions and you would never get above 45 fps average with the PhII. Still, compared to the OP's Athlon X2, the PhII is a major step up and worth the $100 or so in my opinion. I would suggest you find one used on eBay. They are selling a lot of them. Going Intel i5 at this point would also make sense, but only if the OP can swing the serious difference in cash that would require.

I think you should re-read my statement. When there is no GPU bottleneck, the Phenom averages 45. Okay, so that is the top. What if the graphics card cannot manage more than that? Is it worth buying a better CPU in that case? That is the matter I was trying to discuss.

Of course the i5 is better, is there anybody here denying that? I certainly am not. I'm just urging everybody to think about this. Components have to complement each other. If you go overkill on any of them you are going to have a bottleneck. A Phenom will bottleneck most graphics cards today, the only question is how much and in which situations. If you can answer all of these, then you can make a good decision on what you want to buy.

Personally, I would not get the Phenom II, but it depends on what you have and what you want to get. It will certainly be a major improvement over what the OP has, but any Intel CPU better than i3 (2nd gen & above) will just be better all round.
 

Hubb1e

Senior member
Aug 25, 2011
396
0
71
The graphics card can do 45fps. It all depends on what quality you want to set, and that is a separate issue from the CPU. If the OP is unhappy with the quality he can play at, then he needs to upgrade his GPU; but set the lowest resolution and the lowest graphics settings to see what his frames are, and that will tell him if he needs a new CPU. That's an easy test, and one that gets confused all the time because people muddy the matter with the question "are you GPU limited?" The answer to that is: lower your graphics settings and you are no longer GPU limited. Then it becomes a question of whether he is happy playing at lowered settings. But you can't normally lower CPU settings, except in those cases where they give you physics settings, such as Diablo III.
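That easy test boils down to a simple decision rule. Here's a sketch; the function name and the 45fps target are illustrative only, not from any real tool:

```python
# Hypothetical upgrade-decision helper based on the low-settings test
# described above. Names and thresholds are made up for illustration.

def upgrade_advice(fps_at_lowest_settings, fps_at_preferred_settings,
                   target_fps=45):
    # Step 1: drop resolution and quality to minimum. If fps is still
    # below target, the CPU itself can't keep up -> CPU upgrade.
    if fps_at_lowest_settings < target_fps:
        return "upgrade CPU"
    # Step 2: the CPU is fine; if preferred settings fall below target,
    # the GPU is the limiter -> GPU upgrade (or lower the settings).
    if fps_at_preferred_settings < target_fps:
        return "upgrade GPU (or lower settings)"
    return "no upgrade needed"

print(upgrade_advice(38, 30))  # CPU-bound even at minimum settings
print(upgrade_advice(90, 30))  # CPU fine, GPU limited at preferred settings
print(upgrade_advice(90, 70))  # both components keep up
```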
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Gosh Sandy pownz! I wonder how fast and powerful their Xeon line is.

Is Haswell a 6 core? Haswell should blow away Sandy and Ivy; that's 2 extra cores plus a sick OC to 5GHz. Add 64GB RAM and you're set.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Hatisherrif, up until a few days ago I was playing all of my games on a 3570K @ ~4.5ghz and a single HD4870 512mb.

Here's the thing - I can still get 60fps+ in BF3, even with my ancient video card. I just need to drop the graphics down enough to get my desired framerate (and the game still looks pretty decent). If I had crossfire HD7970's on a Phenom II, I'd be dropping into the 30's no matter what settings I changed.

In that case I would opt for an i5 and an (ancient) HD4870 over dual 7970s and a Phenom II (resale value aside), because it gives a better experience.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Food for thought, this was posted yesterday in the Video Cards section.

http://forums.anandtech.com/showthread.php?t=2270271

[Image: BF3 64-player CPU benchmark chart]


BF3 is definitely more demanding than most games out there, but it shows some clear performance tiering. Guild Wars 2 will show similar tiering, though it probably doesn't need nearly as much CPU as BF3.

dude what's this nonsense? u show us a chart of BF3 @ a resolution of 1024x768?? i mean seriously GTFO with that BS. put it @ 1080p+ resolution and see how everything just becomes completely the same.
 

bononos

Diamond Member
Aug 21, 2011
3,938
190
106
Interesting graph. Ain't that the game AMD fans swore ran better on FX than on an i5?

The benchmark looks a little different from others; the dual core + HT i3 does better than the FX in 64-player multiplayer.
 

Hubb1e

Senior member
Aug 25, 2011
396
0
71
The benchmark looks a little different from others; the dual core + HT i3 does better than the FX in 64-player multiplayer.

I was wondering the same thing. It's odd how the FX doesn't perform well in this bench. Every other benchmark for BF3 shows that it requires more cores. This is the new expansion pack so maybe they changed the way the resources are threaded. Dual core CPUs from AMD and Intel are much more competitive than they used to be. It actually looks like instead of making it easier on the dual cores, it brought the quad cores down a bit.

dude what's this nonsense? u show us a chart of BF3 @ a resolution of 1024x768?? i mean seriously GTFO with that BS. put it @ 1080p+ resolution and see how everything just becomes completely the same.

Thanks for providing me with my example of someone confused about CPU vs GPU bottlenecks. :\
 

Hatisherrif

Senior member
May 10, 2009
226
0
0
Nobody here is confused about bottlenecks; it is just that some are trying too hard to make it look like the CPU bottleneck will be so huge that it justifies getting a much more expensive system with a weaker graphics card. If your graphics card is not able to do more than a 45fps average, then you should consider getting a CPU that can do just about as much. There are many other factors, as I have already mentioned, like minimum framerates, which you should also consider.

@Yuriman I had a Phenom II with my current card (in sig), and I must say I didn't notice any slowdowns or sudden dips in framerates, only when the graphics card was the limiting factor (smoke, fire, etc.). Just face it: you cannot know how a specific CPU would perform in your situation if you don't test it, especially if you have never owned it.