[HardwareHaven] New Review shows FX-8150 beating i7 2600k at gaming

Page 6 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
[Image: PerfSummary-11.jpg]

[Image: PerfSummary-21.jpg]

:thumbsup: Great summary.

Now just imagine that, more likely than not, a GTX 670/680 and HD 7950/7970 will be as fast as HD 6870 CF. So basically 12 months from now, BD is going to be a poor choice for gaming against a 3-year-old i7-920 @ 3.8ghz... never mind against a $220 version of Ivy Bridge with 5.0ghz+ overclocking potential, a 5%+ IPC increase, and even less power consumption than the 2500k! AMD will need to cut prices by $100, to $179.99 for the FX-8150, when a $220 Ivy launches.

Anyone ready for a laugh?

http://www.microcenter.com/storefron.../fx_index.html

FX-6100 = $199.99
FX-8120 = $239.99
FX-8150 = $279.99

vs.

i5-2500k = $179.99

Vote AMD: 200W+ higher power consumption at 4.8ghz vs. a 4.7ghz i5-2500k, slower performance, and a $100 higher price.

Retail pricing FTW! There is only one conclusion: the tin can is worth $100 more than the cardboard box of the 2500k.
 
Last edited:
Feb 19, 2009
10,457
10
76
Nope, here's a gamer right here, his name is Maximilian... I've had my i7 for nearly 2 years now; the architecture and model itself is ~3 years old. Many gamers are still on their Q6600s and Q9xxxs, and why not? These CPUs were very good in their time and still hold up well today. Not to sound like I'm bashing AMD here, but there's almost no one with a Phenom I CPU, because it wouldn't hold up well today at all for gaming. The Bulldozers of today will be a similar story in 2+ years.

Look at people's sigs all over the forums; there are plenty of people still on Core 2s and first-generation i7s. Gamers mostly upgrade when they need to upgrade.

Phenom IIs perform better than C2D and C2Q in most games and have been cheaper since their release years ago.

The point is that gaming moves very slowly in terms of pushing the CPU boundary.

We aren't even close to pushing the limits of DX11 in games, and even a 6990 or GTX 590 struggles with one monitor. Think it's going to be any different in the future? Suddenly new next-gen PC games just won't improve on graphics and the focus will shift to the CPU bottleneck?
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
The whole point of doing a CPU review is to show its strengths in a situation where the CPU's IPC makes a difference.

If one CPU scores higher at low res, it will scale higher with more cards and more res.

Do you realize benching at low res and low detail doesn't make a difference, right? Unless you like gaming at that awful detail, full of jagged lines.

No, it will just show you the maximum rate at which the CPU can process frames before it gets bottlenecked by the GPU.

And I do realize that BD needs a more powerful Turbo Core, around 4.5GHz.
 

felang

Senior member
Feb 17, 2007
594
1
81
I appreciate your charts, and look forward to the 580 SLI etc. I like how many games/benchmarks you are running, thanks for that.

I think what max is saying is that given two choices within a price band (AMD 8150 versus 2500K), if both are GPU limited at 1080p+ with high AA (which they are), how does one, with gaming as a primary need, decide which is the better value? I would say:

a) More powerful in gaming - tested by removing GPU bottleneck (lower resolutions) - somewhat useful as a predictor of future gaming performance when faster GPUs come out
b) Performance/watt and power use/heat considerations
c) Overclocking (if applicable to user)

Based on these, it's really hard to choose Bulldozer over the 2500K, especially since the 2500K is significantly cheaper on sale. For a gaming rig, I do not understand why anyone would consider buying Bulldozer unless they were upgrading from some really old/slow Phenom II but somehow already had an AM3/AM3+ mobo.

Building a new system as a gamer, it's a no-brainer to go the Intel route.

+1 Very well said!
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Bulldozer is not a CPU a gamer should consider. Fact.

I have no objection to the real-situation testing apoppin is doing, but the fact that he is touting Bulldozer as a gaming CPU and "beating the 920" is outright wrong. Bulldozer does not even beat the stock 920, a 3-year-old Nehalem, as shown in AnandTech's very own benchmarks, and it gets worse when compared to its direct competitor today, Sandy Bridge.

So what is your criteria for "a gamer CPU"?


So you're basically saying that winning at low res and awful detail is more important than higher resolution and breathtaking detail? :colbert: Hmm, are you actually a gamer?
 

jacktesterson

Diamond Member
Sep 28, 2001
5,493
3
81
Do these tests at stock, folks!!!!

Then do O/C for fun.

And the i7 920 comparison... why is it O/C'd to 4.2GHz while the 8150 is stock in your review?
 

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
Phenom IIs perform better than C2D and C2Q in most games and have been cheaper since their release years ago.

The point is that gaming moves very slowly in terms of pushing the CPU boundary.

We aren't even close to pushing the limits of DX11 in games, and even a 6990 or GTX 590 struggles with one monitor. Think it's going to be any different in the future? Suddenly new next-gen PC games just won't improve on graphics and the focus will shift to the CPU bottleneck?

Phenom II isn't really a factor; it came out years after the first C2Q arrived on the scene, and it was priced reasonably competitively. Phenom II offered decent gaming performance from December 2008 onwards; Bulldozer offers similar performance in games from October 2011 onwards. Today's implementation of Bulldozer will not last anywhere near as long as Nehalem or Sandy Bridge for a gamer; it will be bottlenecked and become obsolete before even the 3-year-old Nehalem becomes a bottleneck.

This is why CPU testing is done at low settings, to simulate how well the CPU will do in the future with better GPUs when the GPU does not hold the CPU back.
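The bottleneck logic described above can be sketched as a toy model (all fps numbers below are hypothetical, chosen only to illustrate the point):

```python
# Toy model of a GPU-limited benchmark (all numbers hypothetical).
# The observed frame rate is capped by whichever component is slower.

def observed_fps(cpu_cap, gpu_cap):
    """Frame rate is limited by the slower of the two components."""
    return min(cpu_cap, gpu_cap)

cpu_fast, cpu_slow = 120, 80      # hypothetical CPU frame caps (fps)
gpu_today, gpu_future = 60, 150   # hypothetical GPU caps: high settings today vs. a faster future card

# At GPU-limited settings today, both CPUs look identical:
print(observed_fps(cpu_fast, gpu_today), observed_fps(cpu_slow, gpu_today))    # 60 60

# Lower the resolution (or wait for a faster GPU) and the CPU gap appears:
print(observed_fps(cpu_fast, gpu_future), observed_fps(cpu_slow, gpu_future))  # 120 80
```

That is the whole argument: low-res testing just raises the GPU cap out of the way so the CPU caps become visible.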

Bulldozer offers similar performance to phenom II in games.

Bulldozer sucks more power.

Bulldozer costs more.

Not saying it won't get better with future implementations, but what we have today is not a good gaming CPU and will not offer the longevity that Sandy Bridge can offer for less money.
 

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
So what is your criteria for "a gamer CPU"?


So you're basically saying that winning at low res and awful detail is more important than higher resolution and breathtaking detail? :colbert: Hmm, are you actually a gamer?

Read the post above. Learn the meaning of a GPU-limited benchmark.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Why are you comparing a 5.2GHz 2600k vs. a 4.76GHz 8150? Duhh, we already know the 2600k has better IPC.

Those % gaps are massive. Even if performance were perfectly linear with clock speed (it isn't), lowering the 2600k clock to 4.6ghz for the gaming benches still shows a massive disparity between BD and SB when not GPU limited. This means that as time goes by, BD (today's models at least; hopefully future ones are dramatically improved) will suffer much faster.

Btw, just multiply the 2600k results by 0.88 and you'll get 4.6ghz results, though in the real world it would probably be closer to 0.90, given that scaling isn't 100% with clock speed.
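The back-of-envelope estimate above works out like this (the clock speeds are from the post; the benchmark fps figure is a hypothetical example):

```python
# Downclock estimate: the ".88" is just the ratio of the two clocks,
# and real-world results land between that ideal and the ~0.90 figure,
# since scaling with clock speed is less than linear.

clock_tested, clock_target = 5.2, 4.6      # GHz, from the post
ideal_scale = clock_target / clock_tested  # ~0.8846, the ".88" in the post

fps_at_5p2 = 100.0                         # hypothetical 2600k benchmark result
low_estimate = fps_at_5p2 * ideal_scale    # perfectly linear scaling (lower bound)
high_estimate = fps_at_5p2 * 0.90          # real-world estimate from the post

print(f"{low_estimate:.1f} - {high_estimate:.1f} fps")  # 88.5 - 90.0 fps
```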

FWIW, I agree; using common air overclocks would have been preferable to me. Perhaps both were the max stable OC under water? 4.6ghz is cake for a 2600k, but I've never seen one at or over 5ghz without more voltage and cooling than makes sense to me. I'm a cheap guy, so if it needs more than a cheap tower cooler with a 120mm fan and near-stock volts, it doesn't really interest me that much.
 
Last edited:

keto

Junior Member
Sep 20, 2011
3
0
0
Those % gaps are massive. Even if performance was perfectly linear with clockspeed (it isn't), lowering the 2600k clock to 4.6ghz for the gaming benches shows massive disparity between BD and SB when not GPU limited. This definitely means that as time goes by the BD (today's models at least, hopefully future ones are dramatically improved) will suffer much faster.

Btw, just multiply the 2600k results by .88, and you'll get 4.6ghz results, though in the real world it would probably be closer to .90 given that scaling isn't 100% with clockspeed.

FWIW, I agree, using common air overclocks would have been preferable to me, perhaps both were max stable oc under water? 4.6ghz is cake for 2600k, I've never seen one at or over 5ghz without more voltage and cooling than make sense to me though. I'm a cheap guy, so if it needs more than a cheap tower cooler w/120mm fan and near-stock volts, it doesn't really interest me that much.

Yes. Some people want comparisons with current games at gaming settings; well, you need to get into SLI/Crossfire to reach past the single-GPU bottleneck. There you have it: in SLI you can see where the CPU becomes the holdup. This doesn't seem a difficult concept to grasp, unless I am the thick one missing the other side of the argument.
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
6,283
5
81
Three cards in Crossfire, and the 8150 now shows as the bottleneck. It can't even get over 100fps in Mafia 2.

Wow, I'd be pissed if I paid $1000 for 3 video cards only to see that my brand new $290 CPU is the bottleneck!! Not to mention how big a power supply I'd need to power 3 cards and an overclocked FX-8150. :D:D:D
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
How many "gamers" are going to keep their CPUs for 2+ years?

In 2+ years, "gamers" would probably have upgraded the CPU anyway, wouldn't they? And if someone doesn't upgrade their CPU that often, they probably wouldn't be buying the absolute high-end cards anyway, IMO, rendering them GPU limited.

My i7 920 @ 4.0 GHz is running fine in its 3rd year.
Before that I sported a Q6600.
It will run fine until "Haswell".
I can't say the same for my GPU... started with a GTX 285, rocking a GTX 580 now.

As a gamer I replace my CPU once every 3-5 years... but my GPU gets replaced a lot more often.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Phenom IIs perform better than C2D and C2Q in most games and are cheaper since release years ago.

I see this being thrown around on many occasions.

1) Why do people still keep bringing up Core 2 Duo vs. Phenom II? Can you please tell me? I bought my E6400 @ 3.4ghz in fall 2006 for $200 or so. In other news, a Pentium 3 loses to an Athlon 64. The year is almost 2012, FFS! Oh look, a 2011 Phenom II X4 960T is faster than an E8400... wow.

Also, hardly any serious gamer who plays modern games is using a Core 2 Duo with a modern GPU like a 6950/6970/6990/GTX560Ti/570/GTX470/480/580, etc. And if they are, they are losing a TON of performance in modern games. C2D should not even be discussed since it was obsolete when GTAIV, BF2, Civilization 5, F1 2010, Starcraft 2, Arma series and Resident Evil 5 arrived.

2) Phenom II is not better than Core 2 Quad in games. The reason you see Q8300/Q9400/Q6600/Q6700/Q9550 getting spanked is because modern reviews run them at stock speeds. But I don't know anyone who bought those CPUs and didn't overclock.

What happens when you compare a 3.7ghz Phenom II X4 vs. a 3.4-3.5ghz Q6600, a 3.6ghz Q8300, or a 3.8ghz Q9550? Oh yeah... about that:

http://www.xbitlabs.com/articles/cpu/display/phenom-ii-x4-920-overclocking_8.html#sect0

[Benchmark charts: crysis.png, fc2.png, ut3.png, wic.png, l4d.png]


The point is that gaming moves very slowly in terms of pushing the CPU boundary.

Depends on what games you play. A dual-core CPU for BF2/BF3, or older architectures for WoW or Starcraft 2? Forget about it.

We aren't even close to pushing the limits of DX11 in games, and even a 6990 or GTX 590 struggles with one monitor. Think it's going to be any different in the future? Suddenly new next-gen PC games just won't improve on graphics and the focus will shift to the CPU bottleneck?

Future games will be even more GPU and CPU limited (of course with more emphasis on the GPU side if we get gobs of tessellation and depth-of-field effects in 2012).

3 Things on CPUs:

1) Physics and AI in games will become more complex, placing even more emphasis on CPU in the next 2-3 years (and of course on the GPU).

2) Diablo 3, 2 Starcraft expansions, millions of people playing WOW. Should all these users ignore the massive advantage Core i5/i7 CPUs have in these games?

3) Minimum frames in modern games:

Gothic 4
2500k = 42
FX-8150 = 28

Civilization 5
2500k = 36.3
FX-8150 = 25.6

F1 2010
2500k = 56
FX-8150 = 48

Resident Evil 5
2500k = 131
FX-8150 = 98

Two Worlds
2500k = 82
FX-8150 = 52
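Those minimum-fps gaps work out to substantial percentage deficits (figures copied from the list above):

```python
# Minimum-fps figures from the post: (2500k, FX-8150) per game.
min_fps = {
    "Gothic 4":        (42, 28),
    "Civilization 5":  (36.3, 25.6),
    "F1 2010":         (56, 48),
    "Resident Evil 5": (131, 98),
    "Two Worlds":      (82, 52),
}

for game, (i5, fx) in min_fps.items():
    deficit = (1 - fx / i5) * 100  # percent slower than the 2500k at minimum fps
    print(f"{game}: FX-8150 is {deficit:.0f}% behind")
```

The deficits range from about 14% (F1 2010) to the mid-30s (Gothic 4, Two Worlds), which is the point being made about minimum frame rates.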

So let me ask you, why would I pay $230 for FX-8120 and $280 for FX-8150 when both are slower and consume 2x the power?

Why would I pay those prices when SB gains even more performance when overclocked, while BD's power consumption grows another 200W in overclocked states... :whistle:

Bulldozer 8-core chips make no sense, esp. in light of a $170 X6 1090T and $220 2500k. Price needs to come down $100 for the FX-8150 model.
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
2.7% increase.


really?



Stop defending it - we're talking about Zambezi, not a Bulldozer+ or Piledriver.

No one is arguing against my points, from all I can see.

They sidestep it by bringing the GPU into it.



Argue against the fact that its pure horsepower is weak as hell, instead of "OH WELL IT DOESN'T MATTER ANYWAY, GPUS CONTROL ALL ANYWAY!".


In real life 99% won't feel much difference, but all of you fanboys who can't accept that it didn't even remotely deliver competition, or anything near what AMD promised - you just humiliate yourselves further.

Even if future iterations fix it, that won't save the current one.
Period.

Whoa, calm down there.

Fanboy? Well, I'll be getting a 2500k/2600k in a month or so. I've already given back the 990FX board for the Z68. I'm using Intel right now, so don't call me a fanboy. I'm just saying that it has a little potential, even if it isn't much.

I'm primarily a gamer, so an FX-8150 is not very appealing to me.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why are you comparing a 5.2GHz 2600k vs. a 4.76GHz 8150? Duhh, we already know the 2600k has better IPC.

Current pricing on the FX-8150 is $280 vs. $315 (Newegg pricing) for the 2600k. At these prices they are direct competitors, and the FX-8150 looks like a joke in comparison. If it cost $180-190, then we could have a discussion. AMD is out to lunch pricing it higher than a 2500k, and $100 more than a 1090T.
 

Piano Man

Diamond Member
Feb 5, 2000
3,370
0
76
Comparing CPU performance at 2560x1600 with all the goodies turned on is like comparing car performance during rush-hour traffic in Los Angeles. Sure, it may be my real-world experience, but it's a shit test for determining which car is faster.
 

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
Comparing CPU performance at 2560x1600 with all the goodies turned on is like comparing car performance during rush-hour traffic in Los Angeles. Sure, it may be my real-world experience, but it's a shit test for determining which car is faster.

lol good analogy :thumbsup:
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
It's not a ridiculous internet debate without a car analogy. That is a Fact.

Imo, there are actually good reasons why this happens. What we do in these forums is similar to bench racing.
1. To discuss the possible quarter-mile elapsed time (E.T.) of a car based on a list of modifications or a horsepower estimate.
2. To discuss the estimated output (in horsepower) of one engine versus another based on lists of modifications done to each engine.
3. To discuss "which is faster?" or "Which would win in a race?" between two cars, based on 1 and 2 above.
lol, maybe it's just me :)
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
Do you realize benching at low res and low detail doesn't make a difference, right? Unless you like gaming at that awful detail, full of jagged lines.

No, it will just show you the maximum rate at which the CPU can process frames before it gets bottlenecked by the GPU.

And I do realize that BD needs a more powerful Turbo Core, around 4.5GHz.

You do realise that we are looking at a CPU review and not a GPU review, right?
 

blckgrffn

Diamond Member
May 1, 2003
9,130
3,071
136
www.teamjuchems.com
Imo, there are actually good reasons why this happens. What we do in these forums is similar to bench racing.

lol, maybe it's just me :)

It's also one of the laughable consistencies of the topic at hand. Car analogies always fail because the same people who want to discuss cache latency either know way too much about cars and start attacking the analogy or the analogy is just plain wrong and hilarious.

"Particle's First Rule of Online Technical Discussion:
As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

Rule 1A:
Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

Rule 2:
When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

Rule 2A:
When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

Rule 3:
When it comes to computer news, 70% of Internet rumors are outright fabricated, 20% are inaccurate enough to simply be discarded, and about 10% are based in reality. Grains of salt--become familiar with them.

Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

Random Tip o' the Whatever
You just can't win. If your product offers feature A instead of B, people will moan how A is stupid and it didn't offer B. If your product offers B instead of A, they'll likewise complain and rant about how anyone's retarded cousin could figure out A is what the market wants."

Ripped shamelessly from this guy's sig:

http://www.xtremesystems.org/forums/member.php?80341-Particle

And so applicable to our current situation.
 
Last edited: