Blind CPU test: 2700k vs 8150k


beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
There are many ways, besides obvious hardware differences other than the CPU, to reach such a result.

1) It's actually the truth, which is in contrast to all benchmarks.

2) The AMD people at the booth knew which system was AMD's and influenced the testers to vote for it. (This is also why new drugs are tested double-blind, contrary to what certain TV shows would have you believe.)

3) How was the voting done? Marketing is also about psychology. If it "was easier" to vote for AMD, then it will get more votes, because there probably is not much of a difference to be seen.

2 and 3 can be combined, of course.
 

ed29a

Senior member
Mar 15, 2011
212
0
0
I didn't say "see", I said notice. And you can clearly notice the difference between 30, 60 and 120fps in terms of smoothness.

If you don't 'see', how can you notice it? What senses are being used? Smell? Touch? Taste? Hearing? I am curious, honestly, how people can tell the difference between 60 and 120 fps when the eye and brain can't process anything north of 30 fps (and I am being generous).
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Did you actually read my post? I explained how.
It's about averaging and how it is done.
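
For what it's worth, here is a minimal sketch in Python of one way the averaging can mislead (the frame times are invented, not from any benchmark): the same run gives two very different "average fps" figures depending on whether you average the per-frame fps values or divide total frames by total time.

# Invented frame times (ms): alternating fast and slow frames.
frame_times_ms = [8.3, 33.3] * 50   # half the frames at ~120fps, half at ~30fps

# Averaging the per-frame fps values weights the fast frames too heavily...
per_frame_fps = [1000.0 / t for t in frame_times_ms]
naive_average = sum(per_frame_fps) / len(per_frame_fps)

# ...while frames divided by total time (the harmonic mean) matches what you actually watch.
true_average = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(f"naive average: {naive_average:.0f} fps")   # ~75 fps
print(f"true average:  {true_average:.0f} fps")    # ~48 fps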
 
Last edited:

anikhtos

Senior member
May 1, 2011
289
1
0
So how can you see any difference when most monitors are 60Hz, which is 60fps? Playing a game at 60+ frame rates is a waste of resources. There are many games that run above 60 frames on any given processor, so if you simply want to play those games, either processor is good.
Now, that does not make up for the fact that BD is a failure as a chip.
But it always depends on what you want to do with your PC. If you are going to max out CPU usage, then yes, you need the better CPU, for things like encoding or transcoding video, as an example.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
True, vsync is another thing. However, not everyone wants the input lag that comes with vsync. Then there are people with 120Hz displays. And finally, there can often be dips below 60fps that occur with CPU A but not with CPU B.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Your brain can't tell the difference. The brain 'memorizes' an image for about 1/15th of a second, then it moves to the next. So there is no way a human can 'see' 60 fps. Nothing to do with being blind, it's just that our own internal computer has limits.

That's rubbish and you know it is - do you play at 15fps? I can clearly see the difference between 60fps and 120fps, and I am not alone in this; it's normal.

ed29a - if you can't see more than 30fps then you have a problem with your vision.


Back on topic: all AMD did is a *fast enough* test. If that is the aim, then they could have used a very cheap processor for comparison, which would have also been *fast enough* and a lot cheaper than BD (including a number of Athlons). However, we buy CPUs to last - we don't want fast enough today, we want as fast as we can get so we don't have to upgrade for a while, and in that test BD fails.
 
Last edited:

wanderer27

Platinum Member
Aug 6, 2005
2,173
15
81
This test proves that modern CPU variants/brands/etc. are not as different in real-world situations as some review/benchmark sites claim. Too many people think that 10-20% is a huge difference in performance, but it's really not unless you're doing some mission-critical or high-cost task. It took the 2500K to convince me to upgrade from an Athlon X2, which was like a +250% performance difference or so. :)

Yeah, I'm just now dialing in my new components:
- 2700k
- P8Z68v-pro/gen3
- 16GB
- Win 7 64bit

It is faster and smoother than the system in my sig, but I'm not sure if it's from the new MB or the difference in OS (XP 32-bit vs Win7 64-bit).

I'm not sure I'm thrilled with the MB though, it was a freaking nightmare getting it to work (UEFI sucks) with most of my existing components.


 

Joseph F

Diamond Member
Jul 12, 2010
3,522
2
0
A patch to give Skyrim a ~30% performance boost, due to optimisation of code? Why? :p

Yeah, SkyBoost is one of those things you just gotta get for Skyrim.

What I was saying is that Bethesda should have done this themselves.
Apparently, the 1.4 patch is far faster than even SkyBoost.
So, they've redeemed themselves, IMO.
 

Joseph F

Diamond Member
Jul 12, 2010
3,522
2
0
Yeah, I'm just now dialing in my new components:
- 2700k
- P8Z68v-pro/gen3
- 16GB
- Win 7 64bit

It is faster and smoother than the system in my sig, but I'm not sure if it's from the new MB or the difference in OS (XP 32-bit vs Win7 64-bit).

I'm not sure I'm thrilled with the MB though, it was a freaking nightmare getting it to work (UEFI sucks) with most of my existing components.



I'm with you there. I have an MSI P67A-G43, and after a little while of "Wow, I get to use my mouse in my 'BIOS' :awe:", I wish that they had just gone with a proper, old-fashioned BIOS.
 

Puppies04

Diamond Member
Apr 25, 2011
5,909
17
76
Nah, I kid. But this is another example of AMD targeting the uneducated masses to make up for BD's failure. Why pay $ for a product whose price-equivalent competitor is far better?

From what I have seen, the difference between a 2700K and a 2500K in gaming can vary from small to almost nothing, so if you really want to test two CPUs, stick the 8150 up against the cheaper 2500K, not the more expensive 2700K. Then when Bulldozer loses there is no "ah, but the 2700K is a more expensive CPU" to fall back on.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
From what I have seen, the difference between a 2700K and a 2500K in gaming can vary from small to almost nothing, so if you really want to test two CPUs, stick the 8150 up against the cheaper 2500K, not the more expensive 2700K. Then when Bulldozer loses there is no "ah, but the 2700K is a more expensive CPU" to fall back on.

Actually at that res with a single card they might as well have used a G440.
 

lOl_lol_lOl

Member
Oct 7, 2011
150
0
0
From what I have seen, the difference between a 2700K and a 2500K in gaming can vary from small to almost nothing, so if you really want to test two CPUs, stick the 8150 up against the cheaper 2500K, not the more expensive 2700K. Then when Bulldozer loses there is no "ah, but the 2700K is a more expensive CPU" to fall back on.

Agreed. Intel has relatively little difference in performance across the model range gaming-wise. Just look at the 2100 ($120) -> 2500K ($220) -> 2700K ($320) compared to the FX-4100 -> FX-6100 -> FX-8150.

This consistency in the product line ensured we all had access to the best of SB.
 

gevorg

Diamond Member
Nov 3, 2004
5,070
1
0
First you claim something without hard facts. Then you nitpick at an example. It is difficult to lead a sensible discussion this way. And yes, the difference can be quite astonishing. Look at the FX-8150 multi-GPU review from HardOCP, for instance: a 50+% advantage for a 2500K at times. Now stop trolling please and lose the smiley - it is embarrassing.

Take it easy dude, you're the one who is embarrassing! Senseless accusation, invalid comparison, etc., all in one post. :confused:
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
It's one thing to do a subjective test for something like taste, but to do it for computers is further proof that Bulldozer is a failure. We can measure performance objectively and so we have all of these wonderful websites that do so.
 

Gigantopithecus

Diamond Member
Dec 14, 2004
7,664
0
71
This test proves that modern CPU variants/brands/etc. are not as different in real-world situations as some review/benchmark sites claim. Too many people think that 10-20% is a huge difference in performance, but it's really not unless you're doing some mission-critical or high-cost task. It took the 2500K to convince me to upgrade from an Athlon X2, which was like a +250% performance difference or so. :)

As usual gevorg hits the nail right on the head. Unless you are doing seriously computationally intensive work like editing HD video or crunching huge data sets, where CPU differences can add up to differences in time to completion of hours to days (to even weeks), you're simply not going to notice a difference between $200 and $1,000 CPUs. For 99% of users, a 2500K is enough. Hell, for 95% of users, a 2100 is enough.

That's rubbish and you know it is - do you play at 15fps? I can clearly see the difference between 60fps and 120fps, and I am not alone in this; it's normal.

ed29a - if you can't see more than 30fps then you have a problem with your vision.

Do you mean you can distinguish 60fps constant from 120fps constant? Or that you can distinguish 60fps on average from 120fps on average?

...As a physical anthropologist, I have done informal research (i.e. not publication grade but still useful) on variation in the ability of the brain to distinguish different frame rates. There is variation in sensitivity - some people can't tell 20fps from 30fps, while others can distinguish 40fps from 50fps. Few people can tell 50fps from 100fps. Constant. What most people can notice, however, are brief periods of time where frame rates drop - for whatever reason, if you intersperse brief moments of 20fps in mostly 60fps videos, the overall subjective effect is that it looks worse than watching a constant 30fps video.
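
To put rough numbers on that last point (the frame times below are invented, purely to show the arithmetic): a mostly-60fps run with a few 20fps dips still averages far higher than a constant 30fps run, yet its worst frames are longer than anything in the constant run - and those worst frames seem to be what people notice.

# Invented frame times (ms): mostly 60fps with brief 20fps dips vs constant 30fps.
dippy    = [16.7] * 290 + [50.0] * 10
constant = [33.3] * 300

def summarize(name, trace):
    avg_fps = len(trace) / (sum(trace) / 1000.0)
    print(f"{name}: {avg_fps:.0f} avg fps, longest frame {max(trace):.0f} ms")

summarize("60fps with dips", dippy)    # ~56 avg fps, 50 ms worst frame
summarize("constant 30fps", constant)  # ~30 avg fps, 33 ms worst frame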
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
No it doesn't prove that at all. This was one (1) game they tested and it was mainly GPU-bound due to eyefinity resolution. Had they made this "test" with 20 games of different genres, the conclusion would be more valid.

So are you saying that the Anandtech review of the FX-8150 was invalid? They only tested 9 games IIRC.


Edit:

I'm curious about all of you who say that performance is completely objective and that subjective differences should be ignored:

Would you be happier with a micro-stuttering dual GPU setup that makes 100 fps average, or a single faster GPU setup that offers 95 fps in the same situation?
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
More valid than this "test". And I don't read only one review before buying something, I read a dozen ;)
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
6,283
5
81
I believe Anand should do a recap review of the 8150 with all the MS patches and BIOS updates to see if there are any changes.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
As usual gevorg hits the nail right on the head. Unless you are doing seriously computationally intensive work like editing HD video or crunching huge data sets, where CPU differences can add up to differences in time to completion of hours to days (to even weeks), you're simply not going to notice a difference between $200 and $1,000 CPUs. For 99% of users, a 2500K is enough. Hell, for 95% of users, a 2100 is enough.



Do you mean you can distinguish 60fps constant from 120fps constant? Or that you can distinguish 60fps on average from 120fps on average?

...As a physical anthropologist, I have done informal research (i.e. not publication grade but still useful) on variation in the ability of the brain to distinguish different frame rates. There is variation in sensitivity - some people can't tell 20fps from 30fps, while others can distinguish 40fps from 50fps. Few people can tell 50fps from 100fps. Constant. What most people can notice, however, are brief periods of time where frame rates drop - for whatever reason, if you intersperse brief moments of 20fps in mostly 60fps videos, the overall subjective effect is that it looks worse than watching a constant 30fps video.

Fwiw, these are my observations and preferences :

Gaming on PC, I can tell a distinct difference between constant 60, 60 with drops, variable 30-60, constant 30, variable 60-120, and stable 120. Obviously it'd be awesome to be able to play at 1920x1080 @ max details at a constant 120, but practically, even on my decent 2500K PC, I go with 60/Vsync.

As for TVs, I can tell a huge difference between 60Hz and 120Hz sets. I actually like the 120Hz for things like Pixar films, but other than live sports, I find the effect kind of unsettling with normal content. I can't tell a difference at all with the 240Hz and higher sets.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
That's the point. Test the systems against each other without brands and numbers. People will NEVER notice if it's 40fps or 400fps if there is no counter in the corner. Anyone telling you the opposite is talking out of his ass.
Interestingly enough, since my friend thought like you, we organized a blind test. I was able to tell which framerate "bracket" a game was running at exactly one hundred percent of the time, using values of 45/60/75/90/120/140/160.

Desktop framerates were a little bit tougher to get as fine-grained, but it's idiotically easy to pick out when it's running at 60fps.

Fun fact: After a few minutes Windows 7 throttles most Aero UI activities such as "Aero Peek" down to 60FPS until you grab a window and drag it around your screen.
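
If anyone wants to try the same thing, the setup is dead simple - something like this, sketched in Python (how you actually cap the framerate and collect the guess is up to you, so those two callbacks are just placeholders):

# Sketch of a blind framerate test: show each cap in random order, record guesses.
import random

BRACKETS = [45, 60, 75, 90, 120, 140, 160]

def run_blind_test(apply_fps_cap, ask_for_guess, rounds=7):
    """apply_fps_cap and ask_for_guess are placeholders supplied by the tester:
    one caps the game's framerate, the other returns the subject's guess."""
    order = random.sample(BRACKETS, k=min(rounds, len(BRACKETS)))
    correct = 0
    for cap in order:
        apply_fps_cap(cap)          # e.g. via the driver or an in-game limiter
        guess = ask_for_guess()     # subject names a bracket without seeing a counter
        correct += (guess == cap)
    return correct / len(order)

# Example run with a pretend subject who always guesses right:
if __name__ == "__main__":
    caps = []
    score = run_blind_test(apply_fps_cap=caps.append,
                           ask_for_guess=lambda: caps[-1])
    print(f"accuracy: {score:.0%}")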
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Complete and utter BS by AMD. They should simply stop embarrassing themselves more. Zambezi sucks at gaming and it should be left at that. If you have less than $150 for a gaming CPU, you buy an i3. If you have more, you buy an i5. Getting an FX chip for gaming is nonsensical.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Complete and utter BS by AMD. They should simply stop embarrassing themselves more. Zambezi sucks at gaming and it should be left at that. If you have less than $150 for a gaming CPU, you buy an i3. If you have more, you buy an i5. Getting an FX chip for gaming is nonsensical.

Yeah. I find most (not all, mind you) people who either 'really want' or already have BD are the so-called 'IT guys' who don't know jack. They want MOAR COREZ and think AMD is still leading Intel in performance like it was 7-8 years ago, and that all decent Intel CPUs cost $1000. Basically living in the past and/or denial. They say arguing with a fool...
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Yeah. I find most (not all, mind you) people who either 'really want' or already have BD are the so-called 'IT guys' who don't know jack. They want MOAR COREZ and think AMD is still leading Intel in performance like it was 7-8 years ago, and that all decent Intel CPUs cost $1000. Basically living in the past and/or denial. They say arguing with a fool...

Very true, and I know this firsthand from working a seasonal job at CompUSA. I can't believe how many people would see the FX-8150 and think "this is such a beast!!!", only to be disappointed when I pointed out that clock-for-clock the new CPUs are 10-20% slower than the old ones and they're barely faster in multi-threaded workloads. Then I'd point out the power consumption when they're OCed, and most of them would be instantly turned off. The people who already had AMD motherboards I simply pointed to the 1035T (OEM, $130), the 1055T ($155) and the 955 ($125). Unless they were looking for a cheap rendering or video encoding rig, Intel all the way. Even then, most of the people looking for a machine for rendering or multi-threaded applications had some dough, so I got them to spend on the 2600K.

Luckily for me, since I can easily convince people (:awe:), I ended up selling around 25 2500Ks, 15 2400s, and around 30 2600Ks. I sold no FX CPUs since they were overpriced. For the price of the 6100 you could get the superior 1055T, for the price of the 8120 you could get the superior 2500K, and for the price of the 4100 you could get the superior 955.