
Blind CPU test: 2700K vs FX-8150

I guess I will probably be the only forum member that will actually test out this "smoothness". The only issue would be me trying to sell this Bulldozer and losing my rear on it.


EDIT: I didn't think there was anything smooth about Llano's CPU performance.
 
I think it's an important point that tends to get glossed over sometimes: for high-resolution, GPU-limited gaming, the CPU usually doesn't matter that much. There are some exceptions, though. Even with an HD 6970, Skyrim @ 1920x1080 Ultra High can still get pretty CPU-limited and drop to lower FPS in some of the cities, for example. Part of that problem is the game being terribly optimized, such as it not being able to use more than two cores, the developers not enabling SSE instructions when they compiled it (which can improve performance upwards of 40%), etc.

The 2500K is still by far the better gaming and general-usage CPU in the $200-250 price range IMO, though. The FX-8150 is fine, for example, until you hit one of the situations I mentioned above, where its deficiencies in single/lightly threaded performance become glaringly obvious.
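The GPU-limited vs. CPU-limited point can be sketched with a toy model (the millisecond figures below are hypothetical illustrations, not measurements): the slower of the CPU and GPU per-frame times sets the frame rate, so a faster CPU only shows up when the CPU is the bottleneck.

```python
# Toy bottleneck model: each frame needs both CPU and GPU work (roughly in
# parallel), so the slower side dictates the frame rate.
# All per-frame times below are illustrative assumptions, not benchmarks.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Effective FPS when the slower of CPU/GPU work limits each frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# GPU-limited at high resolution: GPU needs 25 ms/frame.
print(fps(10, 25))  # 40.0 -> slower CPU still gives 40 FPS
print(fps(6, 25))   # 40.0 -> a much faster CPU changes nothing (GPU-bound)

# CPU-limited city scene: CPU needs 30 ms/frame, GPU only 12 ms.
print(fps(30, 12))  # ~33.3 FPS
print(fps(18, 12))  # ~55.6 FPS -> here the faster CPU clearly shows
```

This is why a blind test at an Eyefinity-class resolution can hide CPU differences that a CPU-bound scene would expose.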
 
Just trying to feel better about my FX-6100

I have a 6100 as well - running at 4200 MHz with turbo enabled and set to 4700. For my current work usage I'm fine with this setup.

I don't know yet what my next move will be, but I do like the thought of the eight SATA3 connectors available on the upcoming AMD 1090 chipsets.
 
I think it's an important point that tends to get glossed over sometimes: for high-resolution, GPU-limited gaming, the CPU usually doesn't matter that much. There are some exceptions, though. Even with an HD 6970, Skyrim @ 1920x1080 Ultra High can still get pretty CPU-limited and drop to lower FPS in some of the cities, for example. Part of that problem is the game being terribly optimized, such as it not being able to use more than two cores, the developers not enabling SSE instructions when they compiled it (which can improve performance upwards of 40%), etc.

The 2500K is still by far the better gaming and general-usage CPU in the $200-250 price range IMO, though. The FX-8150 is fine, for example, until you hit one of the situations I mentioned above, where its deficiencies in single/lightly threaded performance become glaringly obvious.

I can't believe that Bethesda won't release a patch that does what SkyBoost does. How long would it take their huge, paid team of developers to do this? Two, three days? Considering that one unpaid guy did it in his spare time in a few weeks. It's absolutely disgusting.
 
I think the 1.4 patch will include the same optimizations SkyBoost does. It's currently in beta, so you have to manually opt in. To access it in Steam, go to Settings, click on the Account tab, then under Beta participation click Change and select Skyrim Beta in the window that pops up. That should download the 1.4 beta patch. If you're running other mods, I don't know whether the new patch breaks them, so download at your own risk.

It is kind of ridiculous that these performance optimizations weren't included when the game shipped, but oh well, it is what it is. At least they're listening to the community and fixing stuff, like making the game 4GB address aware and adding SSE optimizations. They probably have a lot of testing and validation to do before rolling out a patch to make sure their fixes don't inadvertently break something else; it's unrealistic to expect them to roll out these patches in a matter of days or weeks.
 
They should enable Fraps with all their comparisons. The outcome would be totally the opposite. :biggrin:
 
Just tried the 1.4 beta patch via Steam: a 68% performance increase compared to 1.3, and still 20-30% over SkyBoost. 😀
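Those two figures are actually consistent with each other: if SkyBoost gives roughly +30% over 1.3 (the ballpark quoted later in this thread) and 1.4 gives +68% over 1.3, the 1.4-over-SkyBoost gain works out to about 29%, inside the claimed 20-30% range. A quick sanity check, using those quoted percentages as assumptions:

```python
# Consistency check on the claimed speedups (ratios from the thread's
# quoted figures; these are forum claims, not measured benchmarks).
patch_14_over_13 = 1.68   # "+68% compared to 1.3"
skyboost_over_13 = 1.30   # "~30% boost" claimed for SkyBoost

gain_14_over_skyboost = patch_14_over_13 / skyboost_over_13 - 1
print(f"{gain_14_over_skyboost:.1%}")  # 29.2% -> within the quoted 20-30%
```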

Cool, what areas did you test? Might give it a try!

I'm already fine with the performance of the game, though, so I probably won't risk breaking anything. Using SkyBoost, an HD 5770, and a 2500K @ 4.2 GHz, everything is very smooth at 1920x1200 with 4x/16x maxed.

I played Skyrim for probably 40-45 hours on an AMD x4 955 @ 3.6, and when I fired it up again with a new CPU my 2500k was very noticeably smoother even at stock frequency, contrary to these results!
 
This test proves that modern CPU variants/brands/etc. are not as different in real-world situations as some review/benchmark sites claim. Too many people think that 10-20% is a huge difference in performance, but it's really not unless you're doing some mission-critical or high-cost task. It took the 2500K to convince me to upgrade from my Athlon X2, which was a performance difference of around 250%. 🙂
 
Cool, what areas did you test? Might give it a try!

I'm already fine with the performance of the game, though, so I probably won't risk breaking anything. Using SkyBoost, an HD 5770, and a 2500K @ 4.2 GHz, everything is very smooth at 1920x1200 with 4x/16x maxed.

I played Skyrim for probably 40-45 hours on an AMD x4 955 @ 3.6, and when I fired it up again with a new CPU my 2500k was very noticeably smoother even at stock frequency, contrary to these results!
Markarth (pretty much anywhere in the city) and the top of the stairs leading to Dragonsreach, looking out over Whiterun, are good places to test; FPS can get pretty bad there. A 2500K @ 4.2 probably gets 40+ FPS even without the patch, though, not bad enough to really impact your gaming experience negatively. But weaker CPUs can dip to 15-20 FPS in these areas.
 
You can't perceive any shortcomings visually, so what's the problem with our product!? NARPH!!

Hey AMD, nobody's trying to tell you you're not capable of making a functioning CPU; it's kind of your thing. You're just incapable of establishing a PR team that gets anyone to give a damn. You had the chance to market yourself on every Xbox 360 console ever made, and you didn't do a damn thing with it. It's blasphemy that AMD is not a common household name; you're entirely capable, but if you can't get people within the industry to support the idea, you've got to do it yourself, which you fail to do. A double-blind test is like trying to defend yourself against blind hatred instead of facing the facts.
 
This test proves that modern CPU variants/brands/etc are not as different in real world situations as some review/benchmark sites claim. Too many people think that 10-20% is a huge difference in performance, but its really not unless you're doing some mission critical or high cost task. It took 2500K to convince me to upgrade from Athlon X2, which was like +250% the performance difference or so. 🙂

No it doesn't prove that at all. This was one (1) game they tested, and it was mainly GPU-bound due to the Eyefinity resolution. Had they made this "test" with 20 games of different genres, the conclusion would be more valid.
 
No it doesn't prove that at all. This was one (1) game they tested, and it was mainly GPU-bound due to the Eyefinity resolution. Had they made this "test" with 20 games of different genres, the conclusion would be more valid.

The number of games is not the point; there is no need for what AMD did here to realize that the differences between modern CPUs are insignificant for most users.
 
It is the point, because it highly depends on what you play. In shooters the difference is generally smaller, but in online RPGs, strategy games, and badly optimized games it can become pronounced.

You just cannot claim something without the basis to back it up. It's like driving a normal passenger car and an SUV under certain conditions where the driving experience seems to be the same and then saying: passenger car = SUV, no discernible difference. But you know what happens once you go off-road...
 
Are you saying the difference between Intel and AMD CPUs is like the difference between a passenger car and an SUV? 🙄
 
First you claim something without hard facts. Then you nitpick at an example. It is difficult to lead a sensible discussion this way. And yes, the difference can be quite astonishing. Look at the FX-8150 multi-GPU review from HardOCP, for instance: a 50+% advantage for the 2500K at times. Now stop trolling, please, and lose the smiley; it is embarrassing.
 
They should enable Fraps with all their comparisons. The outcome would be totally the opposite. :biggrin:

That's the point: test the systems against each other without brands and numbers. People will NEVER notice whether it's 40 FPS or 400 FPS if there is no counter in the corner. Anyone telling you the opposite is talking out of his ass.
 
Maybe YOU don't notice the difference. 40 vs. 400fps? Dear god, who doesn't notice that is blind 😛
In all seriousness: there are games where you can notice even 60 vs. 120 FPS, most notably UE3 games. I don't know what it is about this engine, but there is a difference there. Also, why does it have to be 40 vs. 400? It can also be 20 vs. 30, and you cannot tell me that that is not noticeable.

And btw, why is AMD marketing BD as a gaming CPU then? By their own logic, an Athlon X4 or Phenom X4 would suffice. Or maybe even an Athlon X2, if people are supposedly too "dumb" to notice the difference. Kinda stupid if you think about it. Well, AMD...
 
Maybe YOU don't notice the difference. 40 vs. 400fps? Dear god, who doesn't notice that is blind 😛
In all seriousness: There are games where you can notice even 60 vs. 120fps,

Your brain can't tell the difference. The brain 'memorizes' an image for about 1/15th of a second, then it moves to the next. So there is no way a human can 'see' 60 FPS. It has nothing to do with being blind; it's just that our own internal computer has limits.
 
I didn't say "see", I said notice. And you can clearly notice the difference between 30, 60, and 120 FPS in terms of smoothness. It depends on the person and the game, though. Games with a first-person view are perfect for this because you immediately see the results when you move your mouse, as you can focus on the movement of the cursor on the screen. I guess it would be more difficult for other types of games.

The eye is averaging, of course. But the more input data (= more FPS) you have, the better this average represents what is actually happening. Compare it to antialiasing: the pixel raster of your screen is finite, yet more AA samples result in fewer jaggies, because the colors of the pixels can be adjusted more finely based on this information to resemble what the image should look like: smooth, without any jaggies.
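The smoothness argument is easy to put in numbers: the gap in per-frame time shrinks rapidly as FPS rises, which is why 20 vs. 30 FPS is far more noticeable than 60 vs. 120. A back-of-the-envelope sketch:

```python
# Per-frame time in milliseconds for common frame rates.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps_value in (20, 30, 60, 120):
    print(f"{fps_value:>3} FPS -> {frame_time_ms(fps_value):.1f} ms/frame")

# Going from 20 to 30 FPS cuts ~16.7 ms off every frame; going from
# 60 to 120 FPS only cuts ~8.3 ms, so the low-FPS jump feels far bigger.
print(frame_time_ms(20) - frame_time_ms(30))   # ~16.7 ms
print(frame_time_ms(60) - frame_time_ms(120))  # ~8.3 ms
```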
 
I can't believe that Bethesda won't release a patch that does what SkyBoost does. How long would it take their huge, paid team of developers to do this? Two, three days? Considering that one unpaid guy did it in his spare time in a few weeks. It's absolutely disgusting.

A patch to give Skyrim a ~30% performance boost due to optimisation of code? Why? 😛

Yeah, SkyBoost is one of those things you just gotta get for Skyrim.
 
You missed the point completely, guys: what they meant by 'Blind CPU Test' was that the participants were all blindfolded. That's why there wasn't any 'noticeable' difference in performance.

Nah, I kid. But this is another example of AMD targeting the uneducated masses to make up for BD's failure. Why pay for a product whose price-equivalent competitor is far better?
 