Here, have your beloved PcLab:
What's funny is that the nearest price equivalent to the FX-6300 is the i3, which you "accidentally" omitted, presumably because it managed to match an FX-8350 / FX-6350 and beat an FX-4350 by 10%, even at a whopping 1.2GHz clock disadvantage:
http://pclab.pl/art56897-3.html
Simple math, 2-thread CPU needs to have 10 times higher IPC to have as much free processing power as 4-thread CPU in this scenario. It can be looked at from different angle. 2-thread CPU is running at 98.5% [(99+98)/2] of its capability, while 4-thread CPU at 91% [(95+91+91+87)/4]. There is only 1.5% free CPU time in first example, and 9% free CPU time in second. Quick math: First CPU needs to have 9 times more IPC to have the same amount of unused processing power as the second one given the usage scenario.
Your maths are woefully flawed. A 2x IPC is essentially a doubling of a CPU's power whether the code is multi-threaded or not, and a 10x higher IPC is an increase in CPU power of a staggering 900%, again regardless of how well threaded the code is. You're basically arguing that a 3.2GHz CPU vs the same CPU essentially OC'd to 32GHz will both show the same % load, and that the extra 10x power will lower CPU usage by only 10%. Um, no... A 10x increase in CPU power (IPC / clock speed) would lower 80-90% CPU usage to well under 20%. This is already seen with 1990's games that sucked up 99% usage on a 466MHz Celeron yet now barely pull 15-20% on a 3GHz Haswell Pentium. A 900% increase in CPU performance doesn't equal a 10% increase in idle time, because the load % is not static as IPC / clock speed changes. I.e., 2.7GHz of work used out of 3.0GHz (90%) is not going to remain 90% when that same 2.7GHz of work runs on a 10x more powerful CPU with "horsepower" equivalent to the same chip clocked at 25-32GHz (to use your example)...
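To make the arithmetic concrete, here's a minimal Python sketch (my own illustration, not from either post) of the point above: assuming the game demands a fixed amount of work per second, utilisation scales inversely with total CPU performance (IPC x clock), so load does not sit still while the chip gets 10x faster.

```python
def projected_load(load_pct, perf_multiplier):
    """Utilisation after the same fixed workload moves to a CPU that is
    perf_multiplier times faster (higher IPC and/or clock).
    Assumes the work demanded per second stays constant."""
    return load_pct / perf_multiplier

# 90% load on the original chip becomes 9% on a 10x faster one,
# not 80% as the "free CPU time" argument would suggest.
print(projected_load(90, 10))   # 9.0
print(projected_load(99, 10))   # 9.9 - the old Celeron-vs-Haswell case
```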
Or did I misunderstand you, and were you trying to say AtenRa is biased towards Intel?
You misunderstood me. In fact I think you have me mixed up with someone else, as I never said anything about AtenRa being "biased towards Intel". It wasn't me you originally responded to, either; I merely pointed out the seemingly popular absurdity of "stacking the deck" by running totally unrealistic "background tasks" to 'prove a point'. E.g., in the real world, you can't even ergonomically use Skype and game at once, simply because your eyeballs won't be able to focus / concentrate on the game whilst using Skype (unless you're cross-eyed).

It's like claiming the advantage of a quad-core tablet is doing a crossword puzzle, watching TV and holding a video conference call all at the same time. It's simply not realistic multi-tasking even on an 8/16/32-core CPU, purely due to human ergonomics / concentration limitations. And the way all these "typical background tasks" only ever seem to spring up when discussing 2-core Intel vs 4-core AMD budget builds, yet disappear when i5s/i7s come into the equation, is pretty obvious bias. I.e., some people are embarrassed that a 2-core Intel i3 beats even 6-8 core AMD chips in some 2014 games, even when clocked over 1GHz slower, so they try to "move the goalposts" by pretending people on a budget "run more in the background" when gaming than they actually do. In reality the opposite is true: people on a budget with common "get the most out of your bang-per-buck hardware" sense are those intelligent enough to run "heavy background stuff" when they aren't at their PC (e.g., encoding a queue of videos overnight or while at work, virus scanning while eating dinner, etc.), so that even heavy tasks don't impact their weaker CPUs while they're sitting in front of them. Common sense time management.
But RAR is not what you consider regular background tasks.
Precisely. Nor does anyone run 2x separate benchmarks simultaneously on any CPU, as you wouldn't get an accurate reading on either. Downloading a file in a web browser, listening to MP3s, etc., literally uses 1% CPU. You hear the same thing with console vs PC: "Consoles are better than PCs, man, because Windows and all that stuff on a PC sucks up at least, let's say, 30-40% background CPU! Yeah, that's it, 40%!" LOL. What "stuff" would that be exactly? Objective metrics, anyone? For heavy future mainstream gaming I'd definitely recommend a quad i5 over an i3, but some of the excuses people make for why "i3's can't run games" (whilst there are hundreds of YouTube FRAPS vids to the contrary), by inventing unrealistic "typical background task usage" scenarios for budget builds or comparing a $70 Pentium to a double-the-price $140 FX-6350, are pretty silly.