
Rumour: Bulldozer 50% Faster than Core i7 and Phenom II.

Status
Not open for further replies.
Right, that's because AT Bench doesn't have or use Futuremark's PCMark TV & Movies sub-test, Cinebench R11.5, or Futuremark's 3DMark06 CPU test, which the slide BASES the projections on and clearly states...with maximum zoom 🙂 at the bottom of the slide.

I thought only BAPCo was guilty of stitching together bits and pieces of benched apps in a way that was shady and of suspect integrity 😕
 
I thought only BAPCo was guilty of stitching together bits and pieces of benched apps in a way that was shady and of suspect integrity 😕

Do people make buying decisions or even conclude reviews based on PR slides? 😕

I guess this only shows that with the right pieces you can paint whatever picture you want.
 
The AMD Phenom II X6 1100T ($189.99) is very competitive against the Intel Core i7 950 ($259.99), both in CPU/platform price and in performance in multithreaded apps and gaming (high res with filters enabled).

You can argue all day long that the Core i7 950 has better single-thread performance (higher IPC), but for home use and gaming the two CPUs are equals, with the exception that socket 1366 is a dead horse while AM3+ can be upgraded with BD in the near future.
 
The AMD Phenom II X6 1100T ($189.99) is very competitive against the Intel Core i7 950 ($259.99), both in CPU/platform price and in performance in multithreaded apps and gaming (high res with filters enabled).

You can argue all day long that the Core i7 950 has better single-thread performance (higher IPC), but for home use and gaming the two CPUs are equals, with the exception that socket 1366 is a dead horse while AM3+ can be upgraded with BD in the near future.



Just......no. But if it makes people feel better that the flagship chip is even in the discussion with a mid-range chip from a generation ago from the competition......meh.
 
Just......no.

Anything to support that simple no???

I have shown in another topic that in multithreaded apps the Phenom II X6 1100T is equal to the Core i5 2500K and Core i7 950.

As far as gaming goes, at high res (1920x1080 and up) with filters on (4x AA and up), the differences are less than 3-5%.
 
I will have to say here that not all games are equal.

Most RTS games need a high-performance CPU, and in most cases the Core i7 is clearly the better choice, but in FPS games most of the time the GPU makes the difference. 😉
 
Just......no. But if it makes people feel better that the flagship chip is even in the discussion with a mid-range chip from a generation ago from the competition......meh.

Weren't you the same guy who posted this?

What Intel marks as "Mainstream" or "Performance" in their slides is just pure marketing. It's how they perform that matters.

Labels are irrelevant according to you. So why the change of heart?
 
I will take on the X6 1100T in any benchmark. You O/C your 1100T and I will smash it with my very conservative 2500K.

Put up or shut up!

Your first bench would be Cinebench 11.5; I score 7.14 in the x86 64-bit test.

3DMark06 I get 28,130, that's with an NV 560 Ti in d-mode;
in i-mode I get 27,654. Bring on that X6 1100T.

Super Pi 8.404 seconds
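For anyone wondering why Super Pi keeps coming up in these bragging matches: it times a single-threaded computation of pi, so the score mostly tracks single-core IPC and clock rather than core count, which is exactly why a 2500K shines on it against a six-core Thuban. A rough sketch of that kind of workload (this is not Super Pi's actual algorithm, just an illustrative pi-digits timer):

```python
# Illustrative single-threaded pi computation, in the spirit of what
# Super Pi benchmarks (Super Pi itself uses a different algorithm).
from decimal import Decimal, getcontext
import time

def machin_pi(digits):
    """Compute pi via Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    getcontext().prec = digits + 10  # extra guard digits

    def arctan_inv(x):
        # Taylor series for arctan(1/x): sum of (-1)^k / ((2k+1) * x^(2k+1))
        total = term = Decimal(1) / x
        n, sign = 3, -1
        while abs(term) > Decimal(10) ** -(digits + 5):
            term = Decimal(1) / (x ** n)
            total += sign * term / n
            sign, n = -sign, n + 2
        return total

    return 16 * arctan_inv(5) - 4 * arctan_inv(239)

start = time.perf_counter()
pi = machin_pi(1000)
elapsed = time.perf_counter() - start
print(str(pi)[:12], f"computed in {elapsed:.3f}s")  # 3.1415926535 plus run time
```

The loop is strictly serial, so extra cores sit idle; only clock speed and per-core efficiency move the timer.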
 
Internet tough guy with a processor, and he's not afraid to use it! Fear the Super Pi score. :awe:

Well ya. This is my browser. I'm not comparing it to my gamer or anything like that. I keep saying my gamer; I should say the wife's. I love this thing for browsing. I love having Intel managing the processor rather than a M/B BIOS.
I just like the speed I have on the net. It's remarkable, really. So ya, this is my browser and it's a FAST browser.
 
For real?
The facts as we know them:
1.) Nemesis makes an open challenge.
2.) Nemesis posts his scores.
3.) There are no screenshots or any attempt at validation.
4.) Ben90 responds with his result, exactly .001 second better, thereby conquering the posed challenge.

I submit that the only conclusion possible is that it is absolutely for real. There can be no other interpretation.



I will take on the X6 1100T in any benchmark. You O/C your 1100T and I will smash it with my very conservative 2500K.

Put up or shut up!
I think nobody here is arguing the fact that the 2500K will beat the 1100T in a lot of (or most?) benchmarks. That wasn't the point of AtenRa (I assume it was his post that motivated you to pose the challenge). From his post, what I gather is that in GPU-bound (or near-GPU-bound) scenarios, the difference between the two is extremely small (we know this is true, yes?), and he further postulates that this scenario is common enough for the usage patterns of enthusiasts.

In short, he was not saying the 1100T is superior or in the same league as the 2500K. Rather, I take from his posts that in the case of gaming, since most people are GPU-bound anyway, their difference really isn't as big as most benchmarks (that intentionally emphasize CPU difference) would indicate, thereby implying that for gamers, some benchmarks that particularly emphasize the CPU differences won't actually be indicative of real-life settings.

Given that, there really was no need to pose the challenge. We know you'll win, of course, but that wasn't the point being previously made.

(I have a Thuban, but I won't even try to participate, because I know the 2500K is way out of my processor's league here)
 
In short, he was not saying the 1100T is superior or in the same league as the 2500K. Rather, I take from his posts that in the case of gaming, since most people are GPU-bound anyway, their difference really isn't as big as most benchmarks (that intentionally emphasize CPU difference) would indicate, thereby implying that for gamers, some benchmarks that particularly emphasize the CPU differences won't actually be indicative of real-life settings.

Something I mentioned a couple of days ago, here or on another forum: there is no killer app any more, especially not in gaming since Crysis. In that sense, with an up-to-date GPU, any modern CPU from a Core 2 Duo to an Athlon II X4 is more than enough CPU for any game.

In games where the CPU has a measurable effect on performance at typical resolutions, performance is so high that it doesn't matter anyway. And in games where performance is borderline unplayable on the machine, it's upgrading the video card that is going to fix that.
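The GPU-bound argument being made here can be sketched with a toy frame-time model (all numbers below are invented for illustration, not measurements): each frame costs roughly the slower of the CPU stage and the GPU stage, so once the GPU sets the pace, swapping in a faster CPU barely moves the FPS.

```python
# Toy model (illustrative numbers only): when CPU and GPU work pipeline,
# per-frame cost is roughly whichever stage is slower.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_1080p_aa = 16.0  # hypothetical GPU frame time at 1920x1080 with 4x AA
gpu_low_res  = 6.0   # hypothetical GPU frame time at low res, GPU barely loaded

for name, cpu_ms in [("slower CPU", 10.0), ("faster CPU", 7.0)]:
    print(name,
          f"low res: {fps(cpu_ms, gpu_low_res):.0f} fps,",
          f"1080p+AA: {fps(cpu_ms, gpu_1080p_aa):.0f} fps")
# At low res the faster CPU wins clearly (143 vs 100 fps here);
# at 1080p with filters both land on the same ~62 fps, because the
# GPU is the bottleneck either way.
```

This is also why low-resolution CPU benchmarks exaggerate differences that mostly disappear at the settings enthusiasts actually play at.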
 
The per-core performance of the CPU can affect some games at 1080p, especially with a dual-GPU card. IMO, that's why Intel, specifically SB, is so widely recommended for a gaming build.
http://www.techspot.com/review/405-the-witcher-2-performance/page8.html

[benchmark chart: CPU2.png]

http://www.techspot.com/review/403-dirt-3-performance/page7.html
 
The facts as we know them:
1.) Nemesis makes an open challenge.
2.) Nemesis posts his scores.
3.) There are no screenshots or any attempt at validation.
4.) Ben90 responds with his result, exactly .001 second better, thereby conquering the posed challenge.

I submit that the only conclusion possible is that it is absolutely for real. There can be no other interpretation.




I think nobody here is arguing the fact that the 2500K will beat the 1100T in a lot of (or most?) benchmarks. That wasn't the point of AtenRa (I assume it was his post that motivated you to pose the challenge). From his post, what I gather is that in GPU-bound (or near-GPU-bound) scenarios, the difference between the two is extremely small (we know this is true, yes?), and he further postulates that this scenario is common enough for the usage patterns of enthusiasts.

In short, he was not saying the 1100T is superior or in the same league as the 2500K. Rather, I take from his posts that in the case of gaming, since most people are GPU-bound anyway, their difference really isn't as big as most benchmarks (that intentionally emphasize CPU difference) would indicate, thereby implying that for gamers, some benchmarks that particularly emphasize the CPU differences won't actually be indicative of real-life settings.

Given that, there really was no need to pose the challenge. We know you'll win, of course, but that wasn't the point being previously made.

(I have a Thuban, but I won't even try to participate, because I know the 2500K is way out of my processor's league here)



I haven't any way to post scores. I only used the demo of 3DMark06.

I suppose I could do Cinebench x32/x64 and SuperPi screenshots. I was just checking the setup out, watching load temps and voltages. I'm pretty happy with things as they are for daily use.
 
The per-core performance of the CPU can affect some games at 1080p, especially with a dual-GPU card. IMO, that's why Intel, specifically SB, is so widely recommended for a gaming build.
http://www.techspot.com/review/405-the-witcher-2-performance/page8.html

[benchmark chart: CPU2.png]

http://www.techspot.com/review/403-dirt-3-performance/page7.html

I get that, and I should have specified single cards. Take away Crossfire and SLI and you end up with a video card that is struggling and a CPU that is wondering what is taking the GPU so long.

I just can't ever get to the point where I am suggesting to someone to get a single $600+ vid card or two $350-$400 cards.

Then again, we could be one 7k- or 600-series card away from needing faster CPUs. But even then it's few and far between, and to a degree you wonder how much of that is because of poor coding on the part of the game that needs it. BF3 looks like it needs it, but outside of that I am not seeing an engine that looks even 75% as good as the one in Crysis.
 
I agree there are many games where CPU power is not a factor with a single GPU, but there are examples.
http://www.techspot.com/review/305-starcraft2-performance/page13.html
[benchmark chart: CPU.png]

StarCraft II only takes advantage of dual cores, but still processing power plays a major role in this game. For example, the Core i3 540 processor only has half the L3 cache of the Core i5 750 and this makes the latter 27% faster when comparing the clock for clock data at 3.70GHz.
The extra threads of the Core i7 920 processor are no advantage when compared to the Core i5 750 in this game, but the additional memory capacity and bandwidth is. The Core i7 920 was 11% faster when comparing the clock for clock data at 3.70GHz which is quite significant.
An older processor like the Core 2 Quad Q6600 suffers compared to the other CPUs tested, serving as a bottleneck to a high-end GPU such as the GeForce GTX 480. The Phenom II processors delivered average performance and we saw no real difference between the Phenom II X2 (dual-core) and Phenom II X4 (quad-core).
It is a shame that StarCraft II can only utilize two cores, as this really hurts older quad-core processors such as the Core 2 Quad Q6600. Furthermore this will also mean that those with budget quad-core processors, such as the Athlon II X4, will also suffer.
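For anyone puzzled by the "27% faster" and "11% faster" figures in the quote above: clock-for-clock comparisons just run both CPUs at the same frequency and take the ratio of their scores, so architecture and cache differences show up without clock speed muddying the picture. A quick sketch of the arithmetic (the scores below are invented examples, not TechSpot's data):

```python
# Clock-for-clock comparison: fix both CPUs at the same frequency,
# then express the score gap as a percentage. Numbers are made up
# for illustration only.
def percent_faster(score_a, score_b):
    """How much faster A is than B, in percent (higher score = better)."""
    return (score_a / score_b - 1.0) * 100.0

fps_i5_750_at_3_7ghz = 127.0   # hypothetical average fps at 3.70GHz
fps_i3_540_at_3_7ghz = 100.0   # hypothetical average fps at 3.70GHz

gap = percent_faster(fps_i5_750_at_3_7ghz, fps_i3_540_at_3_7ghz)
print(f"{gap:.0f}% faster clock-for-clock")  # prints "27% faster clock-for-clock"
```

With clocks matched, whatever gap remains has to come from cache, memory bandwidth, and core architecture, which is exactly the point the review is making about the i3's halved L3.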
 
RTS. Which goes back to the original discussion of "more than enough." At that point it becomes primarily unit measurement, and unless it dips below about 10 frames per second its actual effect on play, or really even on the eyes, is minimal.
 