Core i7 wake up call: AMD Phenom II X6 1090T BE overclocked to 6.29GHz!!


BenchZowner

Senior member
Dec 9, 2006
380
0
0
Intel's current plan for another 6-core processor on the LGA1366 platform in 2010 is the i7 970 Gulftown (MSRP should be around $550).
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Well, you can't seriously expect us to believe that Intel will do such a thing, can you? Or is that wishful thinking, selling a piece of blue sky?

Hey there's nothing wrong with dreaming, right?
It's happened in the past. Intel loves to cut prices - it's their way of pouring salt into AMD's wounds. :eek:
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Take a look here:

http://www.anandtech.com/show/2901/13

http://www.anandtech.com/show/2901/12

According to Anandtech, the x4 965BE beats every Clarkdale in all but two gaming benchmarks (Dawn of War II, World of Warcraft), and that's at stock. Given how poorly i3s scale past around 4-4.5GHz, I'd take the OCed 965 any day of the week for games.

You can't really extrapolate any performance numbers for a 920 from Clarkdale anyway . . . different memory controllers, different QPI implementation, etc.

The Phenom X4 965 BE is clocked @ 3.4GHz while the i3 is clocked @ 2.93GHz. If you want to compare, at least compare at the same clock speeds or even the same price range. At least that's what I've been told by the guys in this thread who favor AMD. It's hypocritical. :hmm:
 

Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
Same situation with a 6 core AMD vs a 4 core Intel. Intel is faster clock for clock, but not 50% faster clock for clock, which is what they would have to be for the quad core Intel to beat it in a thread intensive test.
Actually, a single Nehalem core with HT does have close to 50% more throughput than a single Thuban core at the same clockspeed for well multi-threaded applications.

For example, Cinebench 11.5, where the i7 920 is pretty close to the 1055T:
http://diy.pconline.com.cn/cpu/reviews/1004/2096865_5.html
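
As a rough back-of-the-envelope check of that claim (illustrative arithmetic only, not measured data): in a perfectly multi-threaded workload a quad has to make up the 6-vs-4 core deficit, so each Nehalem core (HT included) needs about

6 / 4 = 1.5x

the per-core throughput of a Thuban core at the same clock just to tie it, i.e. roughly 50% more.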


I really want to see gaming benchmarks with this processor compared to an i7 930. Clock-for-clock gaming benchmarks at 4.0GHz for the 1055T and the 1090T. I also would like to see a general CPU performance comparison between the two.
It's in Chinese but I think it's the first (p)review that compares a Thuban with other AMD and Intel processors in a variety of apps:

http://diy.pconline.com.cn/cpu/reviews/1004/2096865_4.html
 
Last edited:

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
Another question I pose: why is the 1090T listed as
AMD Phenom II X6 1090T 3.2GHz/3.6GHz Turbo Core 9MB Cache - Six Core Processor - Socket AM3 (45nm)?

I don't get the 3.6GHz.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Another question I pose: why is the 1090T listed as
AMD Phenom II X6 1090T 3.2GHz/3.6GHz Turbo Core 9MB Cache - Six Core Processor - Socket AM3 (45nm)?

I don't get the 3.6GHz.


3.6 is the clock speed of the turbo mode when three or fewer cores are active.
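
A minimal sketch of the idea in Python (illustrative pseudologic only, not AMD's actual Turbo Core algorithm, which also depends on thermal/TDP headroom; the 3.2/3.6GHz figures come from the listing quoted above):

def thuban_clock_ghz(active_cores, base=3.2, turbo=3.6):
    # Turbo Core: when three or more of the six cores are idle
    # (i.e. at most three are active), the active cores may boost.
    return turbo if active_cores <= 3 else base

print(thuban_clock_ghz(2))  # 3.6
print(thuban_clock_ghz(6))  # 3.2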
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
The Turbo speed when 3 or more cores are idle?

3.6 is the clock speed of the turbo mode when three or fewer cores are active.

Ah, gotcha, thanks.

The one noticeable improvement that stands out in the numbers with the Phenom X6 1055T/1090T is the 9MB cache, along with the 125W TDP as opposed to Intel's 130W TDP. These new X6s should perform faster clock for clock than a Phenom X4 for sure.
 
Last edited:

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Ah, gotcha, thanks.

The one noticeable improvement that stands out in the numbers with the Phenom X6 1055T/1090T is the 9MB cache, along with the 125W TDP as opposed to Intel's 130W TDP. These new X6s should perform faster clock for clock than a Phenom X4 for sure.

9 MB is not actually an improvement: L2 + L3 = 9 MB.

The L3 cache is the same at 6 MB.

L2 is 6 x 512 KB = 3 MB.
 
Last edited:

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
9 MB is not actually an improvement: L2 + L3 = 9 MB.

The L3 cache is the same at 6 MB.

L2 is 6 x 512 KB = 3 MB.

I see. So owners of a six-core Phenom will be getting higher 3DMark Vantage scores than an i7 930, am I correct?
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
I see. So owners of a six-core Phenom will be getting higher 3DMark Vantage scores than an i7 930, am I correct?

I am in no way equipped to comment other than from a few benches, which I take with a grain of salt.

This is what I know: I personally don't care about those scores. The architecture doesn't change, so I don't see a great improvement in performance clock for clock.

The major change is in process technology, which makes these new chips consume less power, making them better at being efficient (performance/watt), not better on a clock-for-clock basis.

AFAIK, I will buy one by the end of this year :D
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
According to Anandtech, the x4 965BE beats every Clarkdale in all but two gaming benchmarks (Dawn of War II, World of Warcraft), and that's at stock.

No it doesn't. You are only comparing average framerates. When comparing high end processors, the focus should be on minimum framerates, not just averages, because that's where the architectural differences are most prevalent in contributing to a smooth gaming experience. If you just look at average framerates, then you are more GPU limited since more or less every modern CPU can provide sufficient averages.
 
Last edited:

BD231

Lifer
Feb 26, 2001
10,568
138
106
No it doesn't. You are only comparing average framerates. When comparing high end processors, the focus should be on minimum framerates, not just averages, because that's where the architectural differences are most prevalent in contributing to a smooth gaming experience. If you just look at average framerates, then you are more GPU limited since more or less every modern CPU can provide sufficient averages.

Yeah, let's all focus on those intangible gains; that's what counts here.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
No it doesn't. You are only comparing average framerates. When comparing high end processors, the focus should be on minimum framerates, not just averages, because that's where the architectural differences are most prevalent in contributing to a smooth gaming experience. If you just look at average framerates, then you are more GPU limited since more or less every modern CPU can provide sufficient averages.

deja vu all over again.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
No it doesn't. You are only comparing average framerates. When comparing high end processors, the focus should be on minimum framerates, not just averages, because that's where the architectural differences are most prevalent in contributing to a smooth gaming experience. If you just look at average framerates, then you are more GPU limited since more or less every modern CPU can provide sufficient averages.

Umm, yeah. I have an old game that always reports a minimum frame rate of 0 fps because it starts measuring immediately after the game starts up (i.e. while things are still loading; the game is Black & White, for the inquisitive). Does that mean we have had no progress in hardware since the release of that game?

Your observation would be correct if we could always say that the CPU is solely responsible for the minimum frame rate. However, that just isn't true. There are just as many factors to minimum frame rate as there are to average and maximum frame rate.

If you only change one component, or at the very least keep component changes to a minimum, then the average framerate across multiple systems is a pretty good indicator of that component's speed compared to the other tested components. It is only when you make dramatic changes that things start to get hairy.

In other words, ideal testing keeps the GPU, RAM, OS, and if possible the motherboard constant while swapping the CPUs out. And since there are sometimes significant differences between average speeds, we can safely conclude that CPU speed does in fact affect the average.
 

BenchZowner

Senior member
Dec 9, 2006
380
0
0
Umm, yeah. I have an old game that always reports a minimum frame rate of 0 fps because it starts measuring immediately after the game starts up (i.e. while things are still loading; the game is Black & White, for the inquisitive). Does that mean we have had no progress in hardware since the release of that game?

Your observation would be correct if we could always say that the CPU is solely responsible for the minimum frame rate. However, that just isn't true. There are just as many factors to minimum frame rate as there are to average and maximum frame rate.

If you only change one component, or at the very least keep component changes to a minimum, then the average framerate across multiple systems is a pretty good indicator of that component's speed compared to the other tested components. It is only when you make dramatic changes that things start to get hairy.

In other words, ideal testing keeps the GPU, RAM, OS, and if possible the motherboard constant while swapping the CPUs out. And since there are sometimes significant differences between average speeds, we can safely conclude that CPU speed does in fact affect the average.

Uhm... real-life gaming tests with various CPUs/frequency ( click here )

Before you say those are just 4 games, I have other measurements from the past and also some more for a forthcoming review.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Uhm... real-life gaming tests with various CPUs/frequency ( click here )

Before you say those are just 4 games, I have other measurements from the past and also some more for a forthcoming review.

http://www.anandtech.com/show/2972/...-pentium-g6950-core-i5-650-660-670-reviewed/7

Ahem.. Real life games with average FPS...

http://www.guru3d.com/article/core-i5-650-660-661-review-test/17

And another if you don't believe me.

Heck, just about every trusted reviewer out there uses average FPS as their measurement of choice. Lower resolutions especially show CPU disparity. Not minimum FPS.

Minimum FPS is a particularly bad measurement because any number of system anomalies can happen which would cause the benchmark to report a lower than expected FPS value. It is one measurement versus thousands.

The test you posted was invalid because the reviewer chose maximum resolutions and graphical settings. That taxes the video card, forcing the CPU to wait on it. To properly review the CPU's capabilities, you want it the other way around. Minimum FPS doesn't do anything to solve that problem. All it shows is that most CPUs are fast enough to feed overtaxed graphics cards.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
http://www.anandtech.com/show/2972/...-pentium-g6950-core-i5-650-660-670-reviewed/7

Ahem.. Real life games with average FPS...

http://www.guru3d.com/article/core-i5-650-660-661-review-test/17

And another if you don't believe me.

Heck, just about every trusted reviewer out there uses average FPS as their measurement of choice. Lower resolutions especially show CPU disparity. Not minimum FPS.

Minimum FPS is a particularly bad measurement because any number of system anomalies can happen which would cause the benchmark to report a lower than expected FPS value. It is one measurement versus thousands.

The test you posted was invalid because the reviewer chose maximum resolutions and graphical settings. That taxes the video card, forcing the CPU to wait on it. To properly review the CPU's capabilities, you want it the other way around. Minimum FPS doesn't do anything to solve that problem. All it shows is that most CPUs are fast enough to feed overtaxed graphics cards.

Well, there needs to be a graph showing just how long or how many times those minimums were reached. Even with a wimpy 4670 I had better playability in many games with my E8500 than with my 5000 X2. For instance, in Warhead there were a few spots that tanked the framerate into the low teens with the 5000 X2, while it never went below the mid-twenties with the E8500. Just looking at an average framerate would not have made that apparent, even though it was quite clear when actually playing the game.
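
For what it's worth, that "how long / how many times" number is easy to pull out of a per-frame log (e.g. a FRAPS frametimes dump). A minimal Python sketch, assuming one frame time in milliseconds per line; the file name and the 25fps floor are made-up examples:

import statistics

def fps_stats(frame_times_ms, floor_fps=25.0):
    """Average/minimum FPS plus how often and for how long the game
    dropped below a playability floor."""
    fps = [1000.0 / ms for ms in frame_times_ms if ms > 0]
    slow = [ms for ms in frame_times_ms if ms > 1000.0 / floor_fps]
    return {
        "avg_fps": statistics.mean(fps),
        "min_fps": min(fps),
        "frames_below_floor": len(slow),
        "seconds_below_floor": sum(slow) / 1000.0,
    }

# Hypothetical usage with a frametimes file (one ms value per line):
# times = [float(line) for line in open("warhead_frametimes.csv")]
# print(fps_stats(times, floor_fps=25.0))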
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Well, there needs to be a graph showing just how long or how many times those minimums were reached. Even with a wimpy 4670 I had better playability in many games with my E8500 than with my 5000 X2. For instance, in Warhead there were a few spots that tanked the framerate into the low teens with the 5000 X2, while it never went below the mid-twenties with the E8500. Just looking at an average framerate would not have made that apparent, even though it was quite clear when actually playing the game.

Anand actually did that in the past; he posted a line graph showing the FPS throughout gameplay.

As for the E8500 vs 5000 X2... um, yeah, those aren't in the same league. The E8500 is significantly faster than the 5000 X2.
 

BenchZowner

Senior member
Dec 9, 2006
380
0
0

I was pretty sure that you'd try to dodge this, but unfortunately for you I don't like dodgeball.

Heck, just about every trusted reviewer out there uses average FPS as their measurement of choice. Lower resolutions especially show CPU disparity. Not minimum FPS.

Lower resolutions with low/medium graphics settings move most of the stress to the CPU since the GPU tasks are pretty easy for a modern GPU to process, thus you'll see the minimum/average/maximum frame-rates scale with better CPU architectures, higher L2/L3 cache, higher frequencies, etc.

When it comes to reality, real life, those tests are as worthy as the dust inside your computer case.

Unless you like spending $400 on high-end VGAs and running your games at low resolutions with low or medium graphics details and no AA/AF.

Seriously, who games at those settings? (Except the unlucky people who can't afford a decent graphics card.)

Minimum FPS is a particularly bad measurement because any number of system anomalies can happen which would cause the benchmark to report a lower than expected FPS value. It is one measurement versus thousands.

That's why you make sure the testing environment is kept as stable as possible, and equip the testing rig with the best/appropriate components and OS setup & configuration to make sure you don't have any anomalies when running a benchmark.

And of course you run the benchmark at least three times and average the frame rates, check for any inconsistencies or anomalies, and if you see any, check the FPS graphs to see whether it's a "glitch" and re-run or exclude that minimum rate.
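
As an illustration of that procedure (a sketch only; the "30% below the median" rule is an arbitrary example, not the criterion actually used in the review), assuming you record an (average, minimum) FPS pair per run:

import statistics

def summarise_runs(runs):
    """runs: list of (avg_fps, min_fps) tuples from repeated benchmark passes.
    Averages the results and flags runs whose minimum looks like a one-off glitch."""
    avgs = [a for a, _ in runs]
    mins = [m for _, m in runs]
    median_min = statistics.median(mins)
    # Arbitrary example rule: a minimum more than 30% below the median is suspect.
    suspect = [i for i, m in enumerate(mins) if m < 0.7 * median_min]
    return {
        "avg_fps": statistics.mean(avgs),
        "avg_min_fps": statistics.mean(mins),
        "runs_to_recheck": suspect,
    }

print(summarise_runs([(62.1, 41.0), (61.8, 40.2), (62.5, 24.7)]))
# Flags run index 2: its 24.7fps minimum is the kind of outlier you would
# re-run or check against the FPS graph before quoting it.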

The test you posted was invalid because the reviewer chose maximum resolutions and graphical settings.

It's invalid for whom?
For the gamer? Effin' no!
"Synthetic" or theoretical system power, as we can call it (running games at very low resolutions and graphics details), doesn't matter to a gamer, because a gamer doesn't run his games at that resolution, and especially not with low details and no anti-aliasing & anisotropic filtering (insert duh emoticon here! :D)

The tests that really matter are the ones run with settings we normally use in gaming.
And at those settings, the differences are the ones you can see in the charts of the linked page.
From 3.6GHz or so and higher the processors aren't bottlenecking any modern GPU, and hence there are no differences.

To properly review the CPU's capabilities, you want it the other way around.

Number-wise, sure, that kind of measurement looks better for the manufacturers, the shops, and the people who aren't aware of what these measurements mean and what they'd get in real-life gaming.
And since you were so quick to invalidate my review because you didn't like the real-life gaming tests, you can go to the previous page and check the nonsense numbers from CPU-limited resolutions/settings with nice blingy charts :D

CPU Limited Scenario - Gamers look away page! ( click click! )
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
I was pretty sure that you'd try to dodge this, but unfortunately for you I don't like dodgeball.
Dodge what? Your review of a system at maximum settings being used as a CPU benchmark, which you try to use as proof that minimum FPS should be the be-all, end-all benchmark tool for gamers? Heck no, it is unreliable; run those benchmarks a second time and I can almost guarantee you'll get different results.


Lower resolutions with low/medium graphics settings move most of the stress to the CPU since the GPU tasks are pretty easy for a modern GPU to process, thus you'll see the minimum/average/maximum frame-rates scale with better CPU architectures, higher L2/L3 cache, higher frequencies, etc.

When it comes to reality, real life, those tests are as worthy as the dust inside your computer case.

That's where you are wrong. Those tests are extremely important because they are precisely the tests that show how your CPU is going to perform in CPU-intensive scenes. They are a far better indicator of where a CPU will gain and lag than minimum FPS could hope to show.

Unless you like spending $400 on high-end VGAs and running your games at low resolutions with low or medium graphics details and no AA/AF.

Seriously, who games at those settings? (Except the unlucky people who can't afford a decent graphics card.)
The settings aren't the point of the test; the point of the test is to see which CPU performs better. Those settings, more than anything, indicate the overall performance of a CPU for games.


That's why you make sure the testing environment is kept as stable as possible, and equip the testing rig with the best/appropriate components and OS setup & configuration to make sure you don't have any anomalies when running a benchmark.

Alert me when you learn how to control OS scheduling, CPU errata, driver errata, BIOS errata, HD errata, the PRNG for the game, etc. You can't make the system perfectly stable; there are too many variables a reviewer has no control over. Those are mitigated by taking the average, but they can play some nasty tricks when you take single measurements.

And of course you run the benchmark at least three times and average the frame rates, check for any inconsistencies or anomalies, and if you see any, check the FPS graphs to see whether it's a "glitch" and re-run or exclude that minimum rate.
For testing a CPU, no, those tests are worthless. If your GPU is bottlenecking performance, then the benchmark is worthless for CPU speed. It is quite conceivable to see a slower CPU perform better than a faster CPU in those types of benchmarks (which, BTW, happens in a couple of the benchmarks you posted earlier).

As for the glitching: what if the glitch happens three times in a row? What then? It is quite possible for that to happen. Do you just keep testing until you get the results you want? And what about reproducing the results? Any good test should be fairly easy to reproduce. Minimum FPS quite often isn't reproducible on the first shot.

It's invalid for whom?
For the gamer? Effin' no!
"Synthetic" or theoretical system power, as we can call it (running games at very low resolutions and graphics details), doesn't matter to a gamer, because a gamer doesn't run his games at that resolution, and especially not with low details and no anti-aliasing & anisotropic filtering (insert duh emoticon here! :D)
You have no freaking clue what synthetic means, do you? Synthetic benchmarks are those that run a bunch of pre-programmed functions for non-real applications, such as Sandra. They are programs that don't particularly do anything useful except benchmark (which is questionable, as who is to say which instruction or function is more valuable than another). Games, at any settings, are NOT synthetic, as they are real products that do real things besides just benchmarking.

The tests that really matter are the ones run with settings we normally use in gaming.
And at those settings, the differences are the ones you can see in the charts of the linked page.
From 3.6GHz or so and higher the processors aren't bottlenecking any modern GPU, and hence there are no differences.
What are you measuring? If your tests are measuring GPU performance, then I agree with you, those tests are perfect for that. However, for measuring CPU performance, the only thing that slew of tests proves is that the CPU isn't causing a bottleneck for those games.

In situations where it does bottleneck (i.e. when FPS dips), a measure of CPU speed for that game using average FPS at lower resolutions will be a MUCH better indicator of how the CPU will perform in droop periods than looking at the minimum FPS in general.


Number-wise, sure, that kind of measurement looks better for the manufacturers, the shops, and the people who aren't aware of what these measurements mean and what they'd get in real-life gaming.
And since you were so quick to invalidate my review because you didn't like the real-life gaming tests, you can go to the previous page and check the nonsense numbers from CPU-limited resolutions/settings with nice blingy charts :D

CPU Limited Scenario - Gamers look away page! ( click click! )
Yes, because I always read through every page of a 20-page article posted on a website I've never heard of to argue a point that I don't agree with...

Let me put it this way: would you judge the speed of a CPU by how fast you could copy a file from one hard drive to another? No, that would be ridiculous. Yet by using max graphical settings and looking at minimum FPS, that is precisely what you are doing: trying to measure the speed of a component when another component is causing the main slowdown.

Using lower graphical settings that stress the CPU more and looking at average FPS will be a much better indication of how a CPU will perform in droops. End of story. A faster CPU will perform better on average than a slower CPU, and there is little error because there are thousands of measurements. Looking at the minimum FPS and picking the numbers you like best from three tests is bad and extremely easy to bias.
 
Last edited:

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Man, at this rate Dr. Who will show up for Intel, JD for AMD, and the biggest BS spreader of them all, Informal. The marketing hype is high on this one. AMD is selling dirt cheap and still losing market share to Intel.

This is actually getting good. More popcorn, please.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Let me get things clear here: I said Thuban might challenge the lower-end i7s. What is wrong with that????

If that pisses people off, then you're a FANBOY, period, because I only said that AMD could challenge (I never said it would beat the i7).

Question: how the hell do you know it won't challenge an i7 920? Are you benching both now?

Doesn't matter what you said; in reality I am also addressing the topic title.