
ITT: We discuss future pricing and availability for AM3+ processors and mobos


FX2000

Member
Jul 23, 2014
67
0
0
Actual performance says otherwise. Or at least AMD's implementation of CMT isn't as good as it could be.
AMD tried to be VERY innovative with their approach. Of course, they failed.
CMT is in every way better than SMT, but AMD jacked FX up. They need to either fix CMT or drop it.
But calling a 4-module, 8-core CPU a "fake quad core" is idiotic, and has no place on a CPU forum.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Actual performance says otherwise. Or at least AMD's implementation of CMT isn't as good as it could be.

As good as it currently is with Kaveri would be more accurate; they have already reached the point of very good CMT scaling, and that's already 11-month-old news...

Scaling from one to two threads within a single module is 1 to 1.88-1.89, compared with the numbers you posted for the FX8, which are about 1 to 1.62.

Contrary to many people here, I don't think the CMT approach is bad; quite the contrary, it's quite good.
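
For those wanting to sanity-check those scaling figures, here is a quick sketch that turns "1 thread -> 2 threads" ratios into the marginal throughput of the second thread. Only the 1.88 and 1.62 ratios come from the numbers above; the SMT figure is a rough assumption for comparison.

```python
# Hedged sketch: convert 1->2 thread scaling ratios into the marginal
# throughput contributed by the second thread.
# 1.88 (Kaveri) and 1.62 (FX8) are the figures discussed above;
# the SMT figure is an assumed, illustrative value.

def second_thread_gain(scaling: float) -> float:
    """If two threads deliver `scaling` x one thread's throughput,
    the second thread adds (scaling - 1) x a full core's worth of work."""
    return scaling - 1.0

for name, scaling in [("Kaveri module (CMT)", 1.88),
                      ("FX8 (per the quoted numbers)", 1.62),
                      ("Typical SMT core (assumed)", 1.25)]:
    print(f"{name}: 2nd thread adds ~{second_thread_gain(scaling):.0%} of a core")
```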
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
But calling a 4-module, 8-core CPU a "fake quad core" is idiotic, and has no place on a CPU forum.

You are 100% right, these are actual cores: each one, wrongly called an integer core, is a complete core, since it manages FP threads as well. These are not integer cores, these are integer-and-FP cores.
 

FX2000

Member
Jul 23, 2014
67
0
0
https://en.wikipedia.org/wiki/Bulldozer_(microarchitecture)#CMT


  • AMD has re-introduced the "Clustered Integer Core" micro-architecture, an architecture developed by DEC in 1996 with the RISC microprocessor Alpha 21264. This technology is informally called CMT (Clustered Multi-Thread) and formally called "module" by AMD. In terms of hardware complexity and functionality, this "module" is equal to a dual-core processor in its integer power, and to a single-core processor in its floating-point power: for each two integer cores, there is one floating-point core. The floating-point cores are similar to a single core processor that has the SMT ability, which can create a dual-thread processor but with the power of one (each thread shares the resources of the module with the other thread) in terms of floating point performance.
    • A "module" consists of a coupling of two "conventional" x86 out-of-order processing cores. The processing core shares the early pipeline stages (e.g. L1i, fetch, decode), the FPUs, and the L2 cache with the rest of the "module".
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
When Intel releases a 10 core or 12 core processor, they're called "Xeons" and go into datacenters and such. Performance/watt matters a lot in high density situations in a way that the Bulldozer/Piledriver Opterons can't match.

At the time of the P4-based Xeons, perf/watt didn't matter because a lot of corporations chose, under Intel's influence, to ignore this metric. Now that Intel has better perf/watt it's gospel, and you can be sure that if AMD ever gets a better perf/watt ratio in future products, this metric will be rendered irrelevant by the ton of usual viral marketers that hang around all the forums of importance.

That said, numbers-wise, Haswell has lower perf/watt than Ivy Bridge. Not that Opteron could get the crown, but at low frequencies the difference is quite reduced. What AMD lacks in this domain is Steamroller cores and a slightly better process; I say slightly because the relevant parameter is supply voltage. This is the biggest and almost the only advantage that Intel holds in comparison: their better perf/W ratio is due uniquely to process and not uarch, contrary to the general opinion. I see no other reason for Intel's eagerness to get to the next nodes as soon as possible; that's not a matter of reducing cost, but of keeping the only sizeable advantage they have over their competitor.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
I remember being away from PC stuff for a while, years, then getting back on the forums and everyone is talking about watts and stuff. Took me a while to figure out I wasn't misreading it and people were actually concerned with power usage...
Still makes me chuckle.
 

FX2000

Member
Jul 23, 2014
67
0
0
Haha, yeah. "but FX will use 20 watt more than i5 very bad in 20 years i could have gotten that i5 for free compared to the power i'd waste with the FX ur guyse retarded21111!1!!1"
 

Maxima1

Diamond Member
Jan 15, 2013
3,549
761
146
Haha, yeah. "but FX will use 20 watt more than i5 very bad in 20 years i could have gotten that i5 for free compared to the power i'd waste with the FX ur guyse retarded21111!1!!1"

That's the funny part about it. I see people quibble over $30 or so between CPUs, and then they settle for the AMD, so the joke's on them. What's more, you can sell a 2500K for basically its release price even today, so the whole thing is even more hilarious.

Oh, and if you're arguing that there's really no difference in power usage, then that just bolsters the argument that you aren't using its multithreading performance constantly. Who cares if 7zip goes 8 seconds faster or whatever. An SSD alone solves most of everyday use annoyances.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Oh, and if you're arguing that there's really no difference in power usage, then that just bolsters the argument that you aren't using its multithreading performance constantly. Who cares if 7zip goes 8 seconds faster or whatever. An SSD alone solves most of everyday use annoyances.

I can make such ad hoc examples at will; what about if I have huge files to compress?

But then power usage would be higher?

You really did choose the worst example, because in 7zip and WinRAR the FX8370E has better perf/watt than a 4670K, even at the platform level and even when using the most power-hungry AM3+ board dedicated to rabid overclockers...
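
To make the perf/watt vs. raw-power distinction concrete, here is a tiny worked sketch with made-up numbers (the wattages and runtimes below are illustrative, not the FX8370E/4670K measurements being argued about): for a fixed job, energy used is power × time, so a chip that draws more but finishes sooner can still win on energy.

```python
# Illustrative only: energy for a fixed compression job = average power * runtime.
# The wattage/runtime figures are made up to show the arithmetic, not measured data.
jobs = {
    "Chip A (more cores, higher draw)": (130.0, 90.0),   # watts, seconds
    "Chip B (fewer cores, lower draw)": (90.0, 150.0),
}
for name, (watts, seconds) in jobs.items():
    joules = watts * seconds
    print(f"{name}: {joules / 1000:.1f} kJ ({joules / 3_600_000:.4f} kWh)")
# Chip A: 11.7 kJ; Chip B: 13.5 kJ -- the higher-draw chip uses less total energy here.
```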
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
The day I have to start trying to calculate performance per effing watt of electricity to base my CPU purchase on is the day I throw away my keyboard and go live in a cave.
I live eight miles from a nuclear reactor for cripes sake.
I get it if it's for a company-wide, tons-of-seats deal, but I quit doing that stuff for a reason.


The price thing is harder. $20 is insignificant to an end user, right? Is $30? $50? $10? Where do you draw the line? I have had a lot of times where ten bucks mattered when I was younger (and married, and had a kid lol). It's more significant to the seller, since that $10 or whatever over a quantity starts to add up.

I can tell you that when I started with a clean slate looking at AMD vs Intel a year or so ago, the "budget" I decided I should spend on a CPU was two hundred bucks or so, right at what the 2500K and the 8350 were at the time, as I recall. The motherboards were what killed me: not only did Intel have what seemed like 47 different sockets and CPUs, but the number of boards out across those different sockets and chipsets was depressing. To an OCD guy like myself, it was far, far easier to sort out and buy AMD. If I'd seen any evidence of dramatic performance differences, I might have gone a different route. But I didn't, and I still don't. I just built another 9590/990FX box for work, and my former 955BE setup was a serious upgrade for some old-ass Intel stuff the boss was running.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
How is it any less of an 8-core than, say, a 5960X?
It's just 4 modules, with 2 cores in them, sharing ALUs. Educate yourself and stop this garbage talk x). CMT>SMT

A 5960X has 8 real actual cores. Hyper threading aside, they do not share bits and pieces.
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
At the time of the P4-based Xeons, perf/watt didn't matter because a lot of corporations chose, under Intel's influence, to ignore this metric. Now that Intel has better perf/watt it's gospel, and you can be sure that if AMD ever gets a better perf/watt ratio in future products, this metric will be rendered irrelevant by the ton of usual viral marketers that hang around all the forums of importance.

You're forgetting that AMD was making significant gains with Opteron despite Intel's shenanigans up until Core 2. Their decline just further accelerated with Nehalem.

That said, numbers-wise, Haswell has lower perf/watt than Ivy Bridge. Not that Opteron could get the crown, but at low frequencies the difference is quite reduced. What AMD lacks in this domain is Steamroller cores and a slightly better process; I say slightly because the relevant parameter is supply voltage. This is the biggest and almost the only advantage that Intel holds in comparison: their better perf/W ratio is due uniquely to process and not uarch, contrary to the general opinion. I see no other reason for Intel's eagerness to get to the next nodes as soon as possible; that's not a matter of reducing cost, but of keeping the only sizeable advantage they have over their competitor.

The microarch is designed for the process, so microarch does contribute significantly to performance/W. Why else would Apple's A7 rival Ivy Bridge in perf/W while on an inferior node?
 

Maxima1

Diamond Member
Jan 15, 2013
3,549
761
146
I can make such ad hoc examples at will; what about if I have huge files to compress?
You can multitask, do it before bed, etc. It's not like a game where you have to sit there waiting for a loading screen to go away or for a program to start up.

But then power usage would be higher?

You really did choose the worst example, because in 7zip and WinRAR the FX8370E has better perf/watt than a 4670K, even at the platform level and even when using the most power-hungry AM3+ board dedicated to rabid overclockers...
The point of the 7zip example was that 8 seconds or whatever for each use wouldn't be much of an inconvenience to someone over the lifetime of the CPU. My post implied that I don't believe you would be using that program constantly enough for it to give you such annoyances. For virtually everyone, an SSD will go waaaaay farther.

There are plenty of applications that will use the extra cores, but with Intel coming out on top. Gaming is the big one. Additionally, if you're constantly using your computer, your idle or less-than-load power usage will undoubtedly be higher.
 

FX2000

Member
Jul 23, 2014
67
0
0
That's the funny part about it. I see people quibble over $30 or so between CPUs, and then they settle for the AMD, so the joke's on them. What's more, you can sell a 2500K for basically its release price even today, so the whole thing is even more hilarious.

Oh, and if you're arguing that there's really no difference in power usage, then that just bolsters the argument that you aren't using its multithreading performance constantly. Who cares if 7zip goes 8 seconds faster or whatever. An SSD alone solves most of everyday use annoyances.
Oh sorry.
I don't play games that don't use my FX6350. Sure, I could have bought a Haswell i5. Why would I? So I could play ARMA3 with 40 fps more? Nah bro. Intel fanboys are the worst.
https://www.komplett.dk/amd-fx-6300-black-edition/766049
https://www.komplett.dk/asus-m5a97-r20-socket-am3/759848

https://www.komplett.dk/intel-core-i5-4460/815204
https://www.komplett.dk/msi-b85m-e45-socket-1150/794115

Basically $100 more for the weakest i5, with a crappy motherboard to boot.
Yeah, enjoy your $5 electricity bill savings each year, and your locked multiplier :).
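
For what it's worth, that "savings" arithmetic is easy to sketch. The wattage delta, hours of use, and electricity price below are assumptions you'd plug your own numbers into, not measured figures:

```python
# Back-of-the-envelope sketch: annual cost of a CPU power-draw difference.
# All three inputs are assumptions -- substitute your own.
extra_watts = 20.0      # assumed average extra draw under your workload
hours_per_day = 4.0     # assumed hours/day the machine is actually loaded
price_per_kwh = 0.30    # assumed electricity price per kWh, in your currency

kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365.0
print(f"~{kwh_per_year:.0f} kWh/year, ~{kwh_per_year * price_per_kwh:.2f} per year")
# With these inputs: ~29 kWh/year, roughly 8.76 per year -- in the same ballpark
# as the few-dollar figure being joked about, depending on local prices and usage.
```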
 

FX2000

Member
Jul 23, 2014
67
0
0
A 5960X has 8 real actual cores. Hyper threading aside, they do not share bits and pieces.
Explain how modules are not real cores. They share resources, yes, but they are very much real cores, as in all other Intel & AMD products. Stop spewing lies.
 
Aug 11, 2008
10,451
642
126
What difference does it make whether they are real "cores" or not? It is like arguing how many angels can sit on the head of a pin. What counts is performance. The fact is, in gaming 4 intel cores in almost every instance is faster than 8 AMD cores. In many games, even 2 intel cores are faster than 8 AMD cores. As far as power usage goes, I am willing to use more power for better performance, but it seems like the worst of both worlds to use more power for lower performance.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
You're forgetting that AMD was making significant gains with Opteron despite Intel's shenanigans up until Core 2. Their decline just further accelerated with Nehalem.

That's right, but they didn't break 50% market share despite a perf/watt discrepancy that was quite significant. With a product that was, competitively speaking, the same as or even worse than AMD's current line, Intel managed to hold the biggest part of this market; we are far from the current situation, where AMD got trashed even when they had better perf/watt than the competition with the first Phenoms.

The microarch is designed for the process, so microarch does contribute significantly to performance/W. Why else would Apple's A7 rival Ivy Bridge in perf/W while on an inferior node?

If you shrink the A7 to a smaller node it will rival said IB even better, so the process has a lot to do with it unless the design is really flawed. So far AMD has a 32nm process that is good for reaching high frequencies and is better than Intel's at that level, but it is somewhat lacking at the other end of the frequency spectrum. I won't elaborate, but suffice it to say that they have at least a 15% supply voltage penalty, which will induce 32% more power losses. The losses due to parasitic capacitances are compensated by the SOI process, so it's likely that from this POV they have no significant disadvantage, but supply voltage is more concerning since it scales as a square law, while capacitance increases power linearly.
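
The 15% → 32% step follows from dynamic power scaling roughly with the square of supply voltage (P ≈ C·V²·f). A quick sketch of that arithmetic, with the 15% penalty taken from the paragraph above and everything else standard CMOS scaling:

```python
# Dynamic CMOS power scales roughly as C * V^2 * f, so at equal frequency and
# capacitance a supply-voltage penalty shows up squared in power.
voltage_penalty = 0.15                      # the ~15% figure claimed above
power_ratio = (1.0 + voltage_penalty) ** 2  # relative dynamic power at same C and f
print(f"{power_ratio:.2f}x dynamic power, i.e. ~{power_ratio - 1:.0%} more")
# 1.15^2 = 1.3225 -> about 32% more dynamic power, matching the claim.
```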

Now you can extrapolate what an 8C Kaveri would yield on a shrunk 22nm process; it would be better than a Haswell i7, perf/watt-wise.

Explain how modules are not real cores. They share resources, yes, but they are very much real cores, as in all other Intel & AMD products. Stop spewing lies.

Don't pay attention to unargued posts. I read here that the FX has two integer cores and one FP core within a module, which is complete nonsense; some people are confusing FP execution units with a full core, and they don't even realize that the FPU is managed by the two parent cores.
 

FX2000

Member
Jul 23, 2014
67
0
0
What difference does it make whether they are real "cores" or not? It is like arguing how many angels can sit on the head of a pin. What counts is performance. The fact is, in gaming 4 intel cores in almost every instance is faster than 8 AMD cores. In many games, even 2 intel cores are faster than 8 AMD cores. As far as power usage goes, I am willing to use more power for better performance, but it seems like the worst of both worlds to use more power for lower performance.
No, really.
Why is Intel faster?
FX was a disaster for AMD; it was intended for the workstation market, and it never delivered. People make FX out to be a huge, steaming pile of shit.
Truth be told, so far, no game I own (~140 games on Steam) has played at an unsatisfactory framerate. If you take an Intel CPU, good for you. But taking an Intel i3, as many people suggest, over an FX6300/FX8320 is just stupid.
I'll never understand people who try to shave $20 a year off their bills by buying another CPU, instead of just buying better lightbulbs etc.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
What difference does it make whether they are real "cores" or not? It is like arguing how many angels can sit on the head of a pin. What counts is performance. The fact is, in gaming 4 intel cores in almost every instance is faster than 8 AMD cores. In many games, even 2 intel cores are faster than 8 AMD cores. As far as power usage goes, I am willing to use more power for better performance, but it seems like the worst of both worlds to use more power for lower performance.

When I read posts like this I'm inclined to believe that we are on a console-dedicated forum. I mean, is gaming the only use of a PC, or is it just the only use that could eventually be used to support your constant bashing of AMD's CPUs?

I pointed out a ton of applications where the FXs are better than the i3 and i5, but like shintaidk in another thread (or was it this one?), you constantly use games, that is, one application, as the example; it looks like you know what the strengths of AMD CPUs are, hence this gaming religion..

Look here and see where your i5 2500K ended up relative to its opponent (the FX8150), which was branded as barely better when it was released:

http://forums.anandtech.com/showpost.php?p=37029936&postcount=79

Check the Computerbase.de link; they have a suite of benches, and of course games...


I remember being away from PC stuff for a while, years, then getting back on the forums and everyone is talking about watts and stuff. Took me a while to figure out I wasn't misreading it and people were actually concerned with power usage...
Still makes me chuckle.

I had a Core 2 Duo that was pumping 70W (with an IGP...) while doing nothing; I guess this is the first time I've posted about its power consumption. If AMD were to gain the perf/watt crown, you can be sure the metric would shift to Intel's CPU boxes being more attractive..
 

FX2000

Member
Jul 23, 2014
67
0
0
I'm rather glad I have an FX6300, rather than some puny i3, when I have to compress 90GB+ of anything, or decompress ISOs etc.
Heavily compressing stuff takes some time, is what I'm saying.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
People make FX out to be a huge, steaming pile of shit.
Truth be told, so far, no game I own (~140 games on Steam) has played at an unsatisfactory framerate.

ditto
I often throw out "I'm looking for an excuse to upgrade" and nobody has bit yet.


Folks frequently say Intel is faster, and I don't dispute that in a lot or even most cases, but I also never see a list saying OK, you can play games/apps 1, 2, 3, and 4 at 1080p max settings or whatever, but not games 5, 6, 7, and 8. I understand the GPU is a lot of it, and it varies with the way a game engine implements things or the way an application is written, but I'm still not seeing anything but the same benchmarks quoted over and over. And they all still tell me a mid or high-end FX ain't a bad deal for the money and its age. "Faster" seems to be mighty relative these days.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
I'm rather glad I have an FX6300, rather than some puny i3, when I have to compress 90GB+ of anything, or decompress ISOs etc.
Heavily compressing stuff takes some time, is what I'm saying.

I thought every modern CPU could compress huge files in the background without interrupting normal usage at this point? I know mine can. I can throw "half" the CPU at WinRAR or whatever and never notice it's compressing for an hour in the background. I've gamed while it was compressing lol.
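
If anyone wants to reproduce the "throw half the CPU at it" trick from a script, here is a small sketch; it assumes the 7-Zip command-line tool (7z) is installed and on the PATH, and uses its -mmt switch to cap the worker threads rather than touching CPU affinity. The archive and folder names are placeholders.

```python
# Hedged sketch: run a big compression job capped to a few threads so the rest
# of the CPU stays free for foreground use. Assumes the 7-Zip CLI ("7z") exists;
# the paths below are placeholders.
import subprocess

threads = 3  # e.g. half of a 6-core FX
subprocess.run(
    ["7z", "a", f"-mmt{threads}", "backup.7z", "big_folder/"],
    check=True,
)
```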
 

FX2000

Member
Jul 23, 2014
67
0
0
I thought every modern CPU could compress huge files in the background without interrupting normal usage at this point? I know mine can. I can throw "half" the CPU at WinRAR or whatever and never notice it's compressing for an hour in the background. I've gamed while it was compressing lol.
Yeah of course, I could throw 1 core at a 90GB compress, but it'd need to run for hours ;).
 

Maxima1

Diamond Member
Jan 15, 2013
3,549
761
146
People make FX out to be a huge, steaming pile of shit.
Truth be told, so far, no game I own (~140 games on Steam) has played at an unsatisfactory framerate.
I agree that it's not just a POS, but I don't think anyone is getting a great deal compared to what Intel offers. As I've said, though, on the desktop it's not that detrimental these days to have lower single-threaded performance. Newer games are using more cores, and older games can easily be played.

When I read posts like this I'm inclined to believe that we are on a console-dedicated forum. I mean, is gaming the only use of a PC, or is it just the only use that could eventually be used to support your constant bashing of AMD's CPUs?
Because many people who aren't into gaming won't give a crap about their computer specs as much. If you do and are not into PC games, you're more like those wildersecurity forum folks.
 