Drafted Box Art for 4- and 8-Core Bulldozer CPUs


Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I can't be the only one who thinks the new 'FX' is a complete joke.

Edit: Sorry, not in performance per se, but in marketing. FX used to be like EE for Intel. FX priced against a mainstream $200-300 CPU? Seems odd to me.

If, hypothetically speaking, Bulldozer were coming up short against Sandy Bridge (in internal testing at this time) in similar ways to how Phenom came up short against Conroe, then what would you do at this stage of the pre-release game?
 

Soleron

Senior member
May 10, 2009
337
0
71
I have a hard time putting any stock in these leaks of late for the simple fact that none of them entail clockspeeds of the cores.

People will leak clockspeeds before leaking "FX-8130P"...unless this is AMD intentionally leaking details to control the flow of info.

I mean come on, if you were someone who had details about bulldozer would you (a) leak the clockspeeds, or (b) leak that the SKU name was "FX-8130P"?

I just don't buy it. Any geek willing to risk losing his job over leaking a SKU name would just cut to the chase and let out the juice the nerd world actually wants to know: clockspeeds.

AMD has a history of settling on model names before determining clockspeeds. Remember how, when the model numbers for the original Phenoms (up to the 9600) were leaked months in advance, the 9600 was listed at 2.5GHz with the rest below it? It actually launched at 2.3GHz under the same SKU names.

They just haven't determined BD's final clockspeed potential yet, but they know what product SKUs they want and how those SKUs are positioned relative to one another (rough internal price brackets).
 

Mopetar

Diamond Member
Jan 31, 2011
8,497
7,753
136
I can't be the only one who thinks the new 'FX' is a complete joke.

Edit: Sorry, not in performance per se, but in marketing. FX used to be like EE for Intel. FX priced against a mainstream $200-300 CPU? Seems odd to me.

Stupid consumer: "You mean I can pay $300 for something that used to cost like $1000? Hells yeah! I'll take 3."

Makes a certain amount of sense from a marketing perspective. I don't think most people around here care what it's called, so long as it performs well.
 

Mopetar

Diamond Member
Jan 31, 2011
8,497
7,753
136
If, hypothetically speaking, Bulldozer were coming up short against Sandy Bridge (in internal testing at this time) in similar ways to how Phenom came up short against Conroe, then what would you do at this stage of the pre-release game?

Polish up ye ole resume.
 

zebrax2

Senior member
Nov 18, 2007
977
70
91
Stupid consumer: "You mean I can pay $300 for something that used to cost like $1000? Hells yeah! I'll take 3."

Makes a certain amount of sense from a marketing perspective. I don't think most people around here care what it's called, so long as it performs well.

Most people probably don't even know that AMD sold FX processors. Plus, "FX" doesn't really scream "performance" the way "Extreme Edition" does.
 

scbjmshpv

Senior member
Mar 16, 2011
223
0
76
It needs a little more black and a little more light red (a white-and-red combo) for my taste, but nonetheless, all I can say is HS. Last but not least, some art on CPUs at last; RAM, motherboards, cases, PSUs, and every other main computer component already gets art.
 

Morg.

Senior member
Mar 18, 2011
242
0
0
From what I know, AMD can do HT too, or is hyperthreading only for Intel? ...If Bulldozer does do HT, then that's 16 logical cores for your video- or audio-editing software...

HT IS BAD.

I know Intel markets it a lot, but the reality remains the same:

2 half-power cores > 2 HT cores
2 HT cores > 1 full-power core (if you don't have enough threads)

The "enough threads" barrier has nothing to do with editing software.

You can smash your workload into 4 big threads or 8 small threads; it makes no goddamn difference, except that more threads mean a bit more overhead.

Besides, HT cuts your core in half, so you end up with virtually 2 half-power cores, which are NOT as efficient as 2 real half-power cores.

HT IS POINTLESS
It was introduced because the P4 was unable to multitask correctly, and Intel needed a bit more than just "mad monothread performance".

So to summarize:
If your workload can be threaded at will: real cores > HT
If your workload has n threads and you have n cores: real cores > HT
If your workload has 2n threads and you have n cores: HT > real cores

That limits the usefulness of HT, and it is why AMD never considered such a feature: they did not have multithreading issues with their Athlon CPUs.

So please stop thinking HT is great and a feature Intel's competitors should have, because it's just not.

And now, since this is more of a gamers' forum than one for IT professionals, I would like to say this:
HT sucks for games, always has, always will, and if I could ask Intel for just one thing, it would be to remove the HT circuitry and replace it with something useful like more cache, an on-die network card, or even just blank silicon (you get better cooling if there's nothing there :) )
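
If you want to sanity-check the n-threads-vs-2n-threads claim on your own machine rather than take either side's word for it, here is a minimal pthreads sketch (an illustration added for this writeup, not code from anyone in the thread): each thread runs a fixed chunk of ALU-bound work, so you can compare throughput with the thread count set to the physical-core count versus the logical (HT) core count.

```c
/* smt_scale.c - minimal SMT scaling check (illustrative sketch).
 * Each thread does a fixed chunk of dependent integer work, so total
 * throughput should rise with thread count until the real cores, and
 * then the logical (HT) cores, are saturated.
 * Build: gcc -O2 -pthread smt_scale.c -o smt_scale
 * Run:   ./smt_scale 4   (then try 8, and compare Mops/s)
 */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define WORK_PER_THREAD 200000000UL

static void *spin(void *arg)
{
    (void)arg;
    volatile unsigned long acc = 0;     /* volatile: keep the loop honest */
    for (unsigned long i = 0; i < WORK_PER_THREAD; i++)
        acc += i ^ (acc >> 3);          /* dependent ALU work, no memory traffic */
    return NULL;
}

int main(int argc, char **argv)
{
    int n = (argc > 1) ? atoi(argv[1]) : 4;
    pthread_t *t = malloc(sizeof(*t) * n);
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < n; i++)
        pthread_create(&t[i], NULL, spin, NULL);
    for (int i = 0; i < n; i++)
        pthread_join(t[i], NULL);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double s = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    /* Mops/s = total iterations / seconds; if 2n threads on n HT cores
       beats n threads, hyperthreading is buying real throughput. */
    printf("%d threads: %.2fs, %.0f Mops/s\n",
           n, s, n * (double)WORK_PER_THREAD / s / 1e6);
    free(t);
    return 0;
}
```

On SMT-enabled chips this kind of loop typically shows a real, if sub-linear, gain past the physical-core count, which is roughly what the two sides above are arguing over.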
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
HT IS BAD.

I know Intel markets it a lot, but the reality remains the same:

2 half-power cores > 2 HT cores
2 HT cores > 1 full-power core (if you don't have enough threads)

The "enough threads" barrier has nothing to do with editing software.

You can smash your workload into 4 big threads or 8 small threads; it makes no goddamn difference, except that more threads mean a bit more overhead.

Besides, HT cuts your core in half, so you end up with virtually 2 half-power cores, which are NOT as efficient as 2 real half-power cores.

HT IS POINTLESS
It was introduced because the P4 was unable to multitask correctly, and Intel needed a bit more than just "mad monothread performance".

So to summarize:
If your workload can be threaded at will: real cores > HT
If your workload has n threads and you have n cores: real cores > HT
If your workload has 2n threads and you have n cores: HT > real cores

That limits the usefulness of HT, and it is why AMD never considered such a feature: they did not have multithreading issues with their Athlon CPUs.

So please stop thinking HT is great and a feature Intel's competitors should have, because it's just not.

And now, since this is more of a gamers' forum than one for IT professionals, I would like to say this:
HT sucks for games, always has, always will, and if I could ask Intel for just one thing, it would be to remove the HT circuitry and replace it with something useful like more cache, an on-die network card, or even just blank silicon (you get better cooling if there's nothing there :) )

Unless you are giving away processors for free and they consume zero power, your assertions and comparisons regarding the utility of hyperthreading are woefully missing the relevance of price/performance and performance/watt.

I've used hyperthreading, and it gave me very real performance boosts. It works.

I've also used AMD CPUs, and VIA/Cyrix for that matter (even TI x86 chips lol)...arguments about the superiority of one microarchitecture over another are irrelevant at the consumer level if they are made in the absence of price, performance, and power usage.
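
To make that concrete, a back-of-envelope comparison with purely hypothetical numbers (no real SKUs implied): a chip that loses outright on raw performance can still win on the metrics that matter at the consumer level.

```latex
% Hypothetical chips A and B (illustrative numbers only):
% A: 100 perf units, $300, 95 W      B: 80 perf units, $200, 65 W
\frac{\text{perf}_A}{\$_A} = \frac{100}{300} \approx 0.33
\qquad
\frac{\text{perf}_B}{\$_B} = \frac{80}{200} = 0.40
\qquad
\frac{\text{perf}_A}{W_A} = \frac{100}{95} \approx 1.05
\qquad
\frac{\text{perf}_B}{W_B} = \frac{80}{65} \approx 1.23
```

Here B is 20% slower outright, yet wins both price/performance and performance/watt.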
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
I've also used AMD CPUs, and VIA/Cyrix for that matter (even TI x86 chips lol)...arguments about the superiority of one microarchitecture over another are irrelevant at the consumer level if they are made in the absence of price, performance, and power usage.

:eek: I didn't realize so many firms made x86 processors at one point.

http://www.x86-guide.com/en/histoire/la-genese-de-l-informatique-pre-x86.html


To think that there are now only 3.

SMT is a pretty useful technology. Really, a Bulldozer module looks less like 2 cores and more like one core with "super-hyperthreading", and, as some people point out, this is how the original engineers envisioned it.

I think Bulldozer is going to live up to its name: it is going to lose in latency benchmarks (Bulldozers ARE slow) but it is going to 'bulldoze' the competition in throughput.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I think it's closer to 3+ years since the public first heard of it. Remember it was first scheduled to be released in 2009.

I remember Gary Key mentioning that BD needed to be "awesome and be here in a hurry" in his original Phenom launch article, and that was before the TLB bug came to light. I think that was in summer 2007.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
We keep talking about comparing an 8-core Bulldozer to a 4-core SB chip, but what are we calling a core?

In AMD's world this could mean a 4-module (8-core) chip vs. a 4-core SB chip, which from what I've read is probably a fair comparison; their module sounds like extreme HT, as posted above, and SB has HT, so fair is fair.

If it is indeed an 8-module BD chip, or 16 cores as they would put it, then yes, that's a bit of an extreme comparison.
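
One way to see what the operating system itself calls a core: on Linux, logical CPUs that list each other as thread siblings share one physical core. A minimal sketch (my illustration; it assumes Linux's sysfs topology layout, and note that the kernel's answer for a Bulldozer module changed across kernel versions):

```c
/* core_or_thread.c - print which logical CPUs share a core (Linux).
 * Reads sysfs topology files; siblings sharing a core are SMT threads
 * on Intel, and (depending on kernel version) module-mates on Bulldozer.
 * Build: gcc -O2 core_or_thread.c -o core_or_thread
 */
#include <stdio.h>

int main(void)
{
    char path[128], buf[64];

    for (int cpu = 0; ; cpu++) {
        snprintf(path, sizeof(path),
                 "/sys/devices/system/cpu/cpu%d/topology/thread_siblings_list",
                 cpu);
        FILE *f = fopen(path, "r");
        if (!f)
            break;                      /* ran out of logical CPUs */
        if (fgets(buf, sizeof(buf), f)) /* e.g. "0,4" on a 4C/8T chip */
            printf("cpu%-3d siblings: %s", cpu, buf);
        fclose(f);
    }
    return 0;
}
```

A 4C/8T Sandy Bridge prints sibling pairs; a chip with SMT disabled prints each logical CPU alone.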
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
We keep talking about comparing an 8-core Bulldozer to a 4-core SB chip, but what are we calling a core?

In AMD's world this could mean a 4-module (8-core) chip vs. a 4-core SB chip, which from what I've read is probably a fair comparison; their module sounds like extreme HT, as posted above, and SB has HT, so fair is fair.

If it is indeed an 8-module BD chip, or 16 cores as they would put it, then yes, that's a bit of an extreme comparison.

All that matters is price and power usage. :hmm: :D
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Box art? Oh man, this is exciting!

Maybe in 2 months we will get to see the REAL goods... the case stickers!
 
Last edited:

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126

(sic)Klown12

Senior member
Nov 27, 2010
572
0
76
Intel has never been fabless, but IBM wanted to make sure that no single foundry problem could slow supplies. Intel therefore had to arrange several manufacturing sources, one of which was AMD, which later won a lawsuit allowing it to make its own x86 processors.

What would the market look like if that court had ruled for Intel and not AMD?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
NetBurst at 10GHz? :twisted:

It would only work up to a point. Intel said that for Prescott to reach 5GHz, it would have needed 200W. And their laptop chips would have been a) really slow or b) a totally different architecture.

10GHz might work on the 32nm process, but that still doesn't solve the problem in laptops. I think they would have abandoned NetBurst eventually, just a bit later.
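
For rough intuition on that 200W figure: dynamic power scales linearly with frequency and with the square of voltage, so chasing clocks gets expensive fast. A back-of-envelope check (the 115W/3.8GHz starting point is the top retail Prescott's rated TDP; the 10% voltage bump is an assumption):

```latex
% Dynamic power: P \propto C V^2 f, so
P_2 \approx P_1 \cdot \frac{f_2}{f_1} \cdot \left(\frac{V_2}{V_1}\right)^2
% With P_1 = 115\,\text{W} at f_1 = 3.8\,\text{GHz}, pushed to
% f_2 = 5\,\text{GHz} with V_2/V_1 = 1.1:
P_2 \approx 115 \cdot \frac{5.0}{3.8} \cdot 1.1^2 \approx 183\,\text{W}
```

That lands in the same ballpark as the 200W figure Intel reportedly cited, before even counting leakage.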
 

(sic)Klown12

Senior member
Nov 27, 2010
572
0
76
It would only work up to a point. Intel said that for Prescott to reach 5GHz, it would have needed 200W. And their laptop chips would have been a) really slow or b) a totally different architecture.

10GHz might work on the 32nm process, but that still doesn't solve the problem in laptops. I think they would have abandoned NetBurst eventually, just a bit later.

I always thought that if AMD weren't there, Intel would have left x86 behind and switched to IA-64. Isn't the only reason they didn't that AMD came out with x86-64, which became preferred due to its better compatibility with traditional x86 software?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I don't think so. x86 survived the mid-1990s RISC onslaught largely due to legacy code compatibility. The Pentium Pro, which led to the Pentium II, came about mainly because Intel wanted to kill the RISC advance toward the desktop. Not AMD, though some of the chips after that, and specific SKUs, are definitely a reaction to AMD.

Market forces are, for the most part, logical.

Itanium would have brought no advantage over the existing x86 microprocessors (even hypothetically assuming Itanium never had serious execution/delivery issues):

-Performance? Yes, after recompiling, and bringing the chip down from $2000-4000 to the prices needed in desktops would have lowered that advantage further.
-Power consumption? Nope.
-Price? Nope.

Maybe some people (even at Intel, for short periods of time) did really think IA-64 would migrate down to desktops. Whatever hope there was got completely killed when x86-64 arrived. Perhaps in 50 years? :p
 
Dec 30, 2004
12,553
2
76
I always thought that if AMD weren't there, Intel would have left x86 behind and switched to IA-64. Isn't the only reason they didn't that AMD came out with x86-64, which became preferred due to its better compatibility with traditional x86 software?

that was my understanding as well.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Having been "in the thick of things" at the time of Merced's pre-launch planning, etc., I would say there is no way you can convince me that Intel invested as much as it did in Itanium with a plan to capture no more TAM than the big-iron markets presented.

I was distinctly given the understanding that the plan was for EPIC to be Intel's "way out" of the limitations that come with x86, both microarchitectural and economic (they had to cross-license x86 as a matter of law, but not EPIC).

AMD saw this handwriting on the wall and did everything it could to erode the possibility of Intel (and thus the market) abandoning x86...AMD's very existence depended on preventing Itanium from becoming successful in the desktop arena.

To that end, we saw AMD accelerate the timeline for inserting 64-bit into x86 processors.

Once it became obvious that Itanium was not going to penetrate the desktop/consumer space, its focus changed, naturally, as did the focus of the desktop x86 lines. Itanium was redirected toward securing only the very top end, while the P4 was given 64-bit support and a handful of minimal RAS features to compete in the 4-socket-and-below workstation and mini-server space.

What we can say is that the Itanium on the market today is most definitely not the Itanium we would have had in our desktops had the original plans that motivated the funding of Merced come to fruition. The Itanium of today is intentionally tuned/engineered to serve the big-iron market.

So we can't really look at it and say "I am so glad AMD saved us from that!", because "that" is not what we'd have gotten had AMD not initiated its drastic maneuvers to secure its future in selling x86-based CPUs by ensuring the market itself would continue to exist.