Fudzilla: Bulldozer performance figures are in


pantsaregood

Senior member
Feb 13, 2011
993
37
91
Can we stop talking about how Ivy Bridge is going to steamroll Bulldozer?

If Bulldozer is competitive with Sandy Bridge, then it will also be competitive with Ivy Bridge. Ivy Bridge is, for all intents and purposes, a die shrink. Yes, it will allow clock speeds to be increased. Yes, it will cut power consumption.

Raw performance, however, isn't going to change significantly.

That said, Sandy Bridge is a beast to attempt to compete with as is. These benchmarks, if real, may be slightly unreliable at this point. The current support of Bulldozer by the 8xx/9xx chipsets is unknown. They're obviously compatible, but current drivers may not be optimal. Similarly, these benches are from an Engineering Sample that may underperform the final product clock-for-clock, and the final product is intended to run at a 600MHz higher clock speed.

Bulldozer might be competitive with Sandy Bridge in all areas of the market if everything goes perfectly. I don't expect that, though.

I don't know that the FX-8130P is going to remain the top SKU when SB-E is released, either. I'd expect AMD to refresh it by then.
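For a rough sense of what that extra 600MHz could mean for the leaked numbers, here is a minimal back-of-the-envelope sketch; the engineering-sample clock and benchmark score below are purely hypothetical placeholders, and linear scaling with clock is an optimistic assumption:

Code:
# Naive clock-scaling estimate: assumes performance scales linearly with
# frequency, which real workloads rarely do (memory, cache, turbo all interfere).
es_clock_ghz = 3.2                       # hypothetical engineering-sample clock
retail_clock_ghz = es_clock_ghz + 0.6    # "600MHz higher" per the post above
es_score = 100.0                         # hypothetical benchmark score for the ES chip

scaling = retail_clock_ghz / es_clock_ghz
estimated_retail_score = es_score * scaling
print(f"Scaling factor: {scaling:.2f}x -> estimated retail score: {estimated_retail_score:.1f}")
# ~1.19x with these placeholder clocks, i.e. leaked ES numbers could
# understate the shipping part by roughly that much, at best.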
 

jmarti445

Senior member
Dec 16, 2003
299
0
71
That won't happen. Intel has a lot of wiggle room for clock speeds; if it hadn't been for the chipset SATA II mishap, I suspect we would have seen a higher-binned K model with the release of the Z68 motherboards. I find it really amusing myself: before BD's pricing came out, people were comparing BD against the extreme high end. Now we're comparing it to the lower high end as if nothing has changed. How did that happen? People were expecting BD to put a whipping on Sandy Bridge, but instead we get revised expectations and the idea that nothing has changed, which is pure midrange hype now. No true high end.

Why do I get the vibe I used to work with Nemesis 1 when I was a PC technician years ago (did you used to live in Baltimore)?
The only reason I honestly stick with AMD over Intel for desktops is this: the rigs I built using AMD chips three years ago can use the six-core CPUs out today. I built my rig in September 2008 around a Phenom 9850, and it has taken every CPU upgrade I've thrown at it, from a Phenom II X4 940 to a Phenom II X6 1090T (overclocked to 3.8GHz for the last six months with zero downtime), without breaking a sweat. Had I waited a few months, I'd likely have gone with the Core i7 920 instead. I got sick of Intel desktops back in the early-to-mid 2000s, when they obsoleted their sockets about every 6-9 months. The i7 900 series socket is a champ and will take you all the way to the Gulftown CPUs, but that is the exception to the rule.
I know Intel kept Socket 775 for years, but newer chips often required a new chipset, and therefore a new motherboard, to accept the processor. So had you bought a Prescott back in 2005, you couldn't necessarily swap in a Conroe CPU unless you had bought the chipset that came out alongside Conroe.
For me, that's the sole reason I've stuck with AMD since Socket A: they have had a grand total of exactly one CPU socket with no upgrade path (from single core to dual core and so on), and that was Socket 754 (I wasn't impressed with it and quickly went to Socket 939 when the dual cores came out). It's a pain in the ass to do a motherboard swap whenever a new CPU comes out. Here's hoping that Bulldozer is as good as AMD's previous CPUs when it comes to future upgradability.
All that being said, I once went to an AMD Turion laptop and it totally sucked: battery life was terrible and performance was lackluster. Intel has the performance advantage there. I'm not sure how the Fusion processors will pan out, but a Fusion processor isn't a top-of-the-line product; it's more of a balanced product. My laptop is a Gateway FX system, and I don't think Fusion will get me to switch from Intel to AMD (I don't really use my laptop often enough to warrant a switch anyway).
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
Why do I get the vibe I used to work with Nemesis 1 when I was a PC technician years ago (did you used to live in Baltimore)?
The only reason I honestly stick with AMD over Intel for desktops is this: the rigs I built using AMD chips three years ago can use the six-core CPUs out today. I built my rig in September 2008 around a Phenom 9850, and it has taken every CPU upgrade I've thrown at it, from a Phenom II X4 940 to a Phenom II X6 1090T (overclocked to 3.8GHz for the last six months with zero downtime), without breaking a sweat. Had I waited a few months, I'd likely have gone with the Core i7 920 instead. I got sick of Intel desktops back in the early-to-mid 2000s, when they obsoleted their sockets about every 6-9 months. The i7 900 series socket is a champ and will take you all the way to the Gulftown CPUs, but that is the exception to the rule.
I know Intel kept Socket 775 for years, but newer chips often required a new chipset, and therefore a new motherboard, to accept the processor. So had you bought a Prescott back in 2005, you couldn't necessarily swap in a Conroe CPU unless you had bought the chipset that came out alongside Conroe.
For me, that's the sole reason I've stuck with AMD since Socket A: they have had a grand total of exactly one CPU socket with no upgrade path (from single core to dual core and so on), and that was Socket 754 (I wasn't impressed with it and quickly went to Socket 939 when the dual cores came out). It's a pain in the ass to do a motherboard swap whenever a new CPU comes out. Here's hoping that Bulldozer is as good as AMD's previous CPUs when it comes to future upgradability.
All that being said, I once went to an AMD Turion laptop and it totally sucked: battery life was terrible and performance was lackluster. Intel has the performance advantage there. I'm not sure how the Fusion processors will pan out, but a Fusion processor isn't a top-of-the-line product; it's more of a balanced product. My laptop is a Gateway FX system, and I don't think Fusion will get me to switch from Intel to AMD (I don't really use my laptop often enough to warrant a switch anyway).

Both vendors have had good and bad runs with sockets. AM2/AM2+/AM3/AM3+ was good as long as your mobo maker released BIOS updates. 775 was great for Intel and long-lived as well, as long as the chipset was right. 1366 also had a long and full lifespan. Both have had good and bad platform runs over the years.

Unfortunately, I think AMD is going to screw people with the Bulldozer sockets, because they have released slides showing next-gen BD (Komodo) moving to the FMx socket. That makes creating AM3+ for BD pointless, especially after they initially said it would run on AM3. So now you have to buy a new AM3+ board that will last exactly one CPU cycle, then swap to an FMx motherboard for any future upgrades.

http://www.xtremeshack.com/immagine/i97251_dektop-roadmap-2012-2.jpg
 

tulx

Senior member
Jul 12, 2011
257
2
71
Gah. These BD and IB rumours are killing me. I wish AMD had released BD on time so we'd just see whether it's crap or great. Now I'm torn between delaying my new system until whenever the hell BD comes out or just buying a 2500K-based PC now.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Both vendors have had good and bad runs with sockets. AM2/AM2+/AM3/AM3+ was good as long as your mobo maker released BIOS updates. 775 was great for Intel and long-lived as well, as long as the chipset was right. 1366 also had a long and full lifespan. Both have had good and bad platform runs over the years.

Unfortunately, I think AMD is going to screw people with the Bulldozer sockets, because they have released slides showing next-gen BD (Komodo) moving to the FMx socket. That makes creating AM3+ for BD pointless, especially after they initially said it would run on AM3. So now you have to buy a new AM3+ board that will last exactly one CPU cycle, then swap to an FMx motherboard for any future upgrades.

http://www.xtremeshack.com/immagine/i97251_dektop-roadmap-2012-2.jpg

Meh, AMD sockets are pretty restrictive once you factor in BIOS support and TDP limits. Besides, if you had a PhII 940 on an AM2+ board, would you even replace it with a new AM3 PhII X4 that is only clocked slightly higher? Why not? It's compatible with your old board!

I have used Socket 7, 370, 462 (XP 1700+), 939 (3500+), 775 (E6300), AM2+ (555 BE) and 1155 (2500K), and there was not a single time when a compatible drop-in CPU upgrade over the existing chip would have made any sense for the money.
 

jmarti445

Senior member
Dec 16, 2003
299
0
71
Meh, AMD sockets are pretty restrictive once you factor in BIOS support and TDP limits. Besides, if you had a PhII 940 on an AM2+ board, would you even replace it with a new AM3 PhII X4 that is only clocked slightly higher? Why not? It's compatible with your old board!

I have used Socket 7, 370, 462 (XP 1700+), 939 (3500+), 775 (E6300), AM2+ (555 BE) and 1155 (2500K), and there was not a single time when a compatible drop-in CPU upgrade over the existing chip would have made any sense for the money.

Yes, but a Phenom II X4 940 to a Phenom II X6 1100T is worth it, especially if you are into video encoding. I built my system in the original Phenom X4 era, and the Phenom II X6 is a lot faster than that CPU. As for the 939 systems, the X2s were a big boost over the original A64s. Once Hyper-Threading came to desktop processors (the Xeons had it for a while before that), multi-core systems became a lot more relevant than they had been back when they were a niche (dual-processor setups like the dual Socket 370 boards and the very rare dual Socket A systems).
 

Arg Clin

Senior member
Oct 24, 2010
416
0
76
Unfortunately, I think AMD is going to screw people with the Bulldozer sockets, because they have released slides showing next-gen BD (Komodo) moving to the FMx socket. That makes creating AM3+ for BD pointless, especially after they initially said it would run on AM3. So now you have to buy a new AM3+ board that will last exactly one CPU cycle, then swap to an FMx motherboard for any future upgrades.
CPU upgrade path shouldn't be overestimated.

It makes a lot of sense to release AM3+, so that people can get a board now with a cheap Athlon/Phenom CPU and be set for first-gen BD. It's a short-term socket bridge; I don't think anything else was ever implied. By the time first-gen BD is obsolete, a 990-based board would be too, I suspect.

As for BD support on AM3: like many others, I'm sure, I would have liked it. However, if it's not going to work properly, AMD is doing the right thing by not going for it. Just imagine the Intel fanboy crowd pointing out how low-quality AMD is, should they release something that is not 100% rock solid. Better to go the Intel route and just cut it off, since relatively few people really care about a CPU upgrade path anyway. (IMHO)
 

Tsavo

Platinum Member
Sep 29, 2009
2,645
37
91
Can we stop talking about how Ivy Bridge is going to steamroll Bulldozer?

If Bulldozer is competitive with Sandy Bridge, then it will also be competitive with Ivy Bridge. Ivy Bridge is, for all intents and purposes, a die shrink. Yes, it will allow clock speeds to be increased. Yes, it will cut power consumption.

Raw performance, however, isn't going to change significantly.

That said, Sandy Bridge is a beast to attempt to compete with as is. These benchmarks, if real, may be slightly unreliable at this point. The current support of Bulldozer by the 8xx/9xx chipsets is unknown. They're obviously compatible, but current drivers may not be optimal. Similarly, these benches are from an Engineering Sample that may underperform the final product clock-for-clock, and the final product is intended to run at a 600MHz higher clock speed.

Bulldozer might be competitive with Sandy Bridge in all areas of the market if everything goes perfectly. I don't expect that, though.

I don't know that the FX-8130P is going to remain the top SKU when SB-E is released, either. I'd expect AMD to refresh it by then.

In the end, it doesn't matter who steamrolls whom in benchmarks. It will never matter whether AMD holds the imaginary crown of IT darling, home OC crowd darling, or home HTPC darling. AMD has no fab capacity today to meet Intel on anything resembling an even playing field. AMD will NEVER have the fab capacity required to stand toe to toe with Intel. AMD does cool stuff with their budget... er, they did cool stuff, like five years ago.

No fab capacity - no dice.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
There is a huge difference between the old AM2 processors and the newer AM3 processors. I went from a 65nm X2 4200+ to a 45nm X2 240, and my system power went from 190W to 130W (max, measured with a Kill-A-Watt). There is a huge difference in the amount of heat being pumped out of the back. This never gets mentioned by the Intel-loving tech sites, because you can't take an Intel system and upgrade it like that; you always have to replace the mobo, even on Socket 775. The last time I tried to upgrade a Socket 775 system, I was enraged for hours trying to get the stupid BIOS to update so I could install a Conroe, and that was on a premium Asus piece of crap.
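To put that wall-power drop in perspective, here is a quick sketch of the energy involved; only the 190W and 130W readings come from the post above, while the duty cycle and electricity rate are hypothetical assumptions:

Code:
# Rough annual savings from dropping max system draw from 190W to 130W.
# Hours/day and $/kWh below are hypothetical assumptions for illustration.
old_watts, new_watts = 190, 130
hours_per_day = 8          # hypothetical time spent at/near full load
rate_per_kwh = 0.12        # hypothetical electricity price in $/kWh

saved_kwh_per_year = (old_watts - new_watts) / 1000 * hours_per_day * 365
print(f"{saved_kwh_per_year:.0f} kWh/year, about ${saved_kwh_per_year * rate_per_kwh:.0f}/year")
# 60W * 8h * 365 days is roughly 175 kWh, or about $21/year at the assumed rate,
# on top of the reduced heat and noise the post describes.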
 

bridito

Senior member
Jun 2, 2011
350
0
0
Can we please stop all this evaluation of products based on the opinions of financial groups/people who are unable to see a train on a collision course until it is crashing into their noses?

I've been saying right through this thread that as of today we know nothing certain of BD's performance. I was just surprised to see that the AMD APU early sales hadn't been more pronounced.

Hey, be careful there, your crystal ball might mislead you just like others misled millions prior to the Phenom launch. We cannot say what we don't know, can we?

For the record: BD could have Cray performance or Pentium II performance. To date, if anyone really knows, they aren't talking.

Sure, now create a poll and find out who's going to pay $500 to $999 for a 6C SB on a $350 SB-E platform. Let me guess, 1.2%?

For the poll on who will buy a $400 4C SB-E on a Silar or Thorsby mobo, please count me in, on the first day of release. Unless I am already loving my speedy BD system! :)

Very expensive. D:

Probably going to have some insane performance, though.

Intel will "likely" brand their top 6C SB-E as an insane Extreme and charge their usual $999. But from what I've seen of the "sheer speculation" on pricing, the 100MHz-slower 6C is supposed to come in around $600 and the much-faster-per-core 4C at $400. The 4C may turn out to be the overall favorite, and I know it will be my favorite!

Don't know why they said that; at least as of last week I could order a couple of AMD models, on both the consumer and business ends. It's also hard to talk about growth of a product that is only just now hitting the retail world. Just because I can't order an AMD-based PC online through a specific manufacturer doesn't mean they aren't developing a platform. Especially Dell: their interests in the business market don't allow them to skate out an infant system they have had little experience with. They are practically the last to launch any given comparative model in any market.

Were they referring to the APU models through Dell or just any AMD? Or maybe they were smokin' crack! :)
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
There is a huge difference between the old AM2 processors and the newer AM3 processors. I went from a 65nm X2 4200+ to a 45nm X2 240, and my system power went from 190W to 130W (max, measured with a Kill-A-Watt). There is a huge difference in the amount of heat being pumped out of the back. This never gets mentioned by the Intel-loving tech sites, because you can't take an Intel system and upgrade it like that; you always have to replace the mobo, even on Socket 775. The last time I tried to upgrade a Socket 775 system, I was enraged for hours trying to get the stupid BIOS to update so I could install a Conroe, and that was on a premium Asus piece of crap.

You know it's not Intel's fault that you're too incompetent to install a BIOS update, right?
 

bridito

Senior member
Jun 2, 2011
350
0
0
I am posting this only to present recent information for the edification of the forum at large. Kindly do not interpret this as any form of favoritism. Read: Don't shoot the messenger! :)

http://seekingalpha.com/article/279...ace-amd-s-future-looks-grim-compared-to-intel

Even with AMD spending twice as much as Intel in all years except 2010, AMD has managed to be behind the curve on almost every microprocessor innovation. With increased competition from Intel as well as ARM Holdings (ARMH) with their fusion processor, AMD’s future is looking grimmer with less profitability by the moment.

My recommendation: Long INTC, short AMD.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
I am posting this only to present recent information for the edification of the forum at large. Kindly do not interpret this as any form of favoritism. Read: Don't shoot the messenger! :)

http://seekingalpha.com/article/279...ace-amd-s-future-looks-grim-compared-to-intel

Even with AMD spending twice as much as Intel in all years except 2010, AMD has managed to be behind the curve on almost every microprocessor innovation. With increased competition from Intel as well as ARM Holdings (ARMH) with their fusion processor, AMD’s future is looking grimmer with less profitability by the moment.

My recommendation: Long INTC, short AMD.

Um, he measures R&D as a percentage of income. AMD isn't spending more; they are spending a higher percentage of their income (BIG difference).

It's like saying poor people must have more technology because they spend a higher percentage of their paycheck on tech than rich people do...
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Did you read the article, and the author's comments? He writes:

The author is a complete moron.

:) I didn't read that. Yeah, the author is a moron. I just read "AMD spends more" and thought, "Hold the phone, there is no way AMD is spending as much as Intel." Turns out I was right. The author started talking about percentages to try to put AMD in a bad light.
 

bridito

Senior member
Jun 2, 2011
350
0
0
The author may be a moron, but IMHO the poster of the information is a very nice Borg baby. Wewistince is fwetwile! :)

Question: Is AMD's proportionately greater R&D expenditure justified by the additional "ATI" biz, or is it just that, because they're so much smaller than Intel, there are certain fixed costs that can't be avoided?
 

iCyborg

Golden Member
Aug 8, 2008
1,344
61
91
I posted a similar link last week where absolute numbers are given as well:
http://forums.anandtech.com/showthread.php?p=31952416&highlight=#post31952416

Requoting from the article:
AMD spends significantly more than Intel does on R&D as a percent of total sales in its attempts to keep up with the 800-pound gorilla of processors. However, Intel's huge size means that in absolute terms, its R&D spend is 4.8 times higher than AMD's. This need to pour such an outsized amount of revenues into R&D is part of what makes it so difficult for AMD to stay profitable. In 2008, for example, its R&D expense as a percentage of sales was double Intel's.
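If both of the article's figures held in the same year, they also pin down how much larger Intel's revenue must be; a quick sanity check (treating the 4.8x absolute ratio and the 2x percentage ratio as simultaneous, which mixes years slightly):

Code:
# From the quoted article: Intel's absolute R&D spend = 4.8x AMD's,
# while AMD's R&D as a % of sales = 2x Intel's (the 2008 figure).
# Since R&D% = R&D / sales, it follows that sales_intel / sales_amd = 4.8 * 2.
rd_ratio_intel_over_amd = 4.8
pct_ratio_amd_over_intel = 2.0
implied_sales_ratio = rd_ratio_intel_over_amd * pct_ratio_amd_over_intel
print(f"Implied Intel/AMD revenue ratio: ~{implied_sales_ratio:.1f}x")  # ~9.6x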
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
The author may be a moron, but IMHO the poster of the information is a very nice Borg baby. Wewistince is fwetwile! :)

Question: Is AMD's proportionately greater R&D expenditure justified by the additional "ATI" biz, or is it just that, because they're so much smaller than Intel, there are certain fixed costs that can't be avoided?

Costs are costs are costs. In order for AMD to put out a CPU comparable to Intel's, they have to invest about the same amount of R&D time as Intel does. Now, they can save on costs by always being behind the curve, but they can never escape the fixed dollar amount needed to create the product.

What percentage of their income this number turns out to be doesn't matter; it is a fixed value that is the price of competing in this market. AMD spends about the same amount of money on R&D as Intel (less, really); however, Intel makes far more money than they do.

If you have a dollar on you and someone else has ten dollars, and you each buy the same fifty-cent item, would you argue that you spent more money because, heck, you just spent 50% of your money while they only spent 5% on the same thing?

That is what the author is doing here.
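Working the dollar analogy through with exactly the figures in the post:

Code:
# Same $0.50 purchase, very different share of each wallet.
wallet_you, wallet_other = 1.00, 10.00
purchase = 0.50
print(f"You:   {purchase / wallet_you:.0%} of your money")    # 50%
print(f"Other: {purchase / wallet_other:.0%} of their money") # 5%
# The absolute spend is identical; only the percentage differs, which is
# the distinction the article blurs when comparing AMD and Intel R&D.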
 

bridito

Senior member
Jun 2, 2011
350
0
0
I posted a similar link last week where absolute numbers are given as well:
http://forums.anandtech.com/showthread.php?p=31952416&highlight=#post31952416

Requoting from the article:
AMD spends significantly more than Intel does on R&D as a percent of total sales in its attempts to keep up with the 800-pound gorilla of processors. However, Intel's huge size means that in absolute terms, its R&D spend is 4.8 times higher than AMD's. This need to pour such an outsized amount of revenues into R&D is part of what makes it so difficult for AMD to stay profitable. In 2008, for example, its R&D expense as a percentage of sales was double Intel's.

Unfortunately that MSNBC link has since expired, but it's interesting that Intel spends almost 5x as much as AMD does, while AMD is a much smaller company as measured by market cap: $4.42B vs $119.03B. However, the question I'm still not clear on is:

Does AMD's quoted R&D expenditure include CPUs plus discrete video (the former ATI)? If so, it covers a lot more R&D than Intel's CPU-only line (OK, let's ignore various relatively minor products such as SSDs and mobos).

Costs are costs are costs. In order for AMD to put out a CPU comparable to Intel's, they have to invest about the same amount of R&D time as Intel does. Now, they can save on costs by always being behind the curve, but they can never escape the fixed dollar amount needed to create the product.

What percentage of their income this number turns out to be doesn't matter; it is a fixed value that is the price of competing in this market. AMD spends about the same amount of money on R&D as Intel (less, really); however, Intel makes far more money than they do.

If you have a dollar on you and someone else has ten dollars, and you each buy the same fifty-cent item, would you argue that you spent more money because, heck, you just spent 50% of your money while they only spent 5% on the same thing?

That is what the author is doing here.

Your point is well taken, but that's reality. I eat at diners and fast-food joints while Oprah and Carlos Slim have private chefs preparing foie gras with white truffles encased in 24k gold foil. And I still end up spending a much larger percentage of my income on food (greasy burgers) than they do. So if I'm looking at a really nice meal that doesn't give me indigestion, I have to think about it very carefully, while for Oprah and Carlos Slim, buying Beluga caviar for a party of a thousand of their closest friends is the equivalent of me spending a nickel on a parking meter. I don't have to put much thought or commitment into my parking-meter nickel, but the caviar for a thousand is likely more than my entire yearly income, so... yeah, I'd really have to think about that one! :)

That brings me back to BSN's (in)famous unnamed source:

http://www.brightsideofnews.com/new...-for-poor-bulldozer-performance.aspx?pageid=1

"Bulldozer is going to disappoint people because we did not get the resources to build a great CPU, and it's not that we needed billions of dollars to make it a leader. We needed investment in people, tools and technology."

It would seem to me, with my relative lack of understanding of the CPU biz, that when a company goes toe to toe against one that is 27 times bigger in the same marketplace, it's just a matter of time until something cracks. Sure, there have been Davids triumphing over Goliaths before in the history of free enterprise, but they have been very few, very far between, and very damn lucky! :)
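For reference, the "27 times bigger" figure presumably comes straight from the market caps quoted earlier in this post; a rough sanity check:

Code:
# Market caps cited above: AMD $4.42B vs Intel $119.03B (in billions of dollars).
amd_cap, intel_cap = 4.42, 119.03
print(f"Intel/AMD market-cap ratio: {intel_cap / amd_cap:.1f} : 1")  # ~26.9 : 1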
 

iCyborg

Golden Member
Aug 8, 2008
1,344
61
91
Unfortunately that MSNBC link has since expired, but it's interesting that Intel spends almost 5x as much as AMD does, while AMD is a much smaller company as measured by market cap: $4.42B vs $119.03B. However, the question I'm still not clear on is:

Does AMD's quoted R&D expenditure include CPUs plus discrete video (the former ATI)? If so, it covers a lot more R&D than Intel's CPU-only line (OK, let's ignore various relatively minor products such as SSDs and mobos).
Yes, it includes everything, but Intel also has graphics and SSDs, and remember that it also has fabs, so I don't think the gap in absolute CPU-architecture R&D is any smaller.
 

Mr Vain

Senior member
May 15, 2006
708
1
81
If AMD has placed the pricing at a competitive position against the 2600K and the overall performance matches it, I have ZERO interest in it. I'm in the market for a CPU that shreds, not competes. That way I can keep it for 2-3 years and still have something at the end that is pretty close to state of the art.



IMHO all the BD benchys to date are well summed up by that statement. We don't know jack.

Officially we know jack; unofficially we may have some insight, and I don't believe everything out there about Bulldozer performance is fake. I remember when the Conroe performance leaks started filtering through to the web, many were saying the figures were fake; as it turned out, they were mostly legit. The same scenario may be unfolding here with Bulldozer.
I, for one, will hold off on any CPU upgrade until the official Bulldozer figures are out.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Unfortunately that msnbc link has since expired, but it's interesting that Intel spend almost 5x as much as AMD does and AMD is a much smaller company as determined by market cap: 4.42B vs 119.03B. However, the question I'm still not clear on is:

Intel CPUs aren't 5x faster than AMD CPUs.

We don't know the performance of BD, but let's suppose it matches Nehalem clock for clock.

What will be the reaction of these forums?

"Fail!", "Crushing defeat", "AMD sucks!", "No reason to go AMD", etc.

But that means AMD will be 10% behind SB, so it depends on price, right?

If AMD is 10% behind NVIDIA performance but it costs 10-15% less, "big win", "AMD is competitive", "GTX580 is an overpriced turd", etc.

But in CPU land it seems to be different.

In fact, we have some members who have already declared BD a failure even if AMD matches SB, because SB-E will spank it!
 

bridito

Senior member
Jun 2, 2011
350
0
0
Yes, it includes everything, but Intel also has graphics and SSDs, and remember that it also has fabs, so I don't think the gap in absolute CPU-architecture R&D is any smaller.

Yes, you're absolutely right, but I was primarily referring to the former ATI biz, which is essentially discrete GPUs. AMD had fabs up until they spun them off, but I'm still having difficulty seeing the 27:1 ratio. :(

Officially we know jack; unofficially we may have some insight, and I don't believe everything out there about Bulldozer performance is fake. I remember when the Conroe performance leaks started filtering through to the web, many were saying the figures were fake; as it turned out, they were mostly legit. The same scenario may be unfolding here with Bulldozer.
I, for one, will hold off on any CPU upgrade until the official Bulldozer figures are out.

I am slightly less sanguine than you about the credit you're giving the current BD performance leaks; I'm tending more toward the "je ne sais pas, Jacques" side. :) I do fully agree with you on "I, for one, will hold off on any CPU upgrade until the official Bulldozer figures are out." Otherwise, we're all discussing whether the twist on the unicorn's horn is consistent or constructed upon the Fibonacci sequence. :)
 

bridito

Senior member
Jun 2, 2011
350
0
0
Intel CPUs aren't 5x faster than AMD CPUs.

We don't know the performance of BD, but let's suppose it matches Nehalem clock for clock.

What will be the reaction of these forums?

"Fail!", "Crushing defeat", "AMD sucks!", "No reason to go AMD", etc.

But that means AMD will be 10% behind SB, so it depends on price, right?

If AMD is 10% behind NVIDIA performance but it costs 10-15% less, "big win", "AMD is competitive", "GTX580 is an overpriced turd", etc.

But in CPU land it seems to be different.

In fact, we have some members who have already declared BD a failure even if AMD matches SB, because SB-E will spank it!

Respectfully, it is my own personal opinion, and specifically from my perspective, that if BD does not outperform the 2600K overall, it will be a letdown. If it outperforms the 2600K by a margin that, at launch, seems at least close to the anticipated approximate performance of the SB-E quad, then I'm going to forward my money to the nice folks at AMD! :)
 

yottabit

Golden Member
Jun 5, 2008
1,619
741
146
Honestly, I think BD is going to be a marketing goldmine, even if an Intel SB quad core is much faster at single-threaded tasks. Picture this scenario (let's assume BD is faster than a similarly priced 2600K in multithreaded apps, but worse in single-threaded):

A customer is comparing a 2600K computer and an FX-8xxx computer:

The customer will probably hardly be able to notice any speed difference when testing them both out.
Bulldozer has 8 cores.
Bulldozer has a higher clock rate.

Try explaining to them that they should get the 2600K because it's a little bit faster at single-threaded applications.

I guess I'm really hoping that an 8-core BD will be SLIGHTLY behind a stock 2600K in single-threaded apps and significantly ahead of it in multithreaded. In that case, even a slightly lower-clocked 8-core BD should destroy a 2500K in multithreaded work.

Similarly with a 4-core BD vs. a Core i3.

It's just hard to recommend an AMD quad right now, IMO, when the i3 is so, so much faster at single-threaded apps.

I personally like threads like this... I'm not trying to predict what BD is or how great it will be; I just like envisioning the different scenarios that could unfold.
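A tiny model of the scenario being envisioned; every number here is hypothetical, chosen only to make the single-thread vs. multithread trade-off concrete:

Code:
# Hypothetical relative per-core speeds (2600K single-thread = 1.00) and core
# counts, matching the "slightly behind single-threaded, ahead multithreaded"
# scenario sketched above.
chips = {
    "2600K (4C/8T)": {"per_core": 1.00, "cores": 4, "smt_bonus": 1.25},  # HT worth ~25%, hypothetical
    "FX-8xxx (8C)":  {"per_core": 0.85, "cores": 8, "smt_bonus": 1.00},  # hypothetical
}
for name, c in chips.items():
    single = c["per_core"]
    multi = c["per_core"] * c["cores"] * c["smt_bonus"]  # idealized scaling, ignores shared front ends etc.
    print(f"{name}: single-thread {single:.2f}, multithreaded {multi:.2f}")
# With these made-up numbers: 2600K = 1.00 / 5.00, FX-8xxx = 0.85 / 6.80,
# i.e. slightly behind per thread but ahead when all cores are loaded,
# which is exactly the marketing story ("8 cores, higher clock") described above.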
 