AMD K10 B3 stepping tanks.


Duvie

Elite Member
Feb 5, 2001
16,215
0
71
One thing I have to say about that article: the first part was informative; the rest I was not so impressed with. I've never seen someone try so hard to put a good spin on things. Just look at the choice of words describing Phenom's low performance in some things and competitiveness in others. When AMD does poorly by 20-40% it is "not so good", "mildly disappointing", "does not show stellar performance but it isn't too bad either", or the famous "Intel has always done better here" (no mention of MainConcept and how AMD has done there for a while). When AMD wins by 5-10% it "wins here by quite a margin".

Nice spin job... I do like the choice of tests, but I think his commentary was definitely lacking. I have always hated how people lack balance in calling out the differences. Just stick to the performance percentages instead of spinning in an excuse about clock speed, single threading, "Intel has always done better here" (hint, hint: cheating!!! or a paid-off benchmark developer), etc.

Bottom line for me: even the older Kentsfield, with an older 1066 FSB and older platform, does better clock for clock... factor in the clock increases of Yorkfield, and the huge wall currently with Phenom overclocking, and it isn't even really much of a game.
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
The B2 bug apparently only happens on VMs with a hardware hypervisor.

It would be like a bug that only affected Core 2 Duos with VT enabled in VMware ESX or something.

For the vast majority of people it wouldn't even matter.
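
If you want to know which stepping a chip actually is, the family/stepping values it reports will tell you. A rough sketch for a Linux box; it assumes, as widely reported, that B2 Agenas identify as family 16 (10h) stepping 2 and B3s as stepping 3:

```python
# Rough stepping check for a Linux box. Assumes (as widely reported)
# that Agena B2 parts report CPUID family 16 (10h), stepping 2 and
# B3 parts report stepping 3.

def family_and_stepping(path="/proc/cpuinfo"):
    info = {}
    with open(path) as f:
        for line in f:
            key, _, val = line.partition(":")
            info[key.strip()] = val.strip()
            if "stepping" in info:      # first core is enough
                break
    return int(info["cpu family"]), int(info["stepping"])

family, stepping = family_and_stepping()
if family == 16:
    print("B3 or later" if stepping >= 3 else "B2 (TLB erratum)")
else:
    print("not a K10")
```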
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Apparently, it is a myth according to Moorehead of AMD.

Link

"When the rumours started to fly, I called AMD and talked to Pat Moorehead who answered all of my questions. The B3s are not bugged, period. The fix was tested on B2 silicon and verified (Don't ask, the elves who work at the Fraternal Order of Silicon Repair Mythical Creatures Local 17 get antsy if you reveal their secrets).

Basically there is no B4, there won't be a B4, and the next stepping is Cx. The Cx parts are the 45nm Shanghai cores, so the B3 will be the end of the line for 65nm. B3s are going to be sampling in the not so distant future, production in the end of Q1, and retail availability in Q2. That part is a slip."

I don't know where these rumors begin, grow, metamorphose or end, but those that start them should be lightly electrocuted (non-fatally, of course) to get their neural pathways and dendrites firing in the proper order. A little shock therapy never hurt anyone. :D

Is Pat Moorehead flat out lying? Dunno. Let's ask Nemesis. Nemesis?
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
*SNIP*

Sorry, didn't see the INQ link right above.

Who knows where the truth lies.

I truly hope we can get a decent chip from AMD before the middle of the year.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: keysplayr2003
Apparently, it is a myth according to Moorehead of AMD.

Link

"When the rumours started to fly, I called AMD and talked to Pat Moorehead who answered all of my questions. The B3s are not bugged, period. The fix was tested on B2 silicon and verified (Don't ask, the elves who work at the Fraternal Order of Silicon Repair Mythical Creatures Local 17 get antsy if you reveal their secrets).

Basically there is no B4, there won't be a B4, and the next stepping is Cx. The Cx parts are the 45nm Shanghai cores, so the B3 will be the end of the line for 65nm. B3s are going to be sampling in the not so distant future, production in the end of Q1, and retail availability in Q2. That part is a slip."

I don't know where these rumors begin, grow, metamorphose or end, but those that start them should be lightly electrocuted (non-fatally, of course) to get their neural pathways and dendrites firing in the proper order. A little shock therapy never hurt anyone. :D

Is Pat Moorehead flat out lying? Dunno. Let's ask Nemesis. Nemesis?


Hey, I didn't write the article, don't shoot the messenger.

Keys asks: is Pat Moorehead lying? I don't know. Someone is. With what we have heard and seen from AMD in the last 1 1/2 years, the odds are 95% that AMD is lying again. Product released in Sept., availability in May. Unless you want to buy damaged goods.

Something is going on. But all the disinformation out there, and the AEG-type assaults on Intel CPUs, only lead me to believe AMD is in a world of hurt.

But they will survive. They have IBM. It's too bad it was DEC that went down and not IBM.

 

geokilla

Platinum Member
Oct 14, 2006
2,012
3
81
Originally posted by: Sylvanas
Originally posted by: Idontcare
Originally posted by: Sylvanas
Originally posted by: Arkaign
Originally posted by: Sylvanas
If you want a drop-in quad core for an existing AM2 rig then you might as well get it; the 'bug' only appears in very specific circumstances yet it gets blown out of proportion most of the time. Unless you run a server, there's no problem.

The only problem is that the X2 often performs better than Phenom. For example, a 3+ GHz X2 (easily done with even cheapo 4x00+ X2s) will perform better in most tasks next to a Phenom at 2.3-2.4 GHz.

This in-depth review compared the Phenoms to just about everything, and as you can see they finish ahead of the X2s by a long shot. Even squeezing an extra 400 MHz out of an X2 and running the benches again would not bridge the gap in performance.

Awesome article, thanks for linking and sharing it; I would likely not have seen it otherwise.

Regarding the comments... would you agree your assessment is true only for multi-threaded apps and not for single- or dual-threaded apps?

Well, that's a good question, and in most cases probably true. Let's use Far Cry (single-threaded) in the review as an example: the lower-clocked Phenom 9600 has around the same performance as the X2s at or around the same frequency, while the Intel processors pushing 3 GHz are top of the table in that single-threaded bench. So in that case, maybe squeezing extra MHz from an X2 would be worth it over a Phenom. However, we're only going to get more and more multithreaded apps as time goes on, in which case the Phenom would be the better pick, as is evident in the UT3 bench where the Phenom is on top of even the QX9770 and ahead of the FX-62 and other duals by quite a margin.

There's also this article comparing a single core Phenom to a single core Athlon. In short, when clocked at the same speeds, the Phenom is faster than the Athlon.

As for the TLB bug, does it really matter to us? I don't think we'll ever encounter it on a machine used like any other consumer desktop. Of course, I'll gladly take a Phenom X4 or even X3 with the TLB bug if the price is right.
 

GundamF91

Golden Member
May 14, 2001
1,827
0
0
Now that I've jumped ship from AMD, after using a T-bird/XP/Athlon 64 for the past 6 years, I can see the superiority of the current Intel Core 2 Duo processors. But I do hope AMD recovers, because without a competitor Intel's not going to innovate as much as they have, nor will there be good pricing options for customers.

 

konakona

Diamond Member
May 6, 2004
6,285
1
0
I am actually shocked to see how well Phenom did in the UT3 bench in that Lost Circuits review; it toppled the Intel quad cores with ease :Q Does this say something about their multi-core efficiency?

We need more games like UT3 and a healthy clock bump to feel comfortable about ditching Intel CPUs and moving back to AMD.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
I think when we start seeing more code that is truly optimized for multi-core processors, Phenom will shine. This was definitely a more forward-thinking design than the Core 2 Duo.
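
By "truly optimized" I mean work that is actually split across all four cores rather than left on one. A toy Python sketch of the pattern (hypothetical workload, not from any benchmark):

```python
# Toy example of work that actually scales with cores: four worker
# processes each crunch an independent slice of the data.
from multiprocessing import Pool

def crunch(chunk):
    # Stand-in for real per-chunk work (encoding, filtering, ...).
    return sum(i * i for i in chunk)

if __name__ == "__main__":
    n = 8_000_000
    slices = [range(i, n, 4) for i in range(4)]   # 4 disjoint slices
    with Pool(processes=4) as pool:               # one worker per core
        print(sum(pool.map(crunch, slices)))
```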
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Interesting. Could you expand on this? How soon are these new apps going to arrive? Before Nehalem? Let's see: we get K10 in May and Nehalem in Oct. Do you really think coders will write for multi-core without using SSE4, where Intel's performance gains are substantial? One must be optimistic, but one also needs to look at the big picture. The delay of K10 CPUs for servers is a disaster for AMD.

It's hurting their lawsuit, as Intel's attorneys will say AMD can't compete, and they'd be right. AMD is losing market share. In 5 months it will be at levels equal to those before K8.

And it would have nothing to do with Intel paying off PC vendors. Yes, Dell is feeling Intel's wrath. But Intel found a PC maker that can run exclusively Intel CPUs. It's interesting to note Apple's staggering market share gains. Apple will be the first to have notebooks out due to its exclusivity with Intel, a deal like the one that made Dell an empire, which Dell can no longer tout. Is Intel treating Apple first in line legal? You bet it is.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: OneOfTheseDays
I think when we start seeing more code that is truly optimized for multi-core processors, Phenom will shine. This was definitely a more forward-thinking design than the Core 2 Duo.

The concern here should be that by the time multi-core software starts becoming commonplace in the desktop environment, the software won't be running on Yorkfields and Phenoms.

Nehalem will be out long before software catches up to the core wars.

Think of the lead time the K8 has had with 64-bit capability... desktop software took ages to catch up (if you call Vista64 a finished product), and certainly by the time it did, whether 64-bit made the K8 more competitive against a P4C no longer mattered. Hardware moved on well in advance of the software.
 

Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
Originally posted by: konakona
I am actually shocked to see how well Phenom did in the UT3 bench in that Lost Circuits review; it toppled the Intel quad cores with ease :Q Does this say something about their multi-core efficiency?
No, it says something about the review itself, given that only a 7900GTX was used and that frame smoothing was not disabled, thereby capping the maximum frame rate at ~62 fps.
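
For anyone who wants to check this on their own install: if I remember right, UT3's frame smoothing lives in UTEngine.ini under [Engine.GameEngine] as bSmoothFrameRate / MaxSmoothedFrameRate (defaulting to TRUE / 62), and setting bSmoothFrameRate=FALSE removes the cap. A quick Python sketch to report the settings; the path and key names are from memory, so treat them as assumptions:

```python
# Rough check of UT3 frame smoothing settings. The Windows path and
# key names are from memory -- adjust for your own install.
import os

# Assumed default config location; not verified.
INI = os.path.expanduser(
    r"~\Documents\My Games\Unreal Tournament 3\UTGame\Config\UTEngine.ini")

def smoothing_settings(path):
    """Scan the ini for the frame-smoothing keys; UE3 ini files can
    repeat sections and keys, so a plain line scan is safest."""
    found = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            for key in ("bSmoothFrameRate", "MaxSmoothedFrameRate"):
                if line.startswith(key + "="):
                    found[key] = line.split("=", 1)[1]
    return found

print(smoothing_settings(INI))
# e.g. {'bSmoothFrameRate': 'TRUE', 'MaxSmoothedFrameRate': '62'}
# If bSmoothFrameRate is TRUE, benchmark averages are capped near
# MaxSmoothedFrameRate, which compresses CPU-bound differences.
```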

 

justly

Banned
Jul 25, 2003
493
0
0
Originally posted by: Accord99
Originally posted by: konakona
I am actually shocked to see how well Phenom did in the UT3 bench in that Lost Circuits review; it toppled the Intel quad cores with ease :Q Does this say something about their multi-core efficiency?
No, it says something about the review itself, given that only a 7900GTX was used and that frame smoothing was not disabled, thereby capping the maximum frame rate at ~62 fps.

Just curious: since the 7900GTX was used on both the highest-performing Intel and AMD systems, what do you see as the problem?
Also, is there something frame smoothing does (besides capping frame rate) that would advantage one processor brand over the other?

Seeing as the frame rate is capped, the Phenom's higher average does seem to indicate one thing: it can maintain a more consistent frame rate than its top-of-the-line Intel quad-core counterparts (which cost 4x as much).

Disclaimer: I'm only voicing my opinion that the (LostCircuits) benchmark shows some useful information (even if it is run differently than at other review sites). I also do not claim that this affects gameplay for this game, or any other game or app; that is up to each person to decide for themselves.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: Accord99
Originally posted by: konakona
I am actually shocked to see how well Phenom did in the UT3 bench in that Lost Circuits review; it toppled the Intel quad cores with ease :Q Does this say something about their multi-core efficiency?
No, it says something about the review itself, given that only a 7900GTX was used and that frame smoothing was not disabled, thereby capping the maximum frame rate at ~62 fps.

We don't need to go above 62 fps to see the trend; it's already evident in the results taken.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: justly
Originally posted by: Accord99
Originally posted by: konakona
I am actually shocked to see how well Phenom did in the UT3 bench in that Lost Circuits review; it toppled the Intel quad cores with ease :Q Does this say something about their multi-core efficiency?
No, it says something about the review itself, given that only a 7900GTX was used and that frame smoothing was not disabled, thereby capping the maximum frame rate at ~62 fps.

Just curious: since the 7900GTX was used on both the highest-performing Intel and AMD systems, what do you see as the problem?
Also, is there something frame smoothing does (besides capping frame rate) that would advantage one processor brand over the other?

Seeing as the frame rate is capped, the Phenom's higher average does seem to indicate one thing: it can maintain a more consistent frame rate than its top-of-the-line Intel quad-core counterparts (which cost 4x as much). Disclaimer: I'm only voicing my opinion that the (LostCircuits) benchmark shows some useful information (even if it is run differently than at other review sites). I also do not claim that this affects gameplay for this game, or any other game or app; that is up to each person to decide for themselves.

I couldn't agree more. That UT3 result looks great. Is this the first review of Phenom where that game was used? Also, I thought the power usage results were great on Phenom, except that no other reviewers anywhere have shown such results. Unbelievable, man. I bolded the part above that I don't understand. Are Intel's low-end Penryn parts $1000? I thought smart enthusiasts bought low-end parts and made them run better than top-end ones. How far can you O/C that Phenom? Must be great, right? What's going around the forums is that most of the guys are hoping to run quad Penryns at 3.6+ GHz and the Wolfdales at 4.0 GHz.

As far as Dell boxes and such, Phenom is going to make a great under-$700 quad PC. But when a Wolfdale with 2 cores almost equals the K10 quads, and is much more power efficient, it will be a good battle in this segment. Then we have the manufacturers themselves; they decide what CPU goes in each segment. If you quit thinking with your heart for a second and use your mind: Phenom can only compete in the low end and midrange. The high end is Intel's all alone. So is the midrange, to be frank about it.

I love gaming as much as the next guy, but it's real hard for me now. Gaming doesn't determine my CPU choice, though. To me PCs are complete instruments of retrieval and storage, entertainment center and business machine. I want a CPU that can do a lot of work at once, in a timely fashion, efficiently. Video editing is a big deal to my wife and myself. I have a huge media archive, and I keep it all on a secure PC. If I spend a day ripping or copying, I want raw horsepower.





 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Originally posted by: Accord99
It has little relevance; it would be like racing two cars on a track to see which one is faster, but placing a speed limit and a rev limiter on the two cars.
Except that we would race those two cars to see which one can guarantee the higher speed in the worst situation, not a max speed we don't realistically need.

Plus the trends don't agree with other UT3 benchmarks, like Xbitlabs and Anandtech (who both use an 8800GTX).

http://www.xbitlabs.com/articl...d-phenom_11.html#sect0
http://www.anandtech.com/cpuch...howdoc.aspx?i=3153&p=9
Yeah, that's part of why this was a bit of a shocker to me. Seeing knowledgeable ATers acknowledge the quality of the Lost Circuits review, I cannot discredit it just because it differs from other websites' views. True, they used different video cards, so the other two reviews you link to are surely more relevant though.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: Accord99
It has little relevance; it would be like racing two cars on a track to see which one is faster, but placing a speed limit and a rev limiter on the two cars.

Plus the trends don't agree with other UT3 benchmarks, like Xbitlabs and Anandtech (who both use an 8800GTX).

http://www.xbitlabs.com/articl...d-phenom_11.html#sect0
http://www.anandtech.com/cpuch...howdoc.aspx?i=3153&p=9

That's a valid point. The only things I can think of that could cause such a difference: both Xbit and AT were running 1024 vs. LostCircuits' 1280 (with different GPUs), and both Xbit and AT were using the UT3 demo (does that have hardware physics, or is it as optimized as the full version?). I know when the Crysis demo came out there was apparently little to no multi-core support compared to retail; I am just wondering if that would be the case with UT3.
 

justly

Banned
Jul 25, 2003
493
0
0
Nemesis 1, as for what you bolded in my post, let me explain.
For the Phenom to have a higher average frame rate, it either has to reach a higher frame rate than the Intel counterpart (something it shouldn't be able to do with a cap on the max frame rate, assuming both can exceed the cap), or it has to drop fewer frames (dropped frames are what pull an average below the cap). Since even the Phenom 9600 scores 3 FPS higher than the QX9770, the only logical conclusion is that the Intel parts are dropping frames.
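
To make the arithmetic concrete, here is a toy sketch with made-up per-second frame rates (not data from the review): under a 62 fps cap the peaks are thrown away, so the average is driven entirely by how often each CPU dips below the cap.

```python
# Toy illustration (made-up numbers): under a frame rate cap, the
# average is determined by the dips below the cap, not by the peaks.
CAP = 62.0

def capped_average(frame_rates):
    """Average of per-second frame rates after applying the cap."""
    return sum(min(f, CAP) for f in frame_rates) / len(frame_rates)

# CPU A can peak far above the cap but stalls periodically.
cpu_a = [120, 120, 120, 35, 120, 120, 120, 35]
# CPU B never peaks as high but never dips as low.
cpu_b = [70, 70, 70, 55, 70, 70, 70, 55]

print(capped_average(cpu_a))  # 55.25 -- the high peaks are discarded
print(capped_average(cpu_b))  # 60.25 -- fewer dips wins under a cap
```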

I see this as a limitation of the latency associated with Intel's FSB design when prefetching doesn't properly predict the required data. However, I am willing to acknowledge opposing arguments to this. I also have to reiterate my disclaimer: "I also do not claim that this affects gameplay for this game, or any other game or app; that is up to each person to decide for themselves." Personally, Intel's prefetcher looks outstanding; however, when it doesn't properly predict the needed data it still suffers the latency hit associated with an FSB design, and I don't think this hit can be avoided until Intel implements its own IMC.

I'm not sure how to address some of your other comments, except to say it's not my desire to choose the solution that fits your or anyone else's needs. I know some people don't overclock and others do (to varying degrees), and everyone has different needs and priorities regarding performance.

All I do is provide my observations with the hope that something useful can come of it, even if that means I am shown to be wrong and given the chance to learn something.
 

justly

Banned
Jul 25, 2003
493
0
0
Originally posted by: Accord99
It has little relevance; it would be like racing two cars on a track to see which one is faster, but placing a speed limit and a rev limiter on the two cars.

Actually most auto racing does put limits on the cars to eliminate unfair advantages. In some way the cap on these benchmarks does something similar by focusing attention on minimum frame rates. Or do minimum frame rates no longer matter?


Plus the trends don't agree with other UT3 benchmarks, like Xbitlabs and Anandtech (who both use an 8800GTX).

http://www.xbitlabs.com/articl...d-phenom_11.html#sect0
http://www.anandtech.com/cpuch...howdoc.aspx?i=3153&p=9

This only goes to show that both of the systems in the LostCircuits review should be equal and running at the frame rate cap, unless one is suffering from latency issues.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
So we take results from a review using state-of-the-art CPUs from both companies. Then they use a GPU that gives them the results they want to present.

The AT review of the UT3 beta showed the Intel Q6600 at 2.4 GHz beating the 9900:

204 fps to 166.1

In Oblivion (Bruma) the Q6600 got beaten by both the 9900 and 9700, but neither could match the Q6700.

Crysis demo: Q9450 > 9900, 9900 > Q6700, Q6700 > 9700, 9700/9600 > Q6600.

Once you increase the resolution, scores bunch up, same as always, till you're GPU limited.

Until that UT3 test can be run by more reviewers who come up with the same results, right now it's a reviewer's mistake. Until others confirm.
 

Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
Originally posted by: justly
Actually most auto racing does put limits on the cars to eliminate unfair advantages.
These series may try to limit performance by increasing weight or limiting technology to make the racing more entertaining, but none of them have a speed limit.

And in the computing world, there is no such thing as unfair advantages.

In some way the cap on these benchmarks does something similar by focusing attention on minimum frame rates. Or do minimum frame rates no longer matter?
But it's not a review of minimum frame rates; it's a review of average frame rates hampered by an old video card and a hard maximum frame-rate cap. A real review of minimum frame rates would have used a fast video card, disabled the frame cap, then provided the distribution of frame rates that's available in the log.
 

justly

Banned
Jul 25, 2003
493
0
0
"Than they use a GPU that gives them the results they want to present"

Let me get this straight: if I were to use the same GPU in two different systems, resulting in system 1 scoring higher, and I want the other system to have the higher score, all I have to do is change out the GPU in both systems for an older model from the same manufacturer, and then system 2 will be faster? Somehow I don't believe you.

"right now uts a reviewers mistake. Until others confirm."

I would have to disagree; it's not a mistake, it represents different information (maybe information you don't want to hear, and that's why you label it a mistake).

"And in the computing world, there is no such thing as unfair advantages"

If there is no such thing as unfair advantages, why are you complaining about which video card was used or whether frame smoothing was enabled?

"But it's not a review of minimum frame rates, it's a review of average frame rates hampered by an old video card and a hard maximum frame rate cap. A real review of min. frame rates would have used a fast video card, disabled the frame cap then provided the distribution of frame rates that's available in the log."

I agree that this isn't a minimum frame rate test; actually, I would be grateful if you could provide a link to such a test. Otherwise, this is the best information I have come across for comparing minimum frame rates between these processors.
 

Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
Originally posted by: justly
"And in the computing world, there is no such thing as unfair advantages"

If there is no such thing as unfair advantages, why are you complaining about which video card was used or whether frame smoothing was enabled?
Because if you want to test CPU performance, you don't test with a frame rate limit and a slow GPU; that renders the benchmark useless.