I think they would be a little worse off... They needed to do this, or actually spend some money advertising and getting the word out on the MHz myth. Since they didn't want to invest in that, they needed this to give the impression they are neck and neck with Intel... For the most part it has been successful, but now IMO it is starting to get too whacked. Most have always thought it is directed toward a P4, and I don't think AMD has even tried to correct anyone on that; they just let their PR dept play it like that. Whether or not it is a Tbird comparison, the fact is it's starting not to work anymore against the P4, and that will confuse consumers more and could lead to negative feedback...
AMD needs to pull its head out of its arse and correct the PR rating, because the Tbird is just a little too far back in the day to be competing against. Notice how in CPU-intensive programs these Bartons with their whacked PR don't even beat the previous XP version. Notice how the Opteron and even the Athlon 64, in their previews, were in some apps only equal to the performance of a same-speed XP (true MHz, not PR). The sad thing is they were a long way ahead in PR rating, which no amount of tweaking and optimization will realistically make up...
I think the Hammer may push the PR rating considerably further off...
Maybe Intel should start using a PR rating with the Prescott and P4C's, considering they use HT, an 800MHz FSB, and in the Prescott's case 1MB of L2 cache and more L1 cache. If there were a similar-speed 533FSB chip, it would be a few hundred MHz off at equal performance...
A Prescott at 3.2GHz becomes a 3500+ P4XP... How about that!!!!
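To make that arithmetic concrete, here's a minimal sketch of how such a rating could be computed: take the true clock, scale it by an assumed per-clock performance advantage over the baseline chip, and round to a nice marketing number. The ~10% factor and the rounding scheme are purely my own illustrative assumptions, not AMD's or Intel's actual formula.

```python
def toy_pr_rating(true_mhz, perf_factor):
    """Rate a chip as if it ran at true_mhz * perf_factor on the old baseline.

    perf_factor is an assumed per-clock performance ratio vs. the baseline;
    the result is rounded to the nearest 100 MHz for a marketing-style number.
    """
    return round(true_mhz * perf_factor / 100) * 100

# 3.2GHz Prescott, assuming ~10% better per-clock performance
# than a plain 533FSB P4 -> a "3500+" style rating
print(toy_pr_rating(3200, 1.1))
```

Plug in a different assumed factor and the rating swings by hundreds of MHz, which is exactly why these schemes drift out of whack as the baseline chip ages.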
Intel has upgraded the P4 to the Northwood's .13 process, a 533 FSB, 512KB of L2 cache, HT, then an 800 FSB, soon to be HT2, soon to be 1MB of L2 cache, and soon to be the .09 process... Maybe a Prescott P4 should be compared to an original Willamette P4. It would be something closer to a 4000+...
However, there are just programs that are plain and simple raw MHz and don't take advantage of the memory bandwidth, the extra cache, and the HT... At least Intel doesn't get caught in that quandary...