Intel's secret weapon against Hammer?

Page 3

jjyiz28

Platinum Member
Jan 11, 2003
2,901
0
0
Originally posted by: human2k
How do you know Hammer is NOT gonna be a flop? You got any benches that prove Hammer is SUPERIOR to intel's products?

who knows? Just a guess. I feel it's a marketing edge at the very least. Just like Intel has the MHz lead over Athlon chips, the A64 might be marketed as the first 64-bit desktop CPU while the P4/P5 is only 32-bit.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
Idontcare...

First off, when one is sarcastic about what someone else has said, it's usually taken to mean that the sarcastic one does not agree with the other's statement. I don't think I've ever experienced otherwise.

Second off, sarcasm is a very passive-aggressive way of communicating. It's unfair to say that others are "vilifying" you when you come into a discussion with that tone to your posts.

Third, brilliant post about having "vision", or the lack thereof. (And no, I'm not being sarcastic.) So many times I read comments on msg boards like, "Great, now I can run Quake 3 at 300fps", or "Nobody NEEDS a cpu above XXXmhz."

I'll be the first to tell you that long term vision doesn't come naturally to me. And I am amazed by those who do see beyond the boundaries.
 

grant2

Golden Member
May 23, 2001
1,165
23
81
Originally posted by: Wingznut

I don't suppose you'd mind explaining to us what part of Sohcan's FAQ is inaccurate? Or exactly how 64-bit will speed up your desktop experience?

I think he downplays the value of performing 33-to-64-bit integer & FP math in one operation vs. the current multiple steps

I also remember reading an article 13 years ago where the author discussed the speed & program-size gains MERELY from recompiling 16-bit -> 32-bit. Now what does 16-bit -> 32-bit offer that 32 -> 64 doesn't?
 

paralazarguer

Banned
Jun 22, 2002
1,887
0
0
That was explained in the post above where I quoted the FAQ; I don't know if you read it or not, but it explains it.
 

imgod2u

Senior member
Sep 16, 2000
993
0
0
Originally posted by: grant2
Originally posted by: Wingznut

I don't suppose you'd mind explaining to us what part of Sohcan's FAQ is inaccurate? Or exactly how 64-bit will speed up your desktop experience?

I think he downplays the value of performing 33-to-64-bit integer & FP math in one operation vs. the current multiple steps

Going "64-bit" only involves a shift in the Integer data types. The FP processors has its own registers and are currently capable of up to 80-bit FP data in modern processors.

I also remember reading an article 13 years ago where the author discussed the speed & program-size gains MERELY from recompiling 16-bit -> 32-bit. Now what does 16-bit -> 32-bit offer that 32 -> 64 doesn't?

It's not linear scaling. Going from 16-bit to 32-bit increased the possible address space from 64 KB to 4 GB. It also increased the maximum value of a single integer from about 65 thousand to 4 billion. Even assuming that software usage grows steadily (say the memory required doubles every 2 years), that's still a lot of headroom when going from 16-bit to 32-bit. Following this growth rate of memory usage vs. memory addressability, reaching the point where we'd need 64-bit computing would take longer than the time it took for 16-bit to become a limiting factor.
Currently, I think we are reaching that "limit", but not in the desktop market. Not many consumers need 4 GB of memory, let alone more.
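The squaring effect described above can be sketched in a few lines of C (illustrative only; the helper name is made up for this example):

```c
#include <stdint.h>

/* Bytes reachable with an n-bit flat address space: each doubling of
   pointer width squares the limit rather than doubling it.
   16-bit -> 65,536 bytes (64 KB); 32-bit -> ~4.29e9 bytes (4 GB);
   64-bit -> ~1.8e19 bytes (16 exabytes). */
static uint64_t address_limit(unsigned bits) {
    return bits >= 64 ? UINT64_MAX : (UINT64_C(1) << bits);
}
```

So the 16-to-32 jump bought a factor of 65,536 in addressable memory, and 32-to-64 buys another factor of roughly 4 billion, which is why the post argues the next wall is much further away than the last one was.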
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Wingznut
Idontcare...

First off, when one is sarcastic about what someone else has said, it's usually taken to mean that the sarcastic one does not agree with the other's statement. I don't think I've ever experienced otherwise.

Second off, sarcasm is a very passive-aggressive way of communicating. It's unfair to say that others are "vilifying" you when you come into a discussion with that tone to your posts.

Third, brilliant post about having "vision", or the lack thereof. (And no, I'm not being sarcastic.) So many times I read comments on msg boards like, "Great, now I can run Quake 3 at 300fps", or "Nobody NEEDS a cpu above XXXmhz."

I'll be the first to tell you that long term vision doesn't come naturally to me. And I am amazed by those who do see beyond the boundaries.

Wingznut

1. That's cool. Sarcasm is also part of the Socratic philosophy of teaching. Not that I was necessarily trying to teach, but do be aware that sarcasm is routinely and deliberately used to inspire the recipient to (a) refute the sarcastic statement, with or without logic, out of a sense of rage and flushed cheeks (a feeling of provocation) and (b) think about the topic a little more as they lash out to debunk their ignorant/arrogant teacher. The role-playing mechanism of "playing devil's advocate" is but one example of this. Just remember, sometimes you have to play the ignorant bastard before you can convince someone to think differently. Don't assume all your experiences with sarcasm were premised on the intent of slighting someone. If done correctly, though, you will feel like you have bested an ignorant/arrogant fool, whilst the teacher (a.k.a. the fool in your mind) must resign themselves to the self-satisfaction of knowing you have grown, as they can never let you in on the secret.

The take-home message is that not everyone takes a sarcastic stance because they disagree with anything or anyone. I am sorry to see you write that you may have never realized this. You seem well educated, so I am sure there are a great number of teachers with whom you have battled opinions and thought you won. The teachers won those battles, because you had to grow as a person to advance and posit your rebuttals in an intellectual manner to repudiate their apparent ignorance. This was likely their goal.

2. Cool again, but I refer you to item #1. Not all sarcasm is passive-aggressive, because not all sarcasm is as you interpret it to be. On an unrelated topic, how many psychology courses have you taken to fully comprehend what passive-aggressive embodies? I brought up the issue of vilification because once you decided that the premise of my post was to troll, your position was not to consider the topic at hand but rather to attack the character of the person in question. Sorry if you think I am unfair. Sorry if you decided not to like the tone of my post. I hope you exhibit more tolerance for diversity in people's opinions, as well as delivery mechanisms (be they written, spoken, Socratic, etc.), in your personal life than you are demonstrating in this thread. I'm not your mother; I'm not going to tell you to play nice with the new kid in the class. How you react is a mixture of your predisposition toward new people showing up in the forum and your own tolerance for people you deem more ignorant or arrogant than yourself. I've certainly learned a great deal about you and paralazarguer in both the manner in which you perceived my post and the way you personally went about reacting to it. Not making a judgment call, man, just saying the tolerance you both exhibit is, um, interesting. When perceiving danger, the decision is fight or flight; you two chose fight (regardless of why you perceived danger), which says something about your need to assert dominance and retain control. Maybe out of fear of losing face in a forum where you've spent so much time building a reputation and a large number of posts? Either way, you've done what you've done, you reacted the way you reacted, and it is immortalized in the forum archives.

I am sorry I made you do that, I am sorry you felt it was necessary to do that :(. I am sure you and paralazarguer are both really nice people, but man did I ever rub you two the wrong way! Lesson learned, nuf said. :cool:

3. The nice thing about society is that once you stop worrying about where it's going, you get to sit back and let it take you wherever it was going anyway. Being the driver may be cool and self-rewarding, but sometimes it is really nice to just be the passenger, because then you get to sit back, look out the window at the fancy scenery, and no one really cares how drunk you are so long as your seatbelt is on ;)

Not going to post again in this thread as I've turned it into a silly silly OT rant...catch you on another thread.
 

KenAF

Senior member
Jan 6, 2002
684
0
0
Just some updated info....

At ISSCC 2003 (presentation 19.7), Intel confirmed the Prescott die was 10.7 mm x 10.2 mm, or 109mm^2. See the updated info on chip-architect.com. By comparison, the 256KB-L2 version of Clawhammer (on .13 micron) is 104mm^2.

At its 2001 Annual Analysts Meeting, AMD predicted that Intel's next Pentium on .09 micron would be in the range of 80mm^2. It seems they underestimated...
 

Sohcan

Platinum Member
Oct 10, 1999
2,127
0
0
Originally posted by: grant2
Originally posted by: Wingznut

I don't suppose you'd mind explaining to us what part of Sohcan's FAQ is inaccurate? Or exactly how 64-bit will speed up your desktop experience?

I think he downplays the value of performing 33-64 bit integer & FP math in 1 operation, vs. the current multisteps

I also remember reading an article 13 years ago where the author discussed the speed & program size increase MERELY from recompiling 16 bit -> 32 bit. Now what does 16bit -> 32bit offer that 32 -> 64 doesn't?

I wasn't trying to downplay anything. It's been recognized for a long time (among engineers and architects, not PR types) that 64-bit microprocessors add additional functionality and a larger flat memory space rather than an explicit performance increase due to bit-level parallelism. The FAQ that I wrote a while back explains the diminishing returns in performance from the additional bit-level parallelism with 64-bit arithmetic.

I was merely trying to dispel the myth that microprocessors somehow "operate" on some fixed-sized dataset, that 64-bit microprocessors can somehow "churn" through the data at twice the rate, and that this will somehow lead to a direct speedup in desktop applications. There are certainly big-iron applications where arithmetic on 64-bit datatypes is more common, but this hardly translates into a proportional speedup with a 64-bit microprocessor. For example, 186.crafty is a chess-playing program in SPECint (an industry-standard workstation benchmark) that heavily uses 64-bit datatypes. Despite this, the performance of the P4 and Athlon in crafty is in line with their total SPECint (base) scores relative to other 64-bit microprocessors. In 186.crafty, the 3.06 GHz P4 scores 1160, the 2800+ Athlon XP scores 1311, and the 1.45 GHz POWER4+ (IBM's 64-bit server microprocessor) scores 941. Their respective total SPECint scores are 1099 (P4), 898 (Athlon XP) and 909 (POWER4+).

For a supporting opinion, here's an excerpt from the introduction to the textbook for my graduate class in parallel computer architecture (Parallel Computer Architecture, David Culler and Jaswinder Singh):

"The period up to about 1986 is dominated by advancements in bit-level parallelism, with 4-bit microprocessors replaced by 8-bit, 16-bit, and so on. Doubling the width of the datapath reduces the number of cycles required to perform a full 32-bit operation. Once the 32-bit word size is reached in the mid-1980s, this trend slows, with only partial adoption of 64-bit operation obtained a decade later. Further increases in word width will be driven by demands for improved floating-point representation [not an issue, since x86 has supported 64-bit and 80-bit FP modes for two decades] and a larger address space rather than performance (emphasis added). With address space requirements growing by less than a bit per year, the demand for 128-bit operation appears to be well in the future."

Here's HP's page on 64-bit computing. Note, under the benefits section, the emphasis on increased functionality, precision, and performance due to increased memory addressability, rather than explicit performance gains from lower-latency 64-bit datatype operations. Also note the emphasis on the usefulness for database systems (OLTP), decision-support systems, and high-performance technical computing. Finally, note how at the end it is made clear that programs that do not need 64-bit datatypes should be compiled with 32-bit datatypes on 64-bit microprocessors.
 

tinyabs

Member
Mar 8, 2003
158
0
0
Will Intel release a desktop version of Itanium, or will they push the envelope of 32-bit processors in the future?

 

RU482

Lifer
Apr 9, 2000
12,689
3
81
Originally posted by: forcesho
all I can say is that I spend a lot of money crunching numbers in Excel, and I bet ya this won't be fast enough... I've got a few dual 2.8 Xeons and Excel is still slow...

EXACTLY!! fvcking excel
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: tinyabs
Will Intel release a desktop version of Itanium or they will push the envelope of 32-bit processors in future?

Doubt it. Intel has successfully maneuvered themselves to the point where they have the critical mass to support multiple chip designs, generating market segmentation in addition to sub-market segmentation.

Specific 64bit core for the 64bit server/workstation market. Itanium w/Performance and w/Budget memory sub-system.

Specific 32bit core for the 32bit PC Desktop market. P4 w/Performance and w/Budget memory sub-system.

Specific 32bit core for the 32bit mobile market. Banias w/Performance and w/Budget battery-power consumption.

You will not likely see cores migrate between this three-way market segmentation. They will continue to migrate (performance vs. Celeron) within their respective submarkets. The only way 64bit gets into Intel's desktop segment is if the P5 or P6 is designed to have it.

AMD doesn't have critical mass to support multiple cores in production (not enough parallel designer groups) so their strategy will remain to use one core hitting across the three major segments and further bifurcating into their sub-segments (Athlon 64, Opteron, mobile Athlon64, etc, all same "logic core").

So a development in the logic core of AMD's chips will eventually show up in all market segments (32bit -> 64bit for example) since it basically is the only core they have to sell. Intel can isolate their developments (maybe not good for customers but is good for product differentiation and thus INTC earnings) to a particular core.

Different business models to support different efficiencies but both hopefully driving technology forward ever faster.
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
In a straight port of code highly optimized for x86-32, Counter-Strike dedicated server tests with both 32- and 64-bit versions revealed a 30% clock-for-clock gain, and the code is expected to show further performance gains in future upgrades.
Ripped from Valve, makers of Half-Life.

If true (no reason why it would not be), is this 30% gain properly explained by the AnandTech x86-64 FAQ, which indicates there will be minimal increase in performance for most applications?
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
Originally posted by: paralazarguer
Yeah, that's a server though. Counterstrike DEDICATED SERVER!~
That has no bearing on my point. This is a common application, which will benefit many people. I run several servers from my house on my cable connection; a 30% increase in clock-for-clock performance means I could theoretically use a 1400 MHz chip instead of a 2000 and not suffer any drop in performance. Also, will this equate to gains for other game servers, such as Doom 3, UT2k3, etc.?
What I am getting at is that in the AnandTech x86-64 FAQ, I see no mention of performance gains close to 30% on software as common as Half-Life, not to mention that by the time the Opteron is released, the code will most likely be refined further.
Possibly I am misreading the FAQ, please enlighten me :D
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
What I am getting at is that in the Anandtech X86-64 FAQ, I see NO mention of performance gains close to 30% on such a common software, not to mention that by the time the Opteron is released, the code will most likely be refined further.
The performance gain just isn't due to 64-bitness. It's due to either the better architecture of the Opteron, more registers, or lower memory latency. Look, the thing is, 64-bit just means being able to address 2^64 bytes of memory*, being able to do 64-bit arithmetic in one pass, and having 64-bit registers. None of these things would make Counter-Strike any faster.


*although the Opteron can only handle something like 2^48 bytes of memory.
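The "one pass" point above can be made concrete with a minimal C sketch: on a 32-bit machine, a 64-bit add has to be synthesized from two 32-bit adds plus a carry (on x86, an ADD followed by an ADC), while a 64-bit register file does it in a single instruction. The `u64_pair` type and function name here are invented for illustration:

```c
#include <stdint.h>

/* A 64-bit value split into two 32-bit halves, as a 32-bit CPU sees it. */
typedef struct { uint32_t lo, hi; } u64_pair;

/* 64-bit addition done the 32-bit way: add the low halves, detect the
   carry, then fold the carry into the high-half add. */
static u64_pair add64_via_32(u64_pair a, u64_pair b) {
    u64_pair r;
    r.lo = a.lo + b.lo;            /* low-half add (wraps on overflow) */
    uint32_t carry = r.lo < a.lo;  /* carry out of the low half */
    r.hi = a.hi + b.hi + carry;    /* high-half add with carry folded in */
    return r;
}
```

This is the "multiple steps" the thread keeps referring to; the win from 64-bit hardware is real but only matters when the program actually uses 64-bit quantities.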
 

paralazarguer

Banned
Jun 22, 2002
1,887
0
0
Actually, I believe they were running both the 32-bit and 64-bit Counter-Strike on the same Hammer setup.
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
Originally posted by: zephyrprime
What I am getting at is that in the Anandtech X86-64 FAQ, I see NO mention of performance gains close to 30% on such a common software, not to mention that by the time the Opteron is released, the code will most likely be refined further.
The performance gain just isn't due to 64-bitness. It's due to either the better architecture of the Opteron, more registers, or lower memory latency. Look, the thing is, 64-bit just means being able to address 2^64 bytes of memory*, being able to do 64-bit arithmetic in one pass, and having 64-bit registers. None of these things would make Counter-Strike any faster.


*although the Opteron can only handle something like 2^48 bytes of memory.
I believe the quote was saying that the 64-bit version yielded a 30% increase in performance over the 32-bit version on an Opteron running at x MHz. It is possible this is due to limitations in the way the Opteron runs 32-bit code (I am guessing), but it is interesting and deserves further analysis.

 

OddTSi

Senior member
Feb 14, 2003
371
0
0
Originally posted by: Snoop
Originally posted by: paralazarguer
Yeah, that's a server though. Counterstrike DEDICATED SERVER!~
That has no bearing on my point. This is a common application, which will benefit many people.

It has PLENTY of bearing on your point. That is a SERVER. No one is denying that for server use, going from 32-bit to 64-bit will usually net a decent gain. What everyone is talking about is the everyday computer use that 95% of people will do. For that 95% of people, 64-bit computers won't even come close to being needed in the next couple of years. Hell, for that 95% of people, even a 2GHz P4 with 512MB of RAM is more computer than they'll need for the next 3 or 4 years.
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
Originally posted by: OddTSi
Originally posted by: Snoop
Originally posted by: paralazarguer
Yeah, that's a server though. Counterstrike DEDICATED SERVER!~
That has no bearing on my point. This is a common application, which will benefit many people.

It has PLENTY of bearing on your point. That is a SERVER. No one is denying that for server use, going from 32-bit to 64-bit will usually net a decent gain. What everyone is talking about is the everyday computer use that 95% of people will do. For that 95% of people, 64-bit computers won't even come close to being needed in the next couple of years. Hell, for that 95% of people, even a 2GHz P4 with 512MB of RAM is more computer than they'll need for the next 3 or 4 years.
:roll: :roll: :roll:
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: OddTSi
Originally posted by: Snoop
Originally posted by: paralazarguer
Yeah, that's a server though. Counterstrike DEDICATED SERVER!~
That has no bearing on my point. This is a common application, which will benefit many people.

It has PLENTY of bearing on your point. That is a SERVER. No one is denying that for server use, going from 32-bit to 64-bit will usually net a decent gain. What everyone is talking about is the everyday computer use that 95% of people will do. For that 95% of people, 64-bit computers won't even come close to being needed in the next couple of years. Hell, for that 95% of people, even a 2GHz P4 with 512MB of RAM is more computer than they'll need for the next 3 or 4 years.

What data do you have to support such a claim? I am aware of no studies compiling data on the computing needs of 95% of the populace.
 

grant2

Golden Member
May 23, 2001
1,165
23
81
Originally posted by: Sohcan
I was merely trying to dispel the myth that microprocessors somehow "operate" on some fixed-sized dataset, and that 64-bit microprocessors can somehow "churn" through the data at twice the rate, and that this will somehow lead to a direct speedup in desktop applications.

This "myth of twice the performance" has so far only been brought up by 64bit skeptics. Certainly no one in *THIS* thread claimed anything more than SOME performance increase from 32-64 bit. Yet, as usual, all sorts of naysayers are compelled to crawl out of the woodwork harping on about "diminishing returns" and "limited use of 64bit data" and "if the FAQ says so, it *MUST* be true!!"

Still no one has answered my question: "what did 16->32 have that 32->64 doesn't?" ... The closest I hear is "64-bit variables aren't used very much" ... The simple retort is "maybe because there was a price to pay for using them." Now that the 64-bit desktop is imminent, maybe we should take Idontcare's advice and open our minds to new paradigms.


I wasn't trying to downplay anything.

I didn't say you TRIED to downplay anything.

But your FAQ doesn't discuss strings at all. Yet other articles I've read make the reasonable assumption that there can be HUGE performance gains from handling strings in 8-character (64-bit) chunks rather than 4-character (32-bit) ones.

Maybe it's a crock, or maybe it's an idea you didn't consider. Frankly, it's hard to imagine NOT getting a performance increase if twice as much string can be shuffled in each operation.
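The "8 characters per operation" idea can at least be sketched in C: move one 64-bit word (8 ASCII characters) per iteration instead of one byte at a time. The function name is invented for illustration, and whether this actually wins in practice depends on the compiler and memory system:

```c
#include <stdint.h>
#include <string.h>

/* Copy n bytes, moving one 64-bit word (8 chars) at a time where
   possible, then finishing byte by byte. Going through memcpy into a
   uint64_t sidesteps alignment traps; a decent compiler lowers it to
   plain loads and stores. */
static void copy_words(char *dst, const char *src, size_t n) {
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {   /* bulk: 8 characters per iteration */
        uint64_t w;
        memcpy(&w, src + i, sizeof w);
        memcpy(dst + i, &w, sizeof w);
    }
    for (; i < n; i++)             /* tail: leftover bytes */
        dst[i] = src[i];
}
```

Note this only needs wide registers, not wide pointers, which is part of why later posts in the thread argue string handling isn't really a 64-bit-addressing question.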
 

grant2

Golden Member
May 23, 2001
1,165
23
81
Originally posted by: Idontcare

What data do you have that supports such a claim? I am aware of no studies culminating data regarding the computing needs of 95% of the populace.

Heh don't get unreasonable by demanding actual proof of people's claims now :)
 

imgod2u

Senior member
Sep 16, 2000
993
0
0
Originally posted by: grant2
Originally posted by: Sohcan
I was merely trying to dispel the myth that microprocessors somehow "operate" on some fixed-sized dataset, and that 64-bit microprocessors can somehow "churn" through the data at twice the rate, and that this will somehow lead to a direct speedup in desktop applications.

This "myth of twice the performance" has so far only been brought up by 64bit skeptics. Certainly no one in *THIS* thread claimed anything more than SOME performance increase from 32-64 bit.

There shouldn't even be "some" increase in most of what consumers do. There is the increase in integer range and memory address space, but that's about it. Most of today's consumer-level applications are moving towards being FP-intensive (games, media programs, etc.). All that's left that still uses integer-intensive calculations is Office applications and maybe antivirus software. For those of us who have programmed some of these things: when's the last time you needed anything beyond a short (16-bit integer)? I certainly rarely ever do, but I use int simply because it's the standard integer. Most of the time, the numbers I deal with never go beyond 2000 or so.

Yet, as usual, all sorts of naysayers are compelled to crawl out of the woodwork, harping on about "diminishing returns" and "limited use of 64-bit data" and "if the FAQ says so, it *MUST* be true!"

The FAQ explained quite fully why it has limited use, not merely that it does. Did you actually read it?

Still no one has answered my question: "what did 16->32 have that 32->64 doesn't?" ... The closest I hear is "64-bit variables aren't used very much" ... The simple retort is "maybe because there was a price to pay for using them." Now that the 64-bit desktop is imminent, maybe we should take Idontcare's advice and open our minds to new paradigms.

I answered this above. The growth from 16-bit computing to 32-bit was a huge leap; it'll be quite some time before another one is needed in the desktop space.

I wasn't trying to downplay anything.

I didn't say you TRIED to downplay anything.

But your FAQ doesn't discuss strings at all. Yet other articles I've read make the reasonable assumption that there can be HUGE performance gains from handling strings in 8-character (64-bit) chunks rather than 4-character (32-bit) ones.

Strings, in most languages, are implemented as an array of characters. An ASCII character, as I recall, is 8 bits in size. There's really no need for bigger characters, as you only have a certain number of them (the alphabet, numbers, some other symbols), and strings are arrays referenced through pointers to memory addresses (and we're back to the memory-addressing part again). You know of any string in any program that could remotely reach 4 GB in size?

Maybe it's a crock, or maybe it's an idea you didn't consider. Frankly, it's hard to imagine NOT getting a performance increase if 2x as much string can be shuffled each operation.

Considering strings are not a primitive data type, the string itself isn't being "shuffled around" at all. The individual characters, which are themselves 8-bit integers, are shuffled around via pointers into the array, which are themselves memory addresses. And currently, 32 bits of memory address will give you up to a 4 GB string. No decent programmer would ever allow any string he uses to get to 4 GB. Word can't even bloat that much.

Of course there are places where larger numbers would help. Scientific calculators, for one. But realistically speaking, from someone who's actually had experience writing user-level programs, larger integers are definitely not needed.
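For concreteness, the integer ranges being alluded to, straight from `<limits.h>` on a typical platform (the helper function is just for illustration):

```c
#include <limits.h>

/* Typical ranges: a 16-bit short tops out at 32,767 and a 32-bit int
   at 2,147,483,647 -- comfortably beyond everyday counters like the
   "never beyond 2000 or so" values mentioned above. (The C standard
   only guarantees these as minimum magnitudes.) */
static int fits_in_short(long v) {
    return v >= SHRT_MIN && v <= SHRT_MAX;
}
```

This is the crux of the integer side of the argument: if your values fit in 16 or 32 bits, a 64-bit ALU adds range you never touch.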
 

MadRat

Lifer
Oct 14, 1999
11,999
307
126
The latency between memory and the controller, and from the controller to the core, is lower in the Hammer. This could be where it gains in the server space.