AMD "64 Bit Athlon" a Gimmick?

Hemsky

Member
Feb 8, 2007
59
0
0
I work in computer sales, and I recently had a customer ask what the advantages of going with a 64-bit processor were. I was stumped as to what to tell him.
I mean, when I bought my 3800+ I thought I was safe for the future, for when 64-bit software hit the market. It never did become mainstream. That was two years ago.
Sure, Far Cry offered a 64-bit version of the game, and Windows XP had a 64-bit edition, but it came with a host of problems. Even Vista comes standard in 32-bit.

I'm sure AMD's intentions were pure, and it was not necessarily a ploy on their part to sell more chips, but the A64 line is quickly disappearing without even utilizing its main feature!

I'm talking purely from an average consumer's point of view. Hell, I doubt many enthusiasts even use the 64-bit mode.

Anyone have any insight into this topic? Was it the first mistake AMD made? Presumptuous and perhaps overconfident in the market? Their market?
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Remind me, which viral marketing company do you work for? My memory isn't as good as it used to be, so I can't remember.
 
Feb 19, 2001
20,155
23
81
Uhh, AMD wanted a way to jump into 64-bit to hit Intel's Itanium hard. While IA-64 was natively 64-bit and Intel surely had ways to add 64-bit extensions to its P4 and Xeon processors, AMD wanted something to strike the server market fast, and that was the Hammer architecture. Sun convinced AMD to go with 64-bit extensions instead of attempting a whole new native 64-bit scheme, which had failed miserably for Intel. As a result, here we have it... 64-bit support is still iffy, and we won't have much for a while.
 

Sureshot324

Diamond Member
Feb 4, 2003
3,370
0
71
With the release of Vista, a LOT of people (myself included) are switching to a 64-bit operating system. It is finally going mainstream, which means software developers will finally start writing 64-bit software.

Making a 64-bit processor was definitely not a mistake on AMD's part. It gave them a huge marketing advantage and helped them sell a lot of processors, even though most of these early 64-bit AMD processors would not run 64-bit software for a long time, if ever. I was on my second 64-bit CPU when I upgraded to Vista, so my first one only ever ran 32-bit.

If I'm not mistaken, AMD64 was the first processor that could run both 64-bit and 32-bit software natively (Itanium had to emulate 32-bit software). This meant that MS built Windows x64 around AMD's 64-bit standard, and Intel had to license that standard from AMD, instead of the other way around. It helped establish AMD as a technology leader, instead of a company that just makes cheap knockoffs of Intel processors.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Originally posted by: Sureshot324

If I'm not mistaken, AMD64 was the first processor that could run both 64-bit and 32-bit software natively (Itanium had to emulate 32-bit software). This meant that MS built Windows x64 around AMD's 64-bit standard, and Intel had to license that standard from AMD, instead of the other way around. It helped establish AMD as a technology leader, instead of a company that just makes cheap knockoffs of Intel processors.

Yes, the AMD64 was revolutionary for this reason. Hardly a gimmick.

Ever hear the expression "If you build it, they will come"? Well, if there are no desktop 64-bit chips, why would software companies develop 64-bit operating systems and applications? :p

It's a big transition, and it takes time.

 

mechBgon

Super Moderator, Elite Member
Oct 31, 1999
30,699
1
0
The same AMD64 technology that might've seemed pointless in the consumer CPU at the time was nevertheless well-received in other areas. And the consumer CPU, even in 32-bit stuff, was a top-flight performer, so it's not as if its 64-bit capability hurt it by not being utilized.

Also, AMD had to make an architecture that would go the distance. It's still going the distance today, now that 64-bit is beginning to go mainstream, and it will keep going the distance until their next-gen architecture comes out. AMD made the right choice insofar as going with a 64-bit design, that's for sure. Maybe it got overhyped in marketing, but as I said above, it's not as if people didn't get their money's worth in 32-bit performance.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: MS Dawn
Yes the AMD64 was revolutionary for this reason. Hardly a gimmick.

Ever hear the expression "If you build it, they will come"? Well if there are no desktop 64bit chips, why would software companies develop 64 bit operating systems and applications? :p

It's a big transition and this takes time.
You're right, except for one thing: AMD didn't start building 64-bit-capable chips for the desktop market. They did it to take over the server world, which they did. At the time the Opteron was introduced, nearly every server in the world had a Pentium 4 in it, because not only were Itaniums extremely expensive, they were also not very user-friendly, even for sysadmins and people who write code for a living.

The fact that the Athlon 64s were very fast with 32-bit code is why nearly everyone who bought an A64 bought one, from what I could tell. The fact that they wouldn't be "left behind" if the software (especially OS) market ever did catch up was really just an added bonus to almost everyone who bought one for home use, it seemed to me.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Originally posted by: myocardia
You're right, except for one thing: AMD didn't start building 64-bit-capable chips for the desktop market. They did it to take over the server world, which they did. At the time the Opteron was introduced, nearly every server in the world had a Pentium 4 in it, because not only were Itaniums extremely expensive, they were also not very user-friendly, even for sysadmins and people who write code for a living.

The fact that the Athlon 64s were very fast with 32-bit code is why nearly everyone who bought an A64 bought one, from what I could tell. The fact that they wouldn't be "left behind" if the software (especially OS) market ever did catch up was really just an added bonus to almost everyone who bought one for home use, it seemed to me.

Take over the server world? Perhaps in their dreams. :p

Every server room I see is dominated by Intel silicon.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: MS Dawn
Originally posted by: myocardia
You're right, except for one thing: AMD didn't start building 64-bit-capable chips for the desktop market. They did it to take over the server world, which they did. At the time the Opteron was introduced, nearly every server in the world had a Pentium 4 in it, because not only were Itaniums extremely expensive, they were also not very user-friendly, even for sysadmins and people who write code for a living.

The fact that the Athlon 64s were very fast with 32-bit code is why nearly everyone who bought an A64 bought one, from what I could tell. The fact that they wouldn't be "left behind" if the software (especially OS) market ever did catch up was really just an added bonus to almost everyone who bought one for home use, it seemed to me.

Take over the server world? Perhaps in their dreams. :p

Every server room I see is dominated by Intel silicon.
Well, perhaps that was too strong a term. How about going from a 0.05% server market share to, what, a 35% server market share? They definitely turned the server world upside down. At that time, even the companies that were using Linux/Unix were running all 32-bit. How many server farms do you know of these days that run 32-bit? :D
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Originally posted by: myocardia

Well, perhaps that was too strong a term. How about going from a 0.05% server market share to, what, a 35% server market share? They definitely turned the server world upside down. At that time, even the companies that were using Linux/Unix were running all 32-bit. How many server farms do you know of these days that run 32-bit? :D

I honestly have no idea, but here's something even funnier: all the rooms wired to handle NetBurst will be able to handle twice the number of servers using more efficient silicon. :D
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: MS Dawn
Originally posted by: myocardia

Well, perhaps that was too strong a term. How about going from a 0.05% server market share to, what, a 35% server market share? They definitely turned the server world upside down. At that time, even the companies that were using Linux/Unix were running all 32-bit. How many server farms do you know of these days that run 32-bit? :D

I honestly have no idea, but here's something even funnier: all the rooms wired to handle NetBurst will be able to handle twice the number of servers using more efficient silicon. :D
Would you have said that before August 21, 2006? ;)
 

Chaotic42

Lifer
Jun 15, 2001
34,834
2,010
126
The main benefit that's going to affect most people is the support for addressing more than 4GB of memory.
 

Shaker8

Member
Jan 6, 2006
57
0
0
I can't remember exactly, but wasn't there something about it allowing for more virtual memory as well, which was supposed to help people with programs like Photoshop and others that used a lot of virtual memory?

EDIT: Found it

Larger virtual address space: current processor models implementing the AMD64 architecture can address up to 256 tebibytes of virtual address space (2^48 bytes). This limit can be raised in future implementations to 16 exbibytes (2^64 bytes). This is compared to just 4 gibibytes for 32-bit x86. This means that very large files can be operated on by mapping the entire file into the process's address space (which is generally faster than working with file read/write calls), rather than having to map regions of the file into and out of the address space.

From Wikipedia, the free encyclopedia
x86-64

The reason I think that's important (and I am probably wrong) is that if AMD64 had never come out, using an x86 processor on Vista would mean you could not address more than 2GB of virtual memory at any time. Is that correct? That would somewhat negate the purpose of having any more than 2GB of memory.

I just remember hearing that said a lot back before ClawHammer actually hit the market.
 

jackschmittusa

Diamond Member
Apr 16, 2003
5,972
1
0
This is one of those cases where the hardware had to come first, and the installed base of the older tech became the speed bump for implementation of faster, more capable software.

The same thing happened with CD-ROM drives. They didn't become popular till they got to 4X, which then became a plateau for quite a while (in tech terms). The installed base became so large that, until drives reached about 24X, virtually nobody but the companies that made burning apps wrote anything that actually utilized the drives at more than 4X, because they wanted to appeal to a larger market. (A lot of apps relied on the CD for storage at the time because hard drives were small and expensive. The CD had to be in the drive for many apps to use all of their functions and content.)

The gap between 32-bit and large-scale adoption of 64-bit software was always going to be there, so the sooner we started the journey, the sooner we'd get there. AMD did a good thing for us by getting the ball rolling.
 

Roguestar

Diamond Member
Aug 29, 2006
6,045
0
0
My my, FUD FUD FUD.

The best thing about the Athlon 64 for most people was not that it was 64-bit, but that it was the best processor at the time. 64-bit, for the average Joe consumer, was a bonus and an open door to future-proofing.

Needless to say, the server market was eaten up in large mouthfuls by AMD, thanks to the ability to natively execute 64- and 32-bit instructions.

And let's not just go on about XP x64 being a mess (even if it was), because not only does Vista help, but Windows isn't the only desktop OS out there. A few friends of mine run 64-bit Unix/Linux OSes and find the extra power and ability to address RAM a godsend.
 

Roguestar

Diamond Member
Aug 29, 2006
6,045
0
0
Originally posted by: Hemsky
the A64 line is quickly disappearing without even utilizing its main feature!

Hate to point it out, but the X2 series are dual-core Athlon 64 chips, so the only reason the vanilla A64 is going away is that it's being built on and bettered.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: SuperFreaky
real world: it allows you to run Win64 and use >3.2GB of RAM...

Some apps may also see performance improvements when recompiled for 64-bit due to the increased number of general-purpose registers (16 versus 8 for 32-bit x86). 64-bit mode also has additional SSE registers (16 XMM versus 8), which helps programs that were written with those extensions.

Programs that deal with vast amounts of data (such as photo editing, and video editing and encoding/decoding) may be more efficient if they can natively work on 64-bit chunks of data rather than 32-bit ones. However, x86 extensions such as SSE already allowed significantly more than 32 bits of data to be manipulated at once (albeit with more limitations).

But if you don't have more than 2GB of RAM, and you aren't running computationally intensive software (or CPU-limited games) that has a 64-bit version available, going to 64-bit buys you very little, IMO. Having two cores is more of a benefit to real-world OS performance in most cases.

It's nice to be future-proof, but 64-bit-only software will be optional for a long while in the desktop world. Writing 16-bit code was painful, since many programs needed access to more than 64K of memory, so 32-bit programming offered large and obvious benefits for all programmers; once 32-bit processors were available, the 16-bit ones were quickly abandoned. But few programs need access to more than 2GB of data at once, and very few desktop users even have that much physical RAM installed currently.
 

Sureshot324

Diamond Member
Feb 4, 2003
3,370
0
71
Originally posted by: archcommus
Hell I still don't know if I want 32 or 64-bit Vista this summer.

Go with 64-bit. It runs pretty much all 32-bit software just fine with no slowdown. That way you'll be more future-proof.
 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Originally posted by: fierydemise
How long did the switch from 16 to 32 bit take?

A number of years. IIRC, systems with 286 CPUs were still being advertised in Computer Shopper alongside 386 and 486 systems.