Intel exec sees 64-bit irrelevant for Home PCs now


Sahakiel

Golden Member
Oct 19, 2001
1,746
0
86
Originally posted by: batmanuel
Originally posted by: RobK
the itanium is a pretty big success these days. you guys need to get out from under your respective rocks.

Maybe if you are building a supercomputer. Even then, you'd probably be better off getting a thousand or so Apple Xserve G5s and clustering them.

Each supercomputer has its strengths and weaknesses. If you have the money to build a supercomputer, you have a specific application in mind and will build accordingly.
That said, VT's cluster is ranked so high based on flops. If you know anything about MIPS, you know how relatively useless the top500 list is.
 

cm123

Senior member
Jul 3, 2003
489
2
76
Come on, guys, just conform to whatever Intel says... When it suits them, 64-bit is irrelevant; other times, 64-bit rocks, the Intel way. And if Intel says we need a hot, slow next-gen CPU, then face it, we do, right? AMD knows nothing and Intel is all we need, right? Intel will tell us all what we need and when; after all, nobody could like AMD, right?

As we pointed out earlier this week in Omid's column, this 64 bit business is nothing but a phony war.

http://www20.tomshardware.com/column/20040220/index-02.html

I think there is much game-playing still to come on this subject. Not even everyone at Intel has their pointers all pointing in the same direction...
 

sonoran

Member
May 9, 2002
174
0
0
Originally posted by: pspada
Why the heck do you think Intel is talking about adding "64-bit extensions" into future Pentium chips?
To ensure that their competition doesn't take additional market share when people do start to need/want the capability? Duh.

As for the relevance of 64 bits - a quick review of your "top ten rigs" reveals that your main machine has 1.5GB RAM. Every other machine is 256MB or less, many just 128MB. So how is 64 bits relevant to you - right now? You don't even have one machine maxed out yet. And I'm guessing you would consider yourself significantly ahead of the average home user? The majority of home machines being sold today have 256MB-512MB RAM. They would be hard pressed to see any benefit from a 64 bit CPU. (The extra registers in x86-64 would be beneficial, but extra registers could be added to x86-32 - and neither Intel nor AMD has done that. THAT is a completely separate argument from "64-bit irrelevant".)
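The address-space arithmetic behind that point is easy to make explicit. A quick sketch in plain Python (the 2^32 and 2^64 figures are the standard pointer-width limits; the 256 MiB machine is just the example from the post above):

```python
# Address-space arithmetic behind the "4GB limit" argument.
# A 32-bit pointer can name 2**32 distinct byte addresses;
# a 64-bit pointer can name 2**64.

addr_space_32 = 2 ** 32
addr_space_64 = 2 ** 64

print(addr_space_32 // 2 ** 30)  # 4  -> GiB addressable with 32-bit pointers
print(addr_space_64 // 2 ** 60)  # 16 -> EiB addressable with 64-bit pointers

# A typical home PC of the day with 256 MiB of RAM occupies only a
# sixteenth of even the 32-bit address space:
print(256 * 2 ** 20 / addr_space_32)  # 0.0625
```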

Ever heard of the term "hype cycle?" If not, give this a read: Hype Cycle. I might be so bold as to suggest that for the home user, 64 bit computing is in the "Trough of disillusionment" right now - no Windows support, lack of device drivers, mainboards that can't begin to support more than 4GB RAM, excessive cost of 4GB+ of RAM even if it did work - the list goes on. We won't always be there, but that's where we are today.

The position that 64 bits is not needed by the vast majority of home users today is based on the facts. Even so, I do wish they'd can that line...

*** Obviously not speaking for Intel Corp ***
 

pspada

Platinum Member
Dec 23, 2002
2,503
0
0
So then, Intel was buying into the "hype cycle" when they created their Itanium and Itanium 2 chips? No, they were trying to break into the serious server market - and basically failed, whereas AMD has done just that. While 64-bit chips are not the staple of home users now, there can be no doubt that 64-bit is the future of all but embedded computing. My next major upgrade will be to the A64 platform - and I don't care if my current software takes advantage of the 64 bits now. It will soon enough.

And I have proof, which is Intel adding 64-bit extensions to their chips that are fully command-compatible with AMD's 64-bit processors.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
Originally posted by: pspada
So then, Intel was buying into the "hype cycle" when they created their Itanium and Itanium 2 chips? No, they were trying to break into the serious server market - and basically failed, whereas AMD has done just that. While 64-bit chips are not the staple of home users now, there can be no doubt that 64-bit is the future of all but embedded computing. My next major upgrade will be to the A64 platform - and I don't care if my current software takes advantage of the 64 bits now. It will soon enough.

And I have proof, which is Intel adding 64-bit extensions to their chips that are fully command-compatible with AMD's 64-bit processors.
I'm not even sure you read the article that sonoran linked...

At any rate, there wasn't a Hype Cycle associated with Itanium, because 64-bit at the enterprise level is an established feature. Sure, EPIC is new, but the need for and benefit of 64-bit computing had already arrived in the high-end server marketplace.

Btw, I have no idea how you figure that AMD has succeeded in the high-end server market whereas Intel has "failed".

64-bit computing is not really relevant for the desktop yet because the desire for >4GB of addressable RAM is non-existent. Do you even realize that 6GB of CL3 PC3200 is ~$3000 by itself???

I have little doubt that someday 64-bit will be very much desired on the desktop. But for now, it is a perceived benefit, and not a reality.


All that being said, I'm not saying that Hammer is not an impressive performer. I'm just saying that it's not because of "64-bit."





 

videoclone

Golden Member
Jun 5, 2003
1,465
0
0
sonoran!! You should really read up about 64-bit... because your information on it surrounds one aspect: memory addressing over 4GB. In reality, adding more memory to your system above 4GB doesn't really improve anything. BUT if you knew something else about 64-bit, you would understand that the 64-bit extension adds more than just registers and larger memory addressing: it adds a larger data path, just like in video cards, where for example an 8x1-pipeline ATI 9800 GPU processes twice as much data as a 4x1-pipeline ATI 9600.

This wider path is already showing large improvements in DivXing and movie encoding; it will also improve once software is written in 64-bit... I suggest you stop talking bullsheet and put that 4GB of RAM up your !#)^#$@:?SS
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
Originally posted by: videoclone
sonoran!! You should really read up about 64-bit... because your information on it surrounds one aspect: memory addressing over 4GB. In reality, adding more memory to your system above 4GB doesn't really improve anything. BUT if you knew something else about 64-bit, you would understand that the 64-bit extension adds more than just registers and larger memory addressing: it adds a larger data path, just like in video cards, where for example an 8x1-pipeline ATI 9800 GPU processes twice as much data as a 4x1-pipeline ATI 9600.
Wrong. This isn't a 64-bit data bus... It's 64-bit processing. Basically it means that calculations can be more precise because the CPU uses a larger range of integers. The ability to address greater than 4GB of RAM really is the highlight of 64-bit on the desktop. And you are correct... At this point, there really isn't any benefit to that on the desktop.

Check out the AnandTech FAQ, The myths and realities of 64-bit computing... There is a lot of good info there on just how 64-bit works.
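For the curious, the "larger range of integers" point can be made concrete in a couple of lines of plain Python (exact powers of two, no assumptions):

```python
# Native integer range at 32 vs 64 bits. Anything beyond the 32-bit
# maximum forces a 32-bit CPU into multi-word arithmetic in software;
# a 64-bit CPU handles it in a single register.

u32_max = 2 ** 32 - 1
u64_max = 2 ** 64 - 1

print(u32_max)  # 4294967295            (~4.29e9)
print(u64_max)  # 18446744073709551615  (~1.84e19)
```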
Originally posted by: videoclone
This wider path is already showing large improvements in DivXing and movie encoding; it will also improve once software is written in 64-bit...
I can only presume you are basing this on a few numbers reported on the internet. What you need to ask yourself is whether this is a result of more precise calculations, or due to the added registers and optimizations for AMD's processor.
Originally posted by: videoclone
I suggest you stop talking bullsheet and put that 4GB of RAM up your !#)^#$@:?SS
Holy cow! Why so hostile? Why are you taking it so personally? Why is it unacceptable to you that someone else has a differing opinion?
I've read your posts and I understand that you are a huge AMD fan... Why some people get so emotionally committed to a corporation, I'll never understand. Nor do I understand why someone would take such a zealous stance and limit their choices.

At any rate... You might want to keep more of an open mind. There is a lot to learn about this stuff, and a lot of good info can be had from this site and these msg boards. :)

 

videoclone

Golden Member
Jun 5, 2003
1,465
0
0
Wingznut, it's not that I'm a fan of AMD; it's just that I hate Intel... They try to tell me what I want. They tell me I don't need 64-bit; they tell me I don't want extra cache in my CPU. BUT I DO AND I NEED IT, and to me fast is never fast enough. The only reason Intel released 800FSB Pentium 4s with Hyper-Threading is because AMD took the first step up and introduced the Barton core. The only reason Intel is now going to give us 64-bit CPUs and even more features otherwise reserved for "servers" is because AMD took that first step yet again with the AMD64, and Intel is forced to follow. If AMD didn't take all these first steps, then Intel would sit on its throne of dictation and shower us with Celerons (all the while telling us that this is what we need), and only the corporate office would use Pentiums, just like back in the old days when Intel was the only feasible choice and we all had Celerons. Do you remember those days? (1998-1999) Mind you, those days ended as soon as AMD came in with the K7 Athlon.

Either way, Intel has always followed this trend, and it still hasn't changed... The day Intel starts giving us new things, not because it has to retake the performance crown but because it's improving on something that we want, is the day I will start liking Intel.

PS: I've been into reading tech info for the past 8 years now, and I've read up on everything about the history of CPUs and future CPUs... that being, CPU roadmaps. (Oh, and the Intel CPU roadmap changes and improves every time AMD wakes them up with a better CPU... the AMD roadmap stays pretty much intact, because they actually want to give us better CPUs, unlike Intel.)
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Videoclown.....I know you hate Intel, as it resonates in all your threads... However, go and answer the fact that you jumped up someone's arse, and it seems it is you who knows little about 64-bit architecture, or at least AMD's 64-bit architecture.....


You want to be mad at someone??? Be mad at the software developers. They are the ones falling behind. This MHz race is like the cold war. Each side is racing the speed up, and for the majority of average PC users (and realize you and I are not the true average users) we have much more power than we need for a majority of our uses. I could encode a DivX movie with my 750 Tbird; it just took longer. This race of speed is giving us an illusion of performance. The fact is DirectX 9.0 has been out for quite a while, and how many games that are actually out currently implement it??? The fact is HT has now been out for almost a year, and why are most programs not multithreaded?? They could be, but developers are way behind, in my mind. I see simple program updates pulling leaps in performance without even upgrading the PC.....

The fact is we are almost a full GHz more than we need now for gaming and most things you do..... If AMD wasn't pushing the envelope, the market would still dictate what is needed. The digital world has fueled some of this large jump in CPU speed in recent years, so even without AMD pushing it I firmly believe we would be in the mid-2GHz range.



As for the claim HT was an answer to the Barton... What are you smoking???? The 533FSB was competing just fine, and the 3.06GHz 533FSB chip had HT capability, and back then there were few if any HT-enabled apps, and the tests rarely showed any of them or multitasking.... A 3.06GHz 533FSB w/ HT was very equal to the then 3200+ Barton. Intel was competitive without the P4C... It was the P4C that started embarrassing the Barton line and showing the weaknesses of it and the PR rating.... A 2.8C in most comparisons is equal to a 3200+ Barton..... If anything, the P4Cs were a preemptive strike before the AMD64s, without really knowing what speed they would hit the market at. Except for gaming, the P4Cs are still competitive and actually better in several areas than the A64s... So the P4C was not answering the Barton; it was considerably pushing past it....
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
videoclone... I realize that you aren't going to believe me, but I'll give it a shot anyway.

Intel goes where the market is. They are in this business to make money. And you don't do that by dictating what people get. You get that successful by giving people what they want.

And x86-64 is a perfect example. The fact of the matter is that 64-bit isn't going to bring much of anything to the desktop. However, when AMD was the first to bring it to market, they became the innovator. Intel stated that 64-bit is pretty irrelevant on the desktop at this time. But because there is public demand for x86-64, Intel is going to deliver earlier than anticipated. (Although it'll only be available for the workstation and low-end server environments for now.)

RDRAM is another example. RDRAM is actually a better technology than DDR. But it became apparent that DDR is what the masses wanted, not RDRAM. So, Intel conformed to what the public wanted.

Now let me address a couple of your points...
They tell me I don't need 64-bit; they tell me I don't want extra cache in my CPU. BUT I DO AND I NEED IT, and to me fast is never fast enough.
Isn't it just possible that Intel is correct about 64-bit? Read more about 64-bit (starting with the FAQ that I linked), and keep an open mind.
As for cache, I'm not sure what you are getting at... They have both 1MB and 2MB on-die cache CPUs, they delivered 512K cache before AMD, etc.
The only reason Intel released 800FSB Pentium 4s with Hyper-Threading is because AMD took the first step up and introduced the Barton core...
This is so very wrong... It takes YEARS to develop, validate, and manufacture a cpu. These things are planned out long before the competition releases a product.
If AMD didn't take all these first steps, then Intel would sit on its throne of dictation and shower us with Celerons (all the while telling us that this is what we need), and only the corporate office would use Pentiums, just like back in the old days when Intel was the only feasible choice and we all had Celerons. Do you remember those days? (1998-1999) Mind you, those days ended as soon as AMD came in with the K7 Athlon.
Celerons are available because the market dictates it. The consumers want a lower cost option.
I'm presuming you are referring to the prices. Well, if you go back to when the 1ghz K7 was introduced, you'd see that it cost $1299. (Sorry, ElFenix. ;) ) The competition is not what caused prices to drop... It's the market. Things were never more competitive than the race to 1ghz, yet prices were still at an all time high. But the demand has changed. If both Intel and AMD could sell as many processors for $1300 today as they did in 1999, there is no doubt they would. Basic economics at work, here. Do you really believe AMD is happy losing money quarter after quarter (until Q1 '04)? Of course not... They'd love to be able to charge $1000 per cpu, and they will if the market allows.
Take note that as the economy gets better, the price of the top bin products is getting higher and higher.
I've been into reading tech info and have been doing so for the past 8 years now online, and I've read everything about the history of CPUs and future...
I understand that. But the fact of the matter is that what you read on the internet isn't always fact, nor is it always accurate. It wasn't until I got into the semiconductor industry that I truly realized how much I THOUGHT I KNEW from all the years I'd spent as an enthusiast.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: WobbleWobble
Originally posted by: batmanuel
"No one will ever need more than 640k of memory"-Bill Gates,1985

That quote is just an urban legend.

No... that quote is taken out of context. Bill was speaking of a specific use and a specific platform. (as I remember)

*EDIT* Also, may I add that it's important for hardware to advance ahead of software. If it doesn't, you run into the situation of not being able to do what you want, or having to do it very slowly. Do I need the dynamic range of a 64-bit processor to kill my friend down the street in Unreal Tournament 2003? Of course not... Will I need the dynamic range of a 64-bit processor to kill my friend down the street in Unreal Tournament 2006? Possibly... maybe not... but wouldn't it be nice to have the hardware already if it is necessary?

I remember using my old 386SX back in '91 or '92 and double-clicking the MS Word icon... going to grab a glass of coke, grab a bag of chips, turn the TV on, then come back to the computer and sit for a few more seconds watching the buttons appear one by one on the toolbar. I definitely do not want a repeat of that ever again.
 

Sahakiel

Golden Member
Oct 19, 2001
1,746
0
86
Originally posted by: Jeff7181
*EDIT* Also, may I add that it's important for hardware to advance ahead of software. If it doesn't, you run into the situation of not being able to do what you want, or having to do it very slowly. Do I need the dynamic range of a 64-bit processor to kill my friend down the street in Unreal Tournament 2003? Of course not... Will I need the dynamic range of a 64-bit processor to kill my friend down the street in Unreal Tournament 2006? Possibly... maybe not... but wouldn't it be nice to have the hardware already if it is necessary?

I remember using my old 386SX back in '91 or '92 and double-clicking the MS Word icon... going to grab a glass of coke, grab a bag of chips, turn the TV on, then come back to the computer and sit for a few more seconds watching the buttons appear one by one on the toolbar. I definitely do not want a repeat of that ever again.
In that case, why bother with x86-64? Scrap the x86 ISA and go with something better suited. The only reason AMD developed 64-bit extensions for x86 in the first place was because they couldn't develop EPIC processors and didn't have the funding to develop an equivalent themselves. EPIC was announced in 1994, I believe, and the basic technology has been around since 1983 or earlier.


Oh, and if anyone thinks it's enough to read stuff off hardware review sites to learn about computer architecture, I'll have to agree with Wingznut (damnit) and say it's not enough. I was in the same boat and thought I knew quite a bit, but while that knowledge sure made general concepts easier to pick up, it's only a mere hint of how much information is available. Until you're stuck with 7 weeks to write a ten-page technical paper (on the level of ISCA or above) on branch prediction latency, you have little appreciation for what is truly a forbidding subject.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Sahakiel
Originally posted by: Jeff7181
*EDIT* Also, may I add that it's important for hardware to advance ahead of software. If it doesn't, you run into the situation of not being able to do what you want, or having to do it very slowly. Do I need the dynamic range of a 64-bit processor to kill my friend down the street in Unreal Tournament 2003? Of course not... Will I need the dynamic range of a 64-bit processor to kill my friend down the street in Unreal Tournament 2006? Possibly... maybe not... but wouldn't it be nice to have the hardware already if it is necessary?

I remember using my old 386SX back in '91 or '92 and double-clicking the MS Word icon... going to grab a glass of coke, grab a bag of chips, turn the TV on, then come back to the computer and sit for a few more seconds watching the buttons appear one by one on the toolbar. I definitely do not want a repeat of that ever again.
In that case, why bother with x86-64? Scrap the x86 ISA and go with something better suited. The only reason AMD developed 64-bit extensions for x86 in the first place was because they couldn't develop EPIC processors and didn't have the funding to develop an equivalent themselves. EPIC was announced in 1994, I believe, and the basic technology has been around since 1983 or earlier.


Oh, and if anyone thinks it's enough to read stuff off hardware review sites to learn about computer architecture, I'll have to agree with Wingznut (damnit) and say it's not enough. I was in the same boat and thought I knew quite a bit, but while that knowledge sure made general concepts easier to pick up, it's only a mere hint of how much information is available. Until you're stuck with 7 weeks to write a ten-page technical paper (on the level of ISCA or above) on branch prediction latency, you have little appreciation for what is truly a forbidding subject.

Should we bow down or something?

All I'm saying is, hardware has to advance ahead of software... if it's the other way around, you have pissed off users.
 

Sahakiel

Golden Member
Oct 19, 2001
1,746
0
86
To be honest, that second half was directed at a different audience.

All I'm saying is, hardware has to advance ahead of software... if it's the other way around, you have pissed off users.

On the other hand, if you have software develop ahead of hardware, you have pissed off users willing to plunk down money for upgrades.. ^^
What you're referring to has been termed "killer apps." Back in those days, there was software a large majority of people used which was very useful and could seriously benefit from faster hardware. These days, the only reason people are arguing over whether hardware should outpace software is because there are no "killer apps" for the vast majority of users. It would be a nice time to finally kill x86.
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
Originally posted by: Sahakiel
To be honest, that second half was directed at a different audience.

All I'm saying is, hardware has to advance ahead of software... if it's the other way around, you have pissed off users.

On the other hand, if you have software develop ahead of hardware, you have pissed off users willing to plunk down money for upgrades.. ^^
What you're referring to has been termed "killer apps." Back in those days, there was software a large majority of people used which was very useful and could seriously benefit from faster hardware. These days, the only reason people are arguing over whether hardware should outpace software is because there are no "killer apps" for the vast majority of users. It would be a nice time to finally kill x86.


i seriously believe we are just in a lull because of lowered R&D spending due to the weak economy. i grew up with the computer, and spent most of my childhood reading about it.

i know in the old days, killer apps were coming out every other week. voice recognition, handwriting recognition, etc. still aren't finished; tablet PCs come back every 3 years, etc.

but yeah, back in the day office suites were killer apps. then around the early to mid '90s there was a lull. then the internet happened.

the next killer app is always around the corner, and you won't even know what it is until it has pervaded the very existence of computing.


back in the day, computers were always slower than the software that everyone wanted. computers have reached the mass market now, though, but there is always an app for every field that requires more power. sound editors still have their killer apps, video has its apps, gamers have their killer apps, and so on.


the pace of upgrading has also increased, so the "slowdown" isn't there; we just upgrade. back in the day you bought a 386 and it was like buying an Xbox: it could last you 4 years.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Sahakiel
To be honest, that second half was directed at a different audience.

All I'm saying is, hardware has to advance ahead of software... if it's the other way around, you have pissed off users.

On the other hand, if you have software develop ahead of hardware, you have pissed off users willing to plunk down money for upgrades.. ^^
What you're referring to has been termed "killer apps." Back in those days, there were software a large majority of people used which were very useful and could seriously benefit from faster hardware. These days, the only reason people are arguing over why hardware should outpace software is because there are no "killer apps" for the vast majority of users. It would be a nice time to finally kill x86.

To spark some more thoughts... could it be too late to kill x86? The vast majority of computers are x86 based... imagine the upgrades that would be necessary to complete the switch. Those who can't afford to upgrade are stuck with using old and most likely unsupported hardware and software, as switching over to a different ISA would most likely require a lot of developer time, right? So not much time could be spared to work on bugs or updates in current software.

IIRC, the reason everyone who has stuck with x86 has in fact done so, is because an x86 processor is cheaper to manufacture than say, a RISC processor, is it not? Or was that only true way back when the majority of home users were deciding what was better, a Mac or a PC?

Another thing... is there anything an x86 processor can do that a RISC processor or I dunno... an Itanium can't do? Or can't do as well? (with the exception of running 32-bit software on an Itanium)
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
back in the day you bought a 386 and it was like buying an xbox it could last you 4 years.
And that 386 could have cost you as much as $3000 easily and still wouldn't run all your applications at what we would consider satisfactory speed today. Now, the average person could buy a computer for $500 and not notice the difference between that computer and a $3000 computer.
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
We need to stop looking at ourselves all the time when we try to understand this..... I built 5 AMD systems for family and friends in the last 6 months..... For each and every one of them I bought roughly a 2100+ Tbred, and trust me, this is overkill for their uses. I bought one Barton 2500+ for a friend who is a gamer, but we all know at these levels the graphics card is more important, and I got him an ATI 9600XT....

The fact is those CPUs (2100+) are seriously behind today in current speed, cache, bus, no 64-bit, etc. Yet these computers, knowing their uses, will be more than enough for them for 2-3 years before I will need to upgrade. I likely can just up their RAM and HDD storage for a while..

The majority of software is wayyyyy behind current technology, yet we talk about 4.5GHz next year.... What killer apps do we have today? It is all a matter of just faster number crunching, faster encoding, faster rendering, etc. We can run these programs now. We are just cutting off a few minutes and seconds more each time.


I think the other hardware is starting to catch up with some good improvements... We will have DDR2, PCI Express, faster SATA HDDs, and better LCD technology... I think a lull in this arms race would be nice, to let the other components of a PC get up to snuff....
 

pspada

Platinum Member
Dec 23, 2002
2,503
0
0
<snip> and put that 4GB of RAM up your !#)^#$@:?SS

4GB of RAM? As long as I can keep it afterwards, you are welcome to ram it up my whatever. :gift:
 

Sahakiel

Golden Member
Oct 19, 2001
1,746
0
86
Originally posted by: Jeff7181

To spark some more thoughts... could it be too late to kill x86? The vast majority of computers are x86 based... imagine the upgrades that would be necessary to complete the switch. Those who can't afford to upgrade are stuck with using old and most likely unsupported hardware and software, as switching over to a different ISA would most likely require a lot of developer time, right? So not much time could be spared to work on bugs or updates in current software.
The same argument has been used for Macs. Granted, Macs comprise such a small market share, but the fact is software can be re-written for a new ISA and be finished within reasonable time limits. The only problem is forcing software companies to do so which would be entirely possible by simply dropping x86 support across the entire market. As for the bugs and updates, re-compiling for a different ISA actually helps expose bugs in code. Taking the time to port over to a new architecture would help iron out potential problems and cut back on required updates. Unless, of course, releasing software updates every month or so is a good thing.
The key here is the required developer time. It is the only reason x86 has lasted so long.

IIRC, the reason everyone who has stuck with x86 has in fact done so, is because an x86 processor is cheaper to manufacture than say, a RISC processor, is it not? Or was that only true way back when the majority of home users were deciding what was better, a Mac or a PC?
Actually, no; an equivalent RISC processor is cheaper to manufacture. That was a primary focus of the RISC philosophy. If you want to take a look at the current x86 incarnations, I think it's safe to say that dropping x86 opcodes would go a long way toward simplifying the front end. The rest of the processor probably wouldn't need much in the way of additional hardware, if any.

Another thing... is there anything an x86 processor can do that a RISC processor or I dunno... an Itanium can't do? Or can't do as well? (with the exception of running 32-bit software on an Itanium)
About the only thing I can think of is code compaction. I would guess an x86 program would be somewhat smaller than, say, an EPIC version of the program. This is, of course, neglecting differences in OS and other hardware.
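Sahakiel's code-compaction point can be illustrated with back-of-the-envelope numbers. x86 instructions are variable-length (1 to 15 bytes, with many common ones encoded in 1-3 bytes), while classic RISC ISAs use a fixed 4-byte encoding. The instruction mix below is a purely hypothetical example, not a measurement of any real program:

```python
# Hypothetical routine: 100 "short" x86 instructions averaging 2 bytes,
# 40 "long" ones averaging 5 bytes. A fixed-width RISC encoding spends
# 4 bytes on every one of the same 140 instructions.

x86_bytes = 100 * 2 + 40 * 5   # variable-length encoding
risc_bytes = (100 + 40) * 4    # fixed 4-byte encoding

print(x86_bytes, risc_bytes)   # 400 560 -> the x86 version is ~29% smaller
```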
 

spanner

Senior member
Jun 11, 2001
464
0
0
Who says all PC users only play games, surf the internet, and use Office? I can use all the processing power I can get, and that includes 64-bit for my research, but I am not going to pay extra for it. How many businesses do you know that buy Xeons/Itaniums for all their programmers? All they get is bog-standard PCs. The same goes for university labs. Basically, what I am saying is if AMD can give me 64-bit for cheap, then I will use it.
 

cm123

Senior member
Jul 3, 2003
489
2
76
Originally posted by: spanner
Who says all PC users only play games, surf the internet, and use Office? I can use all the processing power I can get, and that includes 64-bit for my research, but I am not going to pay extra for it. How many businesses do you know that buy Xeons/Itaniums for all their programmers? All they get is bog-standard PCs. The same goes for university labs. Basically, what I am saying is if AMD can give me 64-bit for cheap, then I will use it.

I'm glad to see you willing to use AMD in the business place; however, are all the heads of corporations ready to try AMD64?

How many who use AMD at home are willing to push/use AMD at the workplace?

Looking past the fanboys for each company, AMD has a good chance right now to get some market share in the business place, as even Intel has now decided to use AMD's 64-bit technology...
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: cm123
Originally posted by: spanner
who says all PC users only play games, surf the internet and use office. I can use all the processing power I can get and that includes 64bit for my reasearch but I am not going to pay extra for it. How many businesses do you know that buy Xeons/itaniums for all their programmers? All they get is box standard PCs. The same goes for university labs. Basically what I am saying is if AMD can give me 64bit for cheap then I will use it.

I'm glad to see you willing to use AMD in the business place; however, are all the heads of corporations ready to try AMD64?

How many who use AMD at home are willing to push/use AMD at the workplace?

Looking past the fanboys for each company, AMD has a good chance right now to get some market share in the business place, as even Intel has now decided to use AMD's 64-bit technology...

I guess Intel is calling it IA-32e now