What do you want from AMD?


soonerproud

Golden Member
Jun 30, 2007
Originally posted by: keysplayr2003
And lastly, what the heck is Nvidia going to do when all this comes to town? Of course they will have a customer base for their discrete graphics GPUs, but for how long? I know it will take years, but discrete graphics might actually go away, depending on how "Fusion-like" and "Larrabee-like" the market is. Nvidia needs to follow suit. Anyway, I'm not trying to go OT on ya, just thinking about the repercussions of AMD selling ATI in light of Intel's roadmap, and also thinking about what Nvidia might do.

I suspect that Nvidia and Intel are going to sign a cross-licensing agreement for their technologies. This would make perfect sense for both companies, in that Intel would gain the technology to create a competitive GPU and Nvidia would have all the tech they need to create an x86 CPU. Nvidia has no choice but to create their own x86 CPU with Fusion-like technology if they are to survive the next 10 years. Intel would save possibly years of R&D (and billions of dollars) in creating competitive graphics to use in their version of Fusion.

It is an interesting time we live in when it comes to processor technology.
 

heyheybooboo

Diamond Member
Jun 29, 2007
I want AMD to partner with IBM in Malta on 32nm to 22nm.

Other than that . . . keep on truckin' and pay down that debt!
 

heyheybooboo

Diamond Member
Jun 29, 2007
Originally posted by: soonerproud

I suspect that Nvidia and Intel are going to sign a cross-licensing agreement for their technologies. This would make perfect sense for both companies, in that Intel would gain the technology to create a competitive GPU and Nvidia would have all the tech they need to create an x86 CPU. Nvidia has no choice but to create their own x86 CPU with Fusion-like technology if they are to survive the next 10 years. Intel would save possibly years of R&D (and billions of dollars) in creating competitive graphics to use in their version of Fusion.

It is an interesting time we live in when it comes to processor technology.

This is off-topic but I'll bring it back in. Intel wants to continue to expand its desktop, mobile and enterprise *platforms*. One of the great benefits to AMD in the acquisition of ATI is the ability to develop and expand their own platforms.

nVidia is stuck in what may become an untenable position. Their greatest growth over the last few years has been in their chipset business. Intel does not want to sell nVidia platforms. Intel wants to develop and expand its own platforms, graphics technologies and processing units.

AMD/ATI upped the ante on Intel. Multi-core technology is advancing toward 'cafeteria' CPUs where individual cores are committed to encryption, graphics, physics, math, etc. You will be able to specialize or 'personalize' your own technology within an AMD platform (a minimal sketch of the idea follows at the end of this post). Chipzilla will evolve and (hopefully) open their platforms in a similar fashion.

When combined with AMD's Torrenza open-socket designs (and Intel's own open-architecture plans), this means nVidia has to evolve their designs into specialized processing.
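
To make the 'cafeteria CPU' idea above concrete, here is a minimal sketch of tag-based dispatch, written in plain C. Everything in it is hypothetical illustration (the task kinds, the handler names, the software switch), not AMD's actual mechanism - in a real Fusion/Torrenza design the routing to specialized cores or co-processors would happen in hardware, not in application code.

#include <stddef.h> /* size_t */
#include <stdio.h>  /* printf */

typedef enum { TASK_CRYPTO, TASK_GRAPHICS, TASK_PHYSICS, TASK_MATH } task_kind;

typedef struct {
    task_kind kind;      /* which specialized unit this work is tagged for */
    const char *payload; /* stand-in for the actual work */
} task;

/* Stub handlers standing in for specialized cores / Torrenza-style accelerators. */
static void run_on_crypto_core(const task *t)   { printf("crypto core:   %s\n", t->payload); }
static void run_on_graphics_core(const task *t) { printf("graphics core: %s\n", t->payload); }
static void run_on_physics_core(const task *t)  { printf("physics core:  %s\n", t->payload); }
static void run_on_math_core(const task *t)     { printf("math core:     %s\n", t->payload); }

/* The "cafeteria line": inspect the tag and hand the task to the matching unit. */
static void dispatch(const task *t)
{
    switch (t->kind) {
    case TASK_CRYPTO:   run_on_crypto_core(t);   break;
    case TASK_GRAPHICS: run_on_graphics_core(t); break;
    case TASK_PHYSICS:  run_on_physics_core(t);  break;
    case TASK_MATH:     run_on_math_core(t);     break;
    }
}

int main(void)
{
    task queue[] = {
        { TASK_CRYPTO,   "encrypt a disk block"  },
        { TASK_GRAPHICS, "rasterize a frame"     },
        { TASK_PHYSICS,  "step a rigid-body sim" },
        { TASK_MATH,     "dense matrix multiply" },
    };
    for (size_t i = 0; i < sizeof queue / sizeof queue[0]; i++)
        dispatch(&queue[i]);
    return 0;
}

The point of the sketch is only the shape of the idea: the tag travels with the work, and the dispatcher (hardware, in the real case) picks the specialized unit.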
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Viditor
Originally posted by: Arkaign
Originally posted by: Viditor
Originally posted by: keysplayr2003
Originally posted by: SickBeast
Originally posted by: Killrose
What do I want from AMD? Basically some sort of performance parity.
It wasn't very long ago that an AMD/ATI combo would destroy anything else on the market. Now you put them together and get second-best performance on both counts (and really there are only 2 major companies making each type of product anyway).

It's almost as though they don't care anymore. They miss deadline after deadline with products that don't compete well anyway. Their PR/marketing departments repeatedly lie.

Really, AMD is a company with enormous potential. I was pretty excited when they purchased ATI and I thought that they would be a force to be reckoned with. So much for that idea...

If AMD sold off ATI, they would be gone in two years. They "needed" to buy ATI. They may have learned that Intel was going into CPU/GPU/IMC all in one chip and needed to act. Or they may have been ahead of everyone else and anticipated that this is where the market would go.
They can't sell their ATI division (if that's what it's called). Nehalem, let alone Larrabee, will have integrated graphics, as per Intel's IDF four-core Nehalem preview.

AMD has a substantial amount of growing pains to deal with. They obviously need more time to iron things out.

And lastly, what the heck is Nvidia going to do when all this comes to town? Of course they will have a customer base for their discrete graphics GPUs, but for how long? I know it will take years, but discrete graphics might actually go away, depending on how "Fusion-like" and "Larrabee-like" the market is. Nvidia needs to follow suit. Anyway, I'm not trying to go OT on ya, just thinking about the repercussions of AMD selling ATI in light of Intel's roadmap, and also thinking about what Nvidia might do.

QFAT (Quoted For Absolute Truth)
And here I was, despairing that nobody else was really understanding this very simple fact...thanks keys!
AMD didn't spend all of that money on ATI because they wanted to, they well knew what financial straits lay ahead of them and how hard it was going to hit them in the short term.
AMD bought ATI because they absolutely HAD to, or risk being permanently buried in another 3-4 years.

Just as multi-core was the obvious "next thing" (as leakage claimed more and more efficiency from any and all advances), CPU/GPU is now the obvious "next thing" going forward...it enhances design at almost every level and decreases system cost and power.
Both Intel and AMD have known this and have been making acquisitions in order to get to a level where they can execute on it.

Your logic is ludicrous. AMD has made chipsets before, and AMD could have made their own video products. AMD could have bought a *much* less expensive graphics company. There were so many alternatives, and still are. *MUST* buy ATI? Please. CPU+GPU in one package is also overrated, when a decent video-integrated chipset is dirt-cheap anyway. GPU performance changes too quickly to think that a GPU/CPU combo will be relevant for more than 6-12 months at a time in the BEST scenario.

If AMD is still kicking in 12-18 months, I'd be surprised. It's looking like they will go the way of 3dfx/Cyrix.

Ahhh...the good old "let's use magic instead of science" argument.

1. Please list for me the experienced integrated-graphics companies that "AMD could have bought"...you can't just wave a magic wand and have a graphics product, nor can you just hire Moe, Larry, and Curly and have them do it for you! There's a very good reason companies like ATI and Nvidia have thousands of engineers on the payroll...

2. Developing the original chipsets cost AMD more than developing the CPUs because they had to hire a whole new division to do it. Their chipset division was still very, very small when they bought ATI, and it didn't develop anything like integrated chipsets (you DO know that there are many different kinds of chipsets, yes?).

3. CPU+GPU allows for a significant reduction in power usage on both mobile and server platforms (systems that don't require high-end graphics), and once a new graphics ISA is developed (which is what the CTM project has been doing) it should allow for higher-end graphics as well. Note that this graphics solution will look nothing like what is currently in use...it will probably be more of a GPU cluster on-die with direct access to "graphics-tagged" threads from the cache (this is a guess, as the ISA hasn't yet been developed).

4. As cheap as an integrated graphics chip is, building it into the CPU is FAR cheaper (and as I said, uses much less power).

5. The only solutions for delivering a CPU-based GPU (and remember that Intel is going this way as well, so there's probably a very good reason for it!) that AMD had were either ATI or nVidia...and nVidia was much more expensive. Going with a company like S3 (which doesn't have any experience in higher end graphics and is owned by Via already), or developing a solution from scratch, would take years longer and end up being far more expensive due to the delays, higher development costs, and loss of sales...
This has turned into a good discussion...

As for the nay-sayers (to my argument): how on earth can you justify an acquisition that brought two profitable companies into the red (over $100 million lost per quarter)? The goal of AMD, or any other company, is to make money. The merger has brought them just the opposite.

Now, for Viditor:

1. Matrox and Via (who now owns S3). There's also Trident, although their technology is primitive (for GPUs, anyway). I do believe there is one other company that tried a few times to make a high-end GPU (I forget the name).

2. My guess is that if AMD had bought Matrox, Matrox could have helped them make an IGP chipset quite easily.

3. I won't argue that point, but other companies aside from ATI could have helped them make one.

4. See point #3.

5. Yes it might have taken longer, but at least they would be in the game with cash in hand. There's also the fact that Fusion has taken ages to produce; really, how much longer would it have taken with Matrox?

Also, I fail to see why they can't just have a multi-die package similar to what Intel does with their quad-core CPUs. Can they not simply 'glue' a GPU to a CPU somehow? AMD seems to be going for these elegant engineering solutions that come out late and slow (Barcelona and R600).

If I owned AMD stock, I would give them until Fusion comes out and then pull the plug if it doesn't compete well with whatever Intel has out at the time.
 

Viditor

Diamond Member
Oct 25, 1999
I think I see the problem here...you are vastly underestimating the time it takes to develop a CPU.
From concept to shipping, the average time for CPU development (this is for AMD and Intel) is 5 years...and that's a normal CPU design.
Fusion began in 2006 supposedly, and is being released at the end of 2008 or beginning of 2009. That means that either AMD is years ahead of schedule, or they started in early 2004...
A good example of how this delay affects a chip is Itanium. Itanium began its life as a concept at HP in 1989, and the chip began its co-development with Intel in 1994...it wasn't released until 2001. If Itanium had been released on the same schedule that Fusion is supposed to be, we would never have heard of the Opteron and all chips would now be IA-64. So, instead of the $38 billion in sales that Intel originally projected for Itanium back in 1999, actual sales have been closer to $2.5 billion (which is less than the development costs, so it is still at a fairly large loss).

You are also vastly overestimating the abilities of the engineers at Via and Matrox...

Don't you think that if the engineers at Matrox had the ability to develop a chipset-based graphics solution, they would have done so already? AMD has had a program in place for 5 years now whereby they will send their own engineers to any company developing for them, so it hasn't been for lack of support on the CPU side. In addition, Matrox hasn't made a high-end solution in many years...they have found a niche with media display devices (multi-headed displays), but that's about it. The engineers they used to have when they made more competitive products have long since left the company.

Via (and S3) has never made high-end solutions, and even so, they would have cost about the same as ATI did...
 

heyheybooboo

Diamond Member
Jun 29, 2007
Originally posted by: SickBeast
...As for the nay-sayers (to my argument): how on earth can you justify an acquisition that brought two profitable companies into the red (over $100 million lost per quarter)? The goal of AMD, or any other company, is to make money. The merger has brought them just the opposite...

Also, I fail to see why they can't just have a multi-die package similar to what Intel does with their quad-core CPUs. Can they not simply 'glue' a GPU to a CPU somehow? AMD seems to be going for these elegant engineering solutions that come out late and slow (Barcelona and R600)...

A "multi-die package" runs counter to AMD's Torrenza open-socket system of specialized independent co-processors and application accelerators, to HyperTransport, and to many current and future aspects of AMD architecture. Integrating the IMC alone would be a monumental engineering undertaking...

AMD has never really been profitable. They cycle between periods of high cash flow and heavy debt. That AMD has survived over the years against Chipzilla is actually quite impressive. Intel spends more on branding & advertising than AMD spends on R&D - though AMD has doubled that expense over the last 3 years.

For the benefit of us all may AMD keep on truckin' . . . . :)
 

Regs

Lifer
Aug 9, 2002
Originally posted by: Viditor
I think I see the problem here...you are vastly underestimating the time it takes to develop a CPU.
From concept to shipping, the average time for CPU development (this is for AMD and Intel) is 5 years...and that's a normal CPU design.

I really think you're giving AMD too much credit by saying they invested 5 years into the Barc.

The Barc's development was more like one year; it was the offspring of an existing product already in development. The 65nm quad-core was the base design.

I can simply say that I've spent 10 years trying to develop the wheel and by year 8 I somehow ended up with a bowling ball.
 

Viditor

Diamond Member
Oct 25, 1999
Originally posted by: Regs
I think I see the problem here...you are vastly underestimating the time it takes to develop a CPU.
From concept to shipping, the average time for CPU development (this is for AMD and Intel) is 5 years...and that's a normal CPU design.

I really think you're giving AMD too much credit by saying they invested 5 years into the Barc.

The Barc's development was more like one year; it was the offspring of an existing product already in development. The 65nm quad-core was the base design.

I can simply say that I've spent 10 years trying to develop the wheel and by year 8 I somehow ended up with a bowling ball.

It may seem that way, but if you think about it, you know that the Barc has been in development for many years...
Remember that Barc is the K10 design; we've known about the K10 design team for at least 3 years, and you can be assured that they were around for quite some time before we first heard of them.
It is certainly possible to accelerate a design so it only takes 3 years or so; Intel did it with the C2D. But to do that, Intel had to raid a number of their other design teams and throw all of their resources behind the C2D.
AMD didn't have Intel's resources, so Barcy was late...