ATI+AMD confirmed


blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
Ugh...

I think the independence was a good thing. There were always a few ways to have it: combining ATI or nVidia chipsets and GPUs with AMD CPUs, or going with Intel CPUs & chipsets...

Less choice is always a bad thing, and that is the only way I see this going.

Nat

*I doubt nVidia is in huge trouble. It's not like the nForce5 and the current 7xxx line of GPUs are just going to dry up and go away. If they slow down, two years down the road might be trouble, but hasn't that always been the case?
 

jlbenedict

Banned
Jul 10, 2005
3,724
0
0
This is crazy... honestly... there are so many scenarios that could happen in the future now as a result of this..

1) ATI's high-end GPU sector is prioritized less; Nvidia now becomes what Creative Labs is in the audio sector = high prices/monopoly for high-end GPUs

2) AMD has their previous motherboard chipset background; that, along with ATI's latest entry into the chipset wars, now combines with the R&D of AMD; AMD says "fvck it.. we'll produce a majority of our own chipsets now", resulting in AMD competing directly head to head with Intel; Nvidia gets the shaft and is on their own

3) As a result of the above #2, Nvidia now merges with some other company.. who knows who though...

4) This merger seals the fate of VIA. I mean.. where have they been? The K8T900 was "verbally" released, but where is the damn product?

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
VIA are where they belong - in sub-$600 (Australian) machines with soldered-on CPUs. Silly things can barely run a web browser...
 

Vinnybcfc

Senior member
Nov 9, 2005
216
0
0
Originally posted by: Gstanfor
VIA are where they belong - in sub-$600 (Australian) machines with soldered-on CPUs. Silly things can barely run a web browser...

QFT. Would not touch a VIA chipset ever again.
 

Sc4freak

Guest
Oct 22, 2004
953
0
0
I'm going to go cry in a corner. PC Gaming = doomed. PC Video card market = doomed.

Nvidia's all that's left. And that ain't good for consumers. No competition = we are doomed.
 

jlbenedict

Banned
Jul 10, 2005
3,724
0
0
Originally posted by: Sc4freak
I'm going to go cry in a corner. PC Gaming = doomed. PC Video card market = doomed.

Nvidia's all that's left. And that ain't good for consumers. No competition = we are doomed.
I got this email back from ATI public relations:

"Hi Joseph,

Our high-end GPU business will be an important part of our growth strategy. We are committed to the enthusiast and want to win in all segments. Stay tuned for more great products.

Dave



--------------------------------------------------------------------------------
From: Joseph
Sent: Monday, July 24, 2006 10:21 AM
To: Dave Erskine (derskine@ati.com); public.relations@amd.com
Subject: Consumer Inquiry On AMD/ATI Merger


Good morning. I'm writing on behalf of myself, as a computer enthusiast and consumer. I'd like to know how this will affect ATI's high-end GPU product line. Will the high-end products take less priority? This merger is exciting news that is positive for the outlook of the computing industry, but I also get the feeling that there could be negative results, mainly for GPU technology. It seems this positions Nvidia to monopolize the high-end sector of the graphics market, and that is not a good outcome for me and other consumers.

If you are able to comment and answer my questions and concerns I'd greatly appreciate it.

Thanks in advance.

Joseph
 

Greenman

Lifer
Oct 15, 1999
22,236
6,431
136
Originally posted by: Sc4freak
I'm going to go cry in a corner. PC Gaming = doomed. PC Video card market = doomed.

Nvidia's all that's left. And that ain't good for consumers. No competition = we are doomed.

All it means is that the high-end Nvidia cards will cost more, maybe a lot more.
I'm wondering what's going to happen to all of Nvidia's paid shills. Will they still be talking about how great Nvidia is when the free hardware stops coming? Nvidia won't need them if ATI drops out of the high end. I can't wait to see what Crusader has to say when he has to go out and buy new hardware rather than get it free.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
it ain't all doom and gloom :p

withOUT the merger AMD would be doomed - gone in a few years . . . and that would leave just Intel, ATi and Nvidia . . . I think I prefer to have AMD around and I don't think they will EVER let nVidia "have" the performance crown. ;)
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: jlbenedict
I got this email back from ATI public relations:

"Hi Joseph,

Our high-end GPU business will be an important part of our growth strategy. We are committed to the enthusiast and want to win in all segments. Stay tuned for more great products.

Dave
Until "Dave" gets an Email from AMD corporate that says "he's no longer needed as his job is already being done in California by an AMD employee". :thumbsdown:

I agree that this is probably a good thing for AMD. It is a bad thing for ATI, though. Most of their non-engineering employees will be laid off, AMD will probably take all of the high-end video card engineers and have them work on processors, killing the ATI high-end market, and we as consumers will end up getting screwed (again).

 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Originally posted by: apoppin
withOUT the merger AMD would be doomed - gone in a few years . . . and that would leave just Intel, ATi and Nvidia . . . I think I prefer to have AMD around and I don't think they will EVER let nVidia "have" the performance crown

I think AMD would have survived on their own, like they've done for many years; however, I think in the long term this will make the company stronger, and they'll benefit from the merger.
 

jlbenedict

Banned
Jul 10, 2005
3,724
0
0
Originally posted by: Wreckage
Originally posted by: jlbenedict
I got this email back from ATI public relations:

"Hi Joseph,

Our high-end GPU business will be an important part of our growth strategy. We are committed to the enthusiast and want to win in all segments. Stay tuned for more great products.

Dave
Until "Dave" gets an Email from AMD corporate that says "he's no longer needed as his job is already being done in California by an AMD employee". :thumbsdown:

I agree that this is probably a good thing for AMD. It is a bad thing for ATI, though. Most of their non-engineering employees will be laid off, AMD will probably take all of the high-end video card engineers and have them work on processors, killing the ATI high-end market, and we as consumers will end up getting screwed (again).

yeah.. it's funny.. of course "Dave" is going to say that they are committed to the high end.. yeah.. whatever..
I'm waiting on a reply from AMD. I sent the same email to both public relations offices. Should be interesting. :)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Mem
withOUT the merger AMD would be doomed - gone in a few years . . . and that would leave just Intel, ATi and Nvidia . . . I think I prefer to have AMD around and I don't think they will EVER let nVidia "have" the performance crown

I think AMD would have survived on their own, like they've done for many years; however, I think in the long term this will make the company stronger, and they'll benefit from the merger.

well, read this . . . for ONCE [or twice] I agree with theInq [more-or-less]:
AMD has to buy ATI to survive - Analysis: And it doesn't spell doom for Nvidia

it's kinda long so I won't quote it . . . yet. :p

 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
The AMD and ATI merger is all about technologies and integration:

http://www.dailytech.com/article.aspx?newsid=3471&ref=y
Specifically, it appears as though AMD and ATI are planning unified, scalable platforms using a mixture of AMD CPUs, ATI chipsets and ATI GPUs. This sort of multi-GPU, multi-CPU architecture is extremely reminiscent of AMD's Torrenza technology announced this past June, which allows low-latency communications between chipset, CPU and main memory. The premise for Torrenza is to open the channel for embedded chipset development from 3rd party companies. AMD said the technology is an open architecture, allowing what it called "accelerators" to be plugged into the system to perform special duties, similar to the way we have a dedicated GPU for graphics.

Furthermore, AMD President Dirk Meyer also confirmed that in addition to multi-processor platforms, that "as we look towards ever finer manufacturing geometries we see opportunity to integrate CPU and GPU onto the same [die]." However, Meyer also went on to clarify that this sort of technology might be limited to specific customers first. A clever DailyTech reader recently pointed out that AMD just recently filed its first graphics-oriented patent a few weeks ago. The patent, titled "CPU and graphics unit with shared cache" seems to indicate that these pet projects at AMD are a little more than just pipe dreams.

During the AMD/ATI merger conference call, ATI CEO David Orton furthermore added that not too long ago, floating point processing was done on a separate piece of silicon. Meyer and Orton both claimed that the trend for the FPU integration into the CPU may not be too different than the evolution of the GPU into the CPU.
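
As a concrete illustration of that shared-cache point, here's a minimal C sketch (purely illustrative; gpu_kernel, run_discrete and run_shared are invented names, not any real AMD or ATI API). With a discrete card the CPU has to copy the buffer across PCIe both ways; with a shared cache/address space the "GPU" works on the CPU's data in place:

[code]
/* Illustrative only: models why "CPU and graphics unit with shared
 * cache" matters. Both paths run on the host here so the sketch stays
 * self-contained; no real GPU API is used. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define N (1 << 20)

/* stand-in for a GPU kernel: scale a vector in place */
static void gpu_kernel(float *data, size_t n) {
    for (size_t i = 0; i < n; i++)
        data[i] *= 2.0f;
}

/* discrete-GPU path: copy host -> "VRAM", compute, copy back */
static void run_discrete(float *host, size_t n) {
    float *vram = malloc(n * sizeof *vram);  /* simulated device memory */
    memcpy(vram, host, n * sizeof *vram);    /* upload over "PCIe" */
    gpu_kernel(vram, n);
    memcpy(host, vram, n * sizeof *vram);    /* download result */
    free(vram);
}

/* shared-cache path: the GPU sees the same cached memory as the CPU,
 * so it computes on the caller's buffer directly -- zero copies */
static void run_shared(float *host, size_t n) {
    gpu_kernel(host, n);
}

int main(void) {
    float *a = malloc(N * sizeof *a);
    for (size_t i = 0; i < N; i++) a[i] = 1.0f;
    run_discrete(a, N);  /* two big copies plus the work */
    run_shared(a, N);    /* just the work */
    printf("a[0] = %.1f (expect 4.0)\n", a[0]);
    free(a);
    return 0;
}
[/code]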


http://www.theinquirer.net/default.aspx?article=33219

On the future x86 microarchitecture direction:
Let's look at this long term, say five or so years, the design cycle of a modern CPU. As we've noted earlier, the X86 CPU is about to take a radical turn, and the designs you will see at the turn of the decade won't resemble anything you see now. What do we mean by that? Mini-cores and Larrabee.

Until Sun came out with Niagara, modern CPUs were big, fast, hot out of order execution (OoO) beasts that ran a thread as fast as possible. Programmers were stupid creatures that had to have their work done for them in hardware, and elegance was the domain of game developers of yore. Fat cores were in.

Then came Sun with a hard left turn, lots of little, stupid cores that can do more in aggregate than a single big core. It had been tried in the past, but not with a modern ISA for mainstream use. If your application fit the bill, in Sun's case, this meant no FP code more than anything else, it simply flew. If it did not fit, well, you had problems. Can we offer you one of our other more conventional products?

The first salvo in the modern mini-core wars was fired, and the world changed. Now, Sun is on the verge of releasing Niagara II, and Niagara III is sure to follow. Intel was not about to let this winning strategy go unchallenged, and now has enough mini-core projects going to fill a phone book.

Kevet and Keifer were a mini-core and a CPU made of 32 of those cores respectively aimed at server workloads. It was four times what Niagara was reaching for, but also five years later. Intel is going for the swarm of CPUs on a slab approach to high performance CPUs, and more importantly, is going to upgrade the chips on a much swifter cycle than we've been used to.

With 32 small and simple cores, you can design each core much more quickly than a normal CPU, much more quickly. Design complexity, verification and other headaches make things almost a geometrically increasing design problem. A small core cut and pasted 32 times can mean smaller teams doing more real work instead of busy work, and more teams tweaking things for niches.

We think Intel is aiming at a much more rapid design upgrade cycle, most likely yearly, and much more niche-aimed CPUs. If you can make a new core with 1/10th the effort, and put it in an already existing and verified infrastructure/interconnect, then you can revamp your line up with a rapidity that would be flat out impossible to do today.

Now, if you add in GPU functionality to the cores, not a GPU on the die, but integrated into the x86 pipeline, you have something that can, on a command, eat a GPU for lunch. A very smart game developer told me that with one quarter of the raw power, a CPU can do the same real work as a GPU due to a variety of effects, memory scatter-gather being near the top of that list. The take home message is that a GPU is the king of graphics in today's world, but with the hard left turn Sun and Intel are taking, it will be the third nipple of the chip industry in no time.

Basically, GPUs are a dead end, and Intel is going to ram that home very soon. AMD knows this, ATI knows this, and most likely Nvidia knows this. AMD has to compete; if it doesn't, Intel will leave it in the dust, and the company will die. AMD can develop the talent internally to make that GPU functionality, hunt down all the patents, licensing, and all the minutia, and still start out a year behind Intel. That is if all goes perfectly, and the projects are started tomorrow.

The other option is to buy a team of engineers that produce world-class products, are battle tested, and have a track record of producing product on the same yearly beat Intel is aiming for. There are two of these in existence, ATI and Nvidia. Nvidia is too expensive, and has a culture that would mix with AMD like sand and Vaseline. That leaves ATI, undervalued and just as good.

So build versus buy for long term strategic competitiveness, the choice is obvious, you have to buy. This will put AMD about 12-18 months behind the first of the mini-cores from Intel, about the range AMD is behind for everything else. Intel bites the bullet and proves the market, then AMD steps in. Here, AMD is going to let Intel do the heavy lifting, and then waltz in at the right time.

Long term, buying ATI is the only thing AMD can do to survive. It will bring some short term pain, and Wall Street will simply not have a clue once again, but there is no doubt that it is a necessary thing.

The more interesting time is mid-term, in the year to three year range. ATI has two sets of deep engineering knowledge that AMD can suck in and benefit from, memory controllers and PCIe. AMD is integrating both into the CPU, so ATI engineering can help greatly there. On the flip side, AMD has world-class manufacturing facilities that ATI can make GPUs and chipsets on without paying an arm and a leg to use. This is a win/win.
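
To make the mini-core argument above concrete, here's a toy pthreads sketch of the aggregate-throughput model the article describes: many simple workers grinding through independent integer chunks (no FP, cf. Niagara). Thread count and workload are made up purely for illustration:

[code]
/* Toy model of the mini-core idea: aggregate throughput from many
 * simple workers on independent integer work. Illustration only. */
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>

#define NTHREADS 8
#define WORK_PER_THREAD 1000000ULL

typedef struct { uint64_t sum, lo, hi; } job_t;

/* each "mini-core": simple ALU work over its own slice */
static void *worker(void *arg) {
    job_t *j = arg;
    uint64_t s = 0;
    for (uint64_t i = j->lo; i < j->hi; i++)
        s += i * i;
    j->sum = s;
    return NULL;
}

int main(void) {
    pthread_t tid[NTHREADS];
    job_t jobs[NTHREADS];
    uint64_t total = 0;

    /* fan the work out across the "cores" */
    for (int t = 0; t < NTHREADS; t++) {
        jobs[t].lo = (uint64_t)t * WORK_PER_THREAD;
        jobs[t].hi = jobs[t].lo + WORK_PER_THREAD;
        pthread_create(&tid[t], NULL, worker, &jobs[t]);
    }
    /* collect: the aggregate is what matters, not single-thread speed */
    for (int t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += jobs[t].sum;
    }
    printf("total = %llu\n", (unsigned long long)total);
    return 0;
}
[/code]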


And in AMD's long term relationship with NV:
That brings us to the AMD and Nvidia relationship. Everyone thinks for some reason that this is a stab in the eye at its close partner Nvidia, and it will mean doom for both sides if they are forced to break things off. Two problems, they won't have to break things off and it could be good for Nvidia.

If AMD plays its cards right, and it has shown that it is quite savvy in corporate dealings of late, it can turn this into a win/win long term. The reason for that is this deal has nothing to do with GPUs or chipsets at all, it is about technology and engineers. Think long term.

That brings us to the long term side of things, and here all is good for AMD. In fact, if this deal does not happen, AMD will be out of business in five years, or at least out of the CPU business. In that time frame, this is good for AMD, good for Nvidia, and good for all the AMD partners. It kind of stinks for Intel though, but that is a lot of what AMD was aiming for.

... ...

That brings us back to Nvidia, and why I think it is not a death-blow for Jen-Hsun's company. If AMD is smart, it can placate Nvidia by ceding markets to it, and keeping the platform open. We think it will. AMD has never been one to shut out partners as a blatant money grab like Intel. It sees things long term, and if you look at what it is doing with the Torrenza platform, you can see it has no intention to close the market.

Nvidia can have a big piece of the new pie in the future, things are not bleak at all. Short term heart attacks aside, the technology it brings to the table is still in great demand. Remember, this deal is not about AMD making GPUs or chipsets, it is about it making a completely different set of cores. GPUs and chipsets will still be in great demand, it is just that ATI won't be making them.

The net effect is good for ATI, good for AMD, and good for everyone else, including all the current AMD partners.

The "piece of new pie" for NV from AMD, will undoubtedly be Torrenza integration into future AMD platforms, which will be extended to its partners, as well as any successor to Torrenza. The same will go for other strategic partners, such as Broadcom, Sun, DRC, Xilinx, (and possibly Ageia in the future, if they choose to strike a deal with AMD), etc, etc.

 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Originally posted by: Gstanfor
Originally posted by: gersson
NO!!!!!!!!!!!

PC gaming is doomed! Crysis, Spore and UT07 will be the last of the great PC games
:(

Care to explain why?

lol, the only reason you are asking is probably because you think I'm insinuating that nvidia sucks
:roll:

Because Nvidia will be the sole high-end GPU maker. Nvidia will obviously take advantage of the monopoly situation. Fewer people will pay those high prices; it will drive away gamers, who will see better graphics and games on Xbox 360 and PS3. Fewer people buying means fewer games on PC.

Vicious cycle. BTW, if my response angers someone, then clearly you give more credit to my conjecture than even I do :p
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
LOL! I was just interested in your opinion. I can remember a time, between roughly the GF1 and GF4, when nvidia was effectively a monopoly (none of the "competition" did much in the way of actually competing anyway...) and that era was responsible for some great 3D games; in fact, the 3D scene pretty much exploded during that timeframe.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
LOL! I was just interested in your opinion. I can remember a time, between roughly the GF1 and GF4, when nvidia was effectively a monopoly (none of the "competition" did much in the way of actually competing anyway...) and that era was responsible for some great 3D games; in fact, the 3D scene pretty much exploded during that timeframe.

you mean during the Rage-to-Radeon 8500 timeframe?

ATi existed as a worthy 'alternative' to nvidia's faster GPUs... since the Rage Fury 32... they always had better IQ - at least through the R8500 era.


and don't worry about PC gaming . . . it doesn't matter who is making GPUs . . . it has enough problems surviving competition from the Consoles. :p

:D

 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Originally posted by: apoppin
well, read this . . . for ONCE [or twice] I agree with theInq [more-or-less]:

I still stand by my post; you are entitled to your post/remark, so am I ;). Bottom line, I think it is a good thing regardless.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Mem
well, read this . . . for ONCE [or twice] I agree with theInq [more-or-less]:

I still stand by my post; you are entitled to your post/remark, so am I ;). Bottom line, I think it is a good thing regardless.

I wasn't really disagreeing with you . . . just expanding on the subject . . . I think AMD would also "survive" . . . just not as well as with the merger. ;)

that was a damn good deal for AMD.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
I think AMD would have been fine without the merger. It's going to take some time to get new ATI/AMD products out, so there shouldn't be too much of an immediate advantage. However, they will complement each other nicely on the integrated front and with Torrenza. I think K8L would have kept AMD competitive with Intel regardless of this merger, but every little bit helps.

I'm just disappointed that we're only going to have one high-end GPU maker after a short while. I can't really think of anyone that will step it up. I doubt Intel is going to try to wedge their foot in the door, but they could if they wanted. XGI sold off its graphics division quite a while ago, so that counts them out. Matrox probably doesn't have the resources to take on the high-end 3D graphics realm again. That really only leaves S3, unless a new company just pops up out of nowhere.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
why do you think ATi will stop competing with nVidia?
:Q

the high end will give a good boost to both AMD and ATi... being able to say they have the performance crown in *everything*.

at least I hope their marketing improves
 

RajunCajun

Senior member
Nov 30, 2000
213
0
0
Time will tell how this plays out, but I think it's safe to say that add-in GPU buyers will have little or no choice as to what to buy in the future unless another major player pops up.

One thing not talked about here - what will happen to ATI driver revisions/updates in the future? For everyone's sake, I would hope that driver technology will continue to evolve for older-generation cards to support newer features as best they can. But I wouldn't hold my breath over it.

Also, this has got to impact the designers of future games. I'm sure there are people right now agonizing over this situation.

Loss of competition will hurt EVERYBODY in the long run.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Hard Ball
The AMD and ATI merger is all about technologies and integration:
<snip>

Interesting. Talk about unified!! ATI went from planning unified shaders to unified......CGPUs!!

Also, it is interesting that Nvidia is still kept in the game because AMD wants to keep them in it. If AMD turned them away and pushed them toward Intel, who they have been competing with in chipset manufacturing, I don't think that would be pretty either. This whole thing might turn out to be great in the long run, but in the immediate future I don't see the positives of the newly limited sector.