It is funny how you want to keep this civil when the thread you started is a troll thread. I will try to break down your arguments one by one, in a civil way.
First, it is true that the Cypress series from AMD is a success, but there is no guarantee for the 6xxx series. Look at Nvidia: after the success of the G92 core, they rolled out the entire 2xx series successfully. It was at least as good as ATI's 4xxx series at the time, and they could have simply recycled the design just as ATI did with their 4xxx series, but they didn't. The Fermi series is more or less a complete redesign in terms of architecture. The first product isn't perfect, but that is to be expected. The GTX 460 clearly demonstrated that the Fermi architecture works on multiple fronts, despite what people said: "It isn't for gaming."
So now Nvidia has the first generation of their new architecture out, plus all sorts of CUDA, 3D, and PhysX features to back it up. What does ATI have? Are they going to recycle the 4xxx design again and use 28 nm this time? Sooner or later ATI needs to make a new design, and it will take a hit just like Fermi did. You can't expect something new to work without problems. Here is the catch, though: if they take a hit like the one Nvidia took, they may not be able to recover the way Nvidia did.
How did Nvidia get back? Well, with all that proprietary stuff. Those who have Nvidia 3D Vision probably won't even try an ATI card unless it offers something like 150% of the performance at 50% of the price. Those who are utilizing the power of CUDA will not switch until ATI comes up with something that suits their needs. Yes, some can hack their way through and use their old Nvidia card as a dedicated PhysX card, which many have done, but Nvidia's new drivers won't support such a setup, so it is only a matter of time before they sell their ATI card for an Nvidia card. Why do I say that? Well, they could have sold their old Nvidia card in the first place and ditched PhysX, but they didn't. An outdated Nvidia card can serve as a PhysX card. What can an outdated ATI card do?
The one thing that led to the success of the Cypress series wasn't what ATI did right, but the number of screw-ups Nvidia made. Nvidia ceased production of the GTX 285 way too early. The reason for this may have been their confidence in the Fermi architecture; they were ready to mass-produce it the moment it was done. Meanwhile, my beloved Charlie was busy generating FUD about how Fermi was unmanufacturable, too big, and too hot, while Nvidia was busy preparing their big hit. All was supposed to go well until TSMC delivered the message to both Nvidia and ATI about the manufacturing problems with their 40 nm yields. While ATI could modify their design with ease, since it really wasn't new, Nvidia was more or less screwed. They couldn't go all out with Fermi, and a redesign would take too long now that the production line had stopped. They had no choice but to go forward with what they had, just not at full force. Rumors said that Fermi would be in very limited supply due to yields; busted. Charlie's FUD about it being unmanufacturable: busted. Rumors about no tessellation: also busted. It was, however, power hungry, as it isn't 100% efficient and generates too much heat.
Usually this wouldn't have been as serious as it was, but it happened right when Windows 7 and DirectX 11 first arrived. This was a golden time when consumers wanted to buy new hardware, and Nvidia had nothing to sell between September 2009 and March 2010. Meanwhile, ATI's Cypress was the only option for a new video card, which eventually led to such unexpected sales that ATI had to raise prices to smooth out the demand. All the OEMs were asking for goods, and Nvidia really had nothing: the new product wasn't ready and the old one had been discontinued. Just when things were bad enough on that front, their other product, Tegra 2, was found to have a problem all the way down at the design level. Many new products (tablets) never made it to market in time, which benefited the iPad, and therefore the iPhone.
As if things weren't bad enough, some smart programmers managed to make certain aftermarket heatsink fans stop spinning under a new Nvidia driver. They knew SC2 is a demanding game that makes video cards run very hot. The intention was to adjust the fan speed (spin it faster), but somehow some aftermarket heatsink fans decided not to follow instructions/specifications. Now we know that SC2 kills video cards, but at the time people believed the driver was the sole cause of the dying cards. GG!
As you can see, Nvidia's wounds were inflicted not by ATI, but by TSMC and Nvidia themselves. Of course, public attention focused only on the 480, and people pinned the failure of everything on the 480, as if it were the only thing Nvidia ever made. Many believed it was too big and too hot. The truth is, the 480 is not bad, and heat isn't a problem. I was expecting to find lots of threads about dead GTX 480s in the SC2 forum, or at least a few. To my surprise, no dead 480s! Plenty of G92-based cards were fried, along with some ATI 4xxx and Cypress cards, but no 480s! Too hot? I don't think so. The perception of it running hot, however, stuck.
So even with the six months of glory ATI had, people are slowly trading their ATI cards for Nvidia cards, even when it is a side-grade or a downgrade. Why? Well, there are things you can only do with an Nvidia card. The FUD about the 480/470 will keep people away from those, which allows the new 460 to sell. Interestingly, the 460 is indeed a better design, with lower cost and higher yield. That leaves the 460 as the only card they should make, so they are mass-producing it and are therefore able to reduce the BoM cost. Eventually a 485/475 will replace the 480/470, and a dual-GPU card is not far away.
It is true that most Cypress users aren't going to drop their cards now because there is no reason to, but that is going to change really soon. Lots of 3D movies are coming out, and it won't be long before 3D home theater kicks in. Lots of laptop manufacturers see this and are now using Nvidia chipsets for Nvidia 3D. Where is AMD's version? I hope the 6xxx series will retrofit 3D support, or else the market will swing back to Nvidia as soon as the Avatar 3D box set arrives. IMO, ditching the ATI name is not a smart thing for AMD to do at this time.
Programmers are not gods. Think of them as smiths who craft software, with video cards and processing units as the wood and nails. Without the proper tools, the quality of that wood and those nails is meaningless. The thing about CUDA is not the quality of the hardware, but the tools that drive it. The fact is, although the 2xx architecture supports CUDA, the new Fermi architecture allows much better utilization when it comes to CUDA computing; think of it as making nails that go into the wood more easily. This is an improvement in quality in an area where ATI hasn't even begun. CUDA is based on C/C++, which is something every programmer knows, and Nvidia has provided a CUDA development suite that programmers can forge software with, while using "open whatever" and/or ATI's Stream is like trying to build a table with your bare hands. Someday someone will invent the tools that can utilize them, but none have been invented yet. Even if AMD starts now, they are still years behind.
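To make the "tools" point concrete, here is a minimal sketch of what CUDA code looks like. The kernel, variable names, and sizes are my own invention for illustration; the runtime calls (cudaMalloc, cudaMemcpy, cudaFree) and the <<<...>>> launch syntax are the standard CUDA API. Notice that it is ordinary C/C++ plus a few extensions, which is exactly why the barrier to entry is so low:

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Kernel: each GPU thread adds one pair of elements. Plain C/C++
    // plus the __global__ qualifier and the built-in thread indices.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1024;                  // made-up problem size
        const size_t bytes = n * sizeof(float);

        // Host-side buffers
        float* ha = (float*)malloc(bytes);
        float* hb = (float*)malloc(bytes);
        float* hc = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = (float)i; hb[i] = 2.0f * i; }

        // Device-side buffers
        float *da, *db, *dc;
        cudaMalloc((void**)&da, bytes);
        cudaMalloc((void**)&db, bytes);
        cudaMalloc((void**)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements
        vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

        printf("c[10] = %.1f\n", hc[10]);    // 10 + 20 = 30.0

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

Compile it with nvcc and it runs; that is the smith's tool in action. Doing roughly the same thing with ATI's Stream at the time meant dropping down to low-level IL or wrestling with immature toolchains, which is the bare-hands table-making I mean.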
Video games are important; more important than food, some may believe. They also believe that the video card that pushes the most FPS is going to have the last laugh, and I agree. The question is, how do they get there? AMD creates quality wood and nails and hopes someone will figure out how to utilize them. Nvidia creates quality wood and nails, plus the tools to use them. Who will have the last laugh?
Forget about theorycrafting; look at what we have now. Nvidia users were the first to experience GPU-accelerated Adobe Flash. GPU-accelerated video editing, likewise, only benefits Nvidia users. When will ATI users get a taste of what the GPU can do for them beyond playing games?
Will the 6xxx series change all this? Will Fusion change all this? We will see.