Originally posted by: Killrose
What do I want from AMD? Basically some sort of performance parity.
It wasn't very long ago that an AMD/ATI combo would destroy anything else on the market. Now you put them together and get second-best performance on both counts (and really there are only two major companies making each type of product anyway).
It's almost as though they don't care anymore. They miss deadline after deadline with products that don't compete well anyway. Their PR/marketing departments repeatedly lie.
Really, AMD is a company with enormous potential. I was pretty excited when they purchased ATI, and I thought they would be a force to be reckoned with. So much for that idea...
Originally posted by: SickBeast
If AMD sold off ATI, they would be gone in two years. They "needed" to buy ATI. They may have learned that Intel was going to put CPU/GPU/IMC all in one chip and needed to act. Or they may have been ahead of everyone else and anticipated that this is where the market would go.
Originally posted by: keysplayr2003
They can't sell their ATI division (if that's what it's called). Nehalem, let alone Larrabee, will have integrated graphics, as per Intel's IDF four-core Nehalem preview.
AMD has substantial growing pains to deal with. They obviously need more time to iron things out.
And lastly, what the heck is Nvidia going to do when all this comes to town? Of course they will have a customer base for their discrete GPUs, but for how long? I know it will take years, but discrete graphics might actually go away, depending on how "Fusion-like" and "Larrabee-like" the market becomes. Nvidia needs to follow suit. Anyway, I'm not trying to go OT on ya, just thinking about the repercussions of AMD selling ATI in light of Intel's roadmap, and also thinking about what Nvidia might do.
Originally posted by: Viditor
QFAT (Quoted For Absolute Truth)
And here I was, despairing that nobody else really understood this very simple fact...thanks keys!
AMD didn't spend all of that money on ATI because they wanted to; they knew full well what financial straits lay ahead of them and how hard it was going to hit them in the short term. AMD bought ATI because they absolutely HAD to, or risk being permanently buried in another 3-4 years.
Just as multi-core was the obvious "next thing" (as leakage claimed more and more efficiency from any and all advances), now CPU/GPU is the obvious "next thing" going forward...it enhances design at almost every level and decreases system cost and power. Both Intel and AMD have known this and have been making acquisitions in order to reach that level of execution.
Originally posted by: Arkaign
Your logic is ludicrous. AMD has made chipsets before, and AMD could have made its own video products. AMD could have bought a *much* less expensive graphics company. There were so many alternatives, and still are. *MUST* buy ATI? Please. CPU+GPU in one package is also overrated when a decent video-integrated chipset is dirt-cheap anyway. GPU performance changes too quickly to think that a GPU/CPU combo will be relevant for more than 6-12 months at a time in the BEST scenario.
If AMD is still kicking in 12-18 months, I'd be surprised. It's looking like they will go the way of 3dfx/Cyrix.
Originally posted by: Viditor
Ahhh...the good old "let's use magic instead of science" argument.
1. Please list for me the number of experienced integrated graphics companies that "AMD could have bought"...you can't just wave a magic wand and have a graphics product, nor can you just hire Moe, Larry, and Curly and have them do it for you! There's a very good reason companies like ATI and Nvidia have thousands of engineers on the payroll...
2. Developing the original chipsets cost AMD more than developing the CPUs because they had to hire a whole new division to do it. Their chipset division was still very, very small when they bought ATI, and it didn't develop anything like integrated chipsets (you DO know that there are many different kinds of chipsets, yes?).
3. CPU+GPU allows for a significant reduction in power usage on both mobile and server platforms (systems that don't require high-end graphics), and once a new graphics ISA is developed (which is what the CTM project has been working toward) it should allow for higher-end graphics as well. Note that this graphics solution will look nothing like what is currently in use...it will probably be more of an on-die GPU cluster with direct access to "graphics-tagged" threads from the cache (this is a guess, as the ISA hasn't yet been developed; see the sketch after this list).
4. As cheap as an integrated graphics chip is, building it into the CPU is FAR cheaper (and as I said, uses much less power).
5. The only options AMD had for delivering a CPU-based GPU (and remember that Intel is going this way as well, so there's probably a very good reason for it!) were ATI or nVidia...and nVidia was much more expensive. Buying a company like S3 (which has no experience in higher-end graphics and is already owned by VIA) or developing a solution from scratch would have taken years longer and ended up being far more expensive due to the delays, higher development costs, and loss of sales...
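Since the "graphics-tagged threads" bit in point 3 is pure speculation on my part, here's a rough C sketch of what I'm picturing. Everything in it (the work_tag enum, the dispatch function, and so on) is invented purely for illustration and doesn't reflect any real AMD or CTM API; it just models tagged work being routed to either the scalar cores or an on-die GPU cluster, instead of the OS scheduling everything onto identical cores.

/* Hypothetical sketch only -- these types and functions are invented
 * for illustration and are not a real AMD/CTM API. */
#include <stdio.h>

typedef enum { TAG_CPU, TAG_GFX } work_tag;  /* invented "graphics tag" */

typedef struct {
    work_tag tag;     /* which on-die unit should run this */
    int      payload; /* stand-in for actual work */
} work_item;

/* Stand-ins for the scalar cores and the on-die GPU cluster. */
static void run_on_cpu_core(work_item w)    { printf("CPU core:    item %d\n", w.payload); }
static void run_on_gpu_cluster(work_item w) { printf("GPU cluster: item %d\n", w.payload); }

/* The dispatcher plays the role of the guessed cache/ISA mechanism:
 * it routes each item by its tag rather than letting the OS schedule
 * everything onto identical cores. */
static void dispatch(const work_item *queue, int n)
{
    for (int i = 0; i < n; i++) {
        if (queue[i].tag == TAG_GFX)
            run_on_gpu_cluster(queue[i]);
        else
            run_on_cpu_core(queue[i]);
    }
}

int main(void)
{
    work_item queue[] = {
        { TAG_CPU, 1 },  /* general-purpose work */
        { TAG_GFX, 2 },  /* "graphics-tagged" work */
        { TAG_GFX, 3 },
        { TAG_CPU, 4 },
    };
    dispatch(queue, sizeof queue / sizeof queue[0]);
    return 0;
}

The point of the toy example is just that the tag travels with the work itself, so the hardware (not the OS) could decide where it runs...which is why an on-die cluster with cache visibility would be such a different animal from today's discrete cards.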