To your "Secondly", AMD's APU are cheaper, use around the same power(currently) and perform way better in games at same and above resolutions and more compatible with DirectX/OpenGL versions, and the great majority of customers won't upgrade unless a few years passed.
See, this is the problem; people carry out these analyses under the assumption that a significant number of PC users not only play games, but care about performance to just the right degree that an Intel iGPU is inadequate for their needs, yet a discrete GPU (of the mobile or desktop variety) is overkill. You can't start out with an assumption like that and then use it to prove that AMD's products are better than Intel's for most people.
To your "thirdly", most don't until they try to play a game and realise it just wont run at all because of lack of the feature to run, mainly DirectX/OpenGL version or because the framerate is a horrible slideshow, literally.
I think your impressions of Intel's iGPUs and graphics drivers are several years out of date. Not that I can entirely blame you, since it's something that even most Intel enthusiasts don't care about, but the last time Intel was at a major disadvantage in terms of DirectX feature set was back with Sandy Bridge, which was a DX generation behind Llano. Since Ivy Bridge, Intel's iGPUs have had feature sets comparable with AMD's. And Intel's drivers, while not as good as AMD's or nVidia's, are far from the broken, bug-filled mess they were back in the 2000s.
And I know Intel has the majority share in graphics; that's the point of the topic.
Intel also had the majority share of the graphics market back in the days when their Extreme Graphics chipset wouldn't run half of the games out there, and the ones that did work could barely manage 20fps with nearest-neighbour filtering at 640x480, while the nForce 2 could run them silky-smooth with anisotropic filtering at 1024x768. That should tell you how many users out there just don't give a damn about PC gaming, and never have.
The topic is about why AMD's APUs failed. This isn't about high-end users or even mid-range users; I have a dGPU in my system, after all. An AMD APU can do something more important than what an Intel APU does: it lets the user play a game if they'd like to, and most users do.
Most users like playing the type of games at the sort of settings that would render an Intel iGPU inadequate? Not in my experience.
Is installing a program 5-50% faster more important to 80% of users than playing any game 2-3 times faster and with better quality?
That might be a valid argument if the products we were comparing were Sandy Bridge and Llano. With Haswell and Kaveri, however, it's more like a 30-40% difference with largely equivalent image quality. And considering how many PCs sold nowadays are notebooks with crappy 1366x768 screens, Haswell's iGPU is pretty much adequate for those low standards.
Most people don't install that many large programs; most people will play some form of video game. Faster and better video, faster and better gaming.
Some form of video game, yes. Unfortunately for AMD, they're not the type of game that will significantly benefit from a better iGPU. They're Flash games which, thanks to Adobe's genius programming skills, will benefit more than anything from the strong single-thread performance of Intel's lineup (though in practice, they will perform about the same on both companies' chips).
AMD had the upper hand in that for years, and it still failed in the market. It was not because of a bad product; it was because of bad marketing, and Intel being a genius at marketing.
That's half correct. AMD didn't fail because they had an entirely bad product per se... they failed because their product, for most people's purposes, wasn't appreciably better than Intel's, while having several major disadvantages: weaker single-thread performance, higher power consumption, and no direct upgrade path to the Phenom II or FX line. Intel's marketing and better OEM deals just helped further seal the deal.