You mention price points but fail to account for them here. Iris Pro is ridiculously expensive compared to even mid-range cards. It's also not available to system builders, only through OEMs; currently it's popular only in select Apple models, which carry a massive premium over non-Iris Pro models.
Fine, I'll spend more than 30 seconds on Google:
http://www.techpowerup.com/reviews/Intel/Core_i7_4770K_Haswell_GPU/9.html
OMG! The performance is almost the same (because it's basically the same thing)! It takes a GT 640, which runs anywhere from $80-100, to be substantially superior (though, unless all you do is Photoshop, you'd be stupid not to get a GTX 750 or GTX 750 Ti instead), and it leaves no performance-wise use for ~$50 cards.
We'll revisit this discussion in 10 years' time, and all the doom-and-gloomers who claim the dGPU is going to be obsolete will have mud on their faces, while I'll be enjoying PC gaming on 8K monitors and iGPUs struggle at 4K with everything on LOW.
Regardless of what things are like in 10 years, I'm not sure who the doom-and-gloomers are. Cost savings from on-package fast memory, ever-increasing efficiency, a lower cost barrier to entry for users, and ever-increasing performance for a given cost over time are hardly a case of doom and gloom.
I'll get whatever I need, for whatever I'm willing to pay, to play what's out in 10 years. Heck, by then I might even decide to stop using tilesets for NetHack, while I'm at it.
P.S. Look at it this way:
Today, CPUs seem to have gone up about $20 from a few years ago, for about the same thing (i.e., i3 vs. i3, i5 vs. i5). Today, to match Haswell, you need about a GT 630; to match a good A10 with fast RAM, a GT 640 (some 640s will still have an edge, but not by much).
Players of RTSes, MMOs, etc., previously had to buy an HD 5670, HD 6670, GT 440, or the like just to get by at low settings.
Those two price differences combined come to about $50 in savings. While Haswell and A10s can technically play some single-player action games, I'm trying to be realistic here. Well, $50 can get you...
From a Pentium to an i3 or A10.
From an i3 to an i5.
From a stock i5 to an overclockable i5 K (just barely, with CPU, HSF, mobo), or stock Xeon E3-123xVy.
From that i5 K or Xeon E3 to a stock i7.
From a bare-bones mobo and 4GB RAM stick to a fast 8GB kit (important for IGP) and mobo with decent IO.
Almost from 8GB RAM to 16GB, should you do more than game and have a use for it.
From a 1TB HDD to a 256GB SSD.
From a 2TB HDD to a 120GB SSD and 1TB HDD.
From a cheap PSU on sale to a nice one not on sale.
From a cheap case to a case you'd fall in love with.
From your old beat up mouse to a new one.
A value headset, if you could use one and don't have one.
...and so on.
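For what it's worth, the savings math above is simple enough to sketch. The dollar figures here are my own ballpark assumptions (rough retail prices from the time), not exact quotes:

```python
# Rough sketch of the savings argument: CPUs cost a bit more now, but
# the low-end card you used to need is no longer necessary.
# All prices are assumed ballpark figures, not authoritative.
cpu_price_increase = 20   # newer CPU costs roughly $20 more than its predecessor
low_end_gpu_price = 70    # HD 6670 / GT 440-class card you no longer have to buy

net_savings = low_end_gpu_price - cpu_price_increase
print(net_savings)  # the ~$50 freed up for the upgrades listed above
```

That leftover ~$50 is what funds each of the step-ups in the list, from a better CPU tier to an SSD to a nicer PSU.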
Better IGP means, for those able to use it, better overall value. These people already play on low, and think people spending $500+ on video cards are wasting their money (I sure would be, but I couldn't put up with what they do, at least not for very long). As for 4K, we'll just have to see: today, no mainstream card has even 1/4 of the power it will take to handle 4K (a GTX 770 can be pretty much saturated at 1080p today), so I doubt 4K will come down from the high end for some time.
Meanwhile, the same tech advances that could enable even faster IGPs, pushing those savings to more like $75 or $100 over the next several years, will largely not affect you or me buying more expensive video cards for quite some time, in the worst case. In the best case, they will mean even better cards for the money. Likely, nV and AMD will stop making the very lowest-end chips and make cheap cards with salvage dies instead (I'm pretty sure today's GT 620 and GT 630 are just that, anyway).

Also, since nV and AMD both know this is the way of the future, they will be planning for it, and have already been shifting their R&D to account for it. By making the integrated and discrete GPUs use the same designs, just lightly tweaked (in nV's case, mobile, PC AIB, and server designs), most of what benefits the next gen's tablet also benefits the next gen's Titan equivalent, since high-end GPUs are very much TDP- and space-limited.
It's not a future vision of graphics going to pot, but of being able to put higher-performance GPUs on the CPU than can be put on a card for the same cost to the buyer, enabled by getting around, relatively cheaply, the 15-plus-year-old bandwidth limitations imposed by off-chip RAM interfaces. The die space to dedicate to it alone will take a shrink or two, only to still not be enough to beat midrange cards. So it's not something that's going to happen overnight, even if the technology is as successful and affordable as alleged. All the companies involved will have plenty of time to scheme and adapt, including AMD finding some way to keep getting cash, for fear of an Intel monopoly.