the $150-$350 volume sellers. This would encompass everything from the GTX 750 Ti to the GTX 970, and the R7 260X to the R9 290. If Intel reaches that performance level and proliferates it into other chips, it would begin a rapid evisceration of Nvidia's and AMD's dGPU market.
Things like the Titan X, 980 Ti, Fiji, 295X2, and other >$350 cards are tiny-market-share bragging-rights cards, sort of like the Dodge Hellcat vs. a normal Challenger. The only reason the exotic one exists is to sell more of the normal one. If normal Challengers stop selling, you won't see any more Hellcats (or Challengers). Same for these video cards.
This is something I really don't want to see happen. So, I hope Intel's iGPU does in fact suck when put through real benchmarks.
The car analogy doesn't really work. Sure, it's probably true for the Big 3 and their Japanese and Korean counterparts, but companies like Ferrari and Lamborghini don't sell any mainstream cars, only ultra-high-end performance models.
Anyway, it's worth pointing out that (as you note in the first paragraph) the GTX 970 is basically a mainstream GPU; sales are massive by all accounts, far exceeding even Nvidia's expectations. And assuming the Broadwell ~= GTX 750 comparison is roughly accurate, you need to at least triple Broadwell's performance to get close to GTX 970 levels.
And while the GTX 980 Ti is no doubt a much lower-selling card, sales aren't negligible; it's reportedly sold out almost instantly everywhere. And once 16nm FinFET+ comes along in late 2016 / early 2017, we should expect GTX 980 Ti levels of performance to come down to GTX 970 price levels (with even lower power consumption). That's how it has usually worked in the past: each new GPU generation bumps everything down a notch, with the new midrange matching or beating the old high-end in performance.
According to Intel's promo materials, Skylake is supposed to deliver about a 60% iGPU performance improvement over Broadwell. That might put it roughly on par with Pitcairn - a midrange discrete GPU from 2012. A $200 discrete card today (GTX 960 or R9 285) can easily beat that. Even giving Intel the benefit of the doubt, it will probably take until at least Cannonlake (if not longer) before they manage to get iGPU performance up to GM206/Tonga levels. And by that time, thanks to FinFET+, $200 discrete cards will be delivering GM204-level performance at ~125W.
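For what it's worth, here's a quick back-of-the-envelope check of those numbers (treating Broadwell as the baseline, using the ~3x figure above for the GTX 970 and Intel's claimed ~60% Skylake uplift; these are rough marketing-derived ratios, not benchmark data):

```python
# All figures are approximate relative-performance ratios from the discussion above.
broadwell = 1.0              # baseline: Broadwell iGPU, roughly GTX 750 class
skylake = broadwell * 1.6    # Intel's claimed ~60% iGPU improvement
gtx_970 = broadwell * 3.0    # GTX 970 needs roughly 3x Broadwell

print(f"Skylake iGPU vs GTX 970: {skylake / gtx_970:.0%}")  # ~53%
```

So even if Intel's 60% claim holds up, Skylake's iGPU would land at only around half of GTX 970 performance.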
On top of all this, keep in mind that Intel reserves its eDRAM-equipped iGPUs for the most expensive SKUs. For most users, an i3 paired with a cheap discrete GPU is going to offer better perf/$. If Intel wants to take over the market, they need to put their best iGPU on everything except low-power laptop and HEDT/server SKUs.
I just don't see Intel making discrete GPUs obsolete any time in the foreseeable future.