Of course there's plenty of incentive to develop on NVidia hardware. NVidia has the larger market share for discrete graphics, for instance. NVidia also has extended feature sets like PhysX, CUDA, and 3D Vision that developers may wish to take advantage of.
CUDA is totally irrelevant to gaming. PhysX less so, but it is still basically irrelevant, and on the consoles it might as well not exist at all.
I could just as easily have said "AMD also has extended feature sets like MLAA, Stream, and Eyefinity that developers may wish to take advantage of." Of course there is no need, because the actual hardware is all the devs really want to take advantage of.
Many of the next-gen console games featured at last year's and this year's E3 ran on NVidia hardware, not AMD (Watch Dogs and TitanFall, for example).
That may well be true right now, before any of those games have been released, but it will become much less common going forward. There are no good (gaming) reasons to develop on Nvidia hardware.
The exact same argument you applied at the start (larger market share for discrete graphics) applies far more strongly to the consoles.
Here are the numbers we're realistically looking at:
Midrange to high-end PC gamers (the people with strong enough gaming PCs) - ~20 million total: 13 million Nvidia, 7 million AMD
Console gamers - 10 million, all AMD, by the end of the year.
Each year after that, the console install base will likely grow by ~15-30 million, while the PC side mostly just "upgrades", i.e. the same ~20 million PC gamers as the previous year. The whole "more people use Nvidia cards" argument will be gone within the next six months: 7 million AMD PC gamers plus 10 million console gamers already puts AMD at roughly 17 million against Nvidia's 13 million. By the end of this year more gamers will be on AMD hardware than ever before, and those numbers will only keep rising.
Carmack is practically in tears because he knows it's an AMD hegemony now, and he has always been an Nvidia fanboy.