Well, using the standard GPU cores for denoising means they can't be used for normal rasterized rendering at the same time. I suspect the tensor cores would do it better and free up the normal cores for other work. A future patch for BF5 is meant to move denoising over to the tensor cores.
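As a toy illustration of why denoising eats into shader time (Python/NumPy, purely illustrative — real-time denoisers like Nvidia's AI denoiser are vastly more sophisticated): every tap of a spatial filter is arithmetic that, run conventionally, occupies the same cores the rasterizer needs, which is exactly the work that offloading to tensor cores would free up.

```python
import numpy as np

def box_denoise(img, radius=2):
    """Naive spatial denoiser: average each pixel over a square window.
    Every tap here is ALU work that, on a GPU, would occupy the same
    shader cores the rasterizer needs (edges wrap for simplicity)."""
    k = 2 * radius + 1
    acc = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (k * k)

# Simulate a noisy low-sample-count ray-traced image: a flat 0.5 signal
# plus Gaussian noise, as you'd get from ~1 sample per pixel.
rng = np.random.default_rng(0)
noisy = 0.5 + rng.normal(0.0, 0.2, (64, 64))
denoised = box_denoise(noisy)

# Averaging ~25 independent samples cuts the noise variance roughly 25x.
print(f"noise variance: {noisy.var():.4f} -> {denoised.var():.4f}")
```

Even this crude 5x5 box filter is 25 reads and adds per pixel; production denoisers do far more, which is why doing it on dedicated hardware instead of the shader cores matters.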
DX does need to add support for this, but right now the implementation is proprietary to Nvidia - not just because they make the cards with the hardware on them, but because all the training hardware and software belongs to Nvidia too.
Hence it's not just a matter of adding AI support: MS would also need the back-end AI supercomputers, and to write the software for those supercomputers to generate the algorithms - something I guess they will do for the Xbox 2 (which is bound to have some form of ray tracing/AI).
The thing is, much of that is done in the final stages, and it'll still be doing that even with ray tracing. It's part of the final processing stage of the image, so if the denoising is integrated into that final single pass, as with some AA modes, all the better. I'm not sure how much you'd really be freeing up, since it'd likely be done anyway. Frankly, I don't get running algorithms in the cloud just to push them to specialized hardware - you could do the same analysis and just use it to tweak methods already in use. The other thing is that, without pushing the specialized hardware, you'd likely end up with more traditional rasterization transistors, and those can be used for more. So run the analysis and tweak the settings for traditional rasterizing, which you could already do (wasn't that even the point of GFE?).
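The point about folding denoising into the existing final pass can be sketched like this (NumPy, purely conceptual - `fused_post` and the Reinhard-style tonemap are my stand-ins, not any real engine's pipeline). On a GPU the win from fusing passes is touching framebuffer memory once instead of twice; the math itself is unchanged:

```python
import numpy as np

def box_blur(img, radius=1):
    """Toy denoiser: average over a (2r+1)x(2r+1) window (edges wrap)."""
    k = 2 * radius + 1
    acc = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (k * k)

def tonemap(x):
    """Simple Reinhard-style operator, standing in for the final image pass."""
    return x / (1.0 + x)

def fused_post(img, radius=1):
    """Denoise + tonemap folded into one pass: on a GPU this would be a
    single full-screen dispatch reading/writing memory once, not two."""
    k = 2 * radius + 1
    acc = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return tonemap(acc / (k * k))

rng = np.random.default_rng(1)
hdr = rng.random((32, 32)) * 4.0  # fake HDR frame

# Two separate passes vs. the fused pass give identical pixels;
# only the memory traffic and scheduling differ on real hardware.
two_pass = tonemap(box_blur(hdr))
one_pass = fused_post(hdr)
print(np.allclose(two_pass, one_pass))
```

Which is the sense in which integrating denoising into the existing final pass saves little extra: the work was going to be done in that stage anyway.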
Er, you do know Microsoft is the one that made the ray-tracing API, right? You also know they have already been doing a lot of that analysis (they touted extensive analysis feeding the design of Project Scorpio, for instance) and were even running AI on their Azure cloud, right? That was something they touted with the One announcement, so it's not some sudden thing. I'd guess they're ahead of everyone on AI implementation in gaming (as far as having the hardware in place and software implementation goes). So I expect Microsoft will do AI like they did with the One, largely in the cloud (again, I don't get this "figure out the algorithm in the cloud only to push it to specialized hardware" approach when you can skip the specialized hardware and just run it from the cloud).
And I'm skeptical that there will be much beyond superficial ray-tracing support on the consoles (though maybe we'll see that allegedly faster/easier lighting method Nvidia touted so much early on with RTX). Unless Microsoft rolls their own ray-tracing hardware - in which case RTX might be dead in the water for gaming unless it's quite similar, or somehow a lot more efficient, such that it could run Microsoft's version without much of a hit. Microsoft controls the API, after all, so they'd likely know what hardware would be good for running what they want from it. I'm sure they'll tout it, but much like how they touted DX12 on the One, I don't think it'll actually be much of a big deal (maybe they'll have some Geometry Wars-esque game to show it off; Sony has had something similar, with sort-of-indie or even puzzle games that had big flashy graphics but weren't terribly complex beyond that, the way typical modern games are). I personally don't see ray tracing taking off until they put cloud resources behind it, where it'll provide the grunt to do it more extensively (so it'll actually wow people properly) and with better performance.
The other thing to take into account: Microsoft is already going to be moving at least a half-step towards doing things in the cloud as they move towards streaming. Microsoft themselves said as much (they have two consoles coming - one traditional, and a second that is streaming-focused but will do some local processing, since networks aren't quite there yet for full game streaming). I really can't see them pushing specialized hardware to consoles when that stuff is almost guaranteed to be done in the cloud in the future. Furthermore, much of that is already integrated into the traditional GPU pipeline on AMD's GPUs, so it'd be weird to double up on it with more specialized hardware. They'd be smarter to just have the back end do it and stream the result.
I expect they'll do some of this stuff and they'll talk it up, but I'm not expecting anything crazy. And the new CPU and GPU should bring substantial improvements, such that I think early games will focus on reaping those benefits (higher resolutions, draw distances, NPCs/onscreen characters, etc).