What I meant about maturing is that graphics technology isn't changing as dramatically as it was a few years ago. We aren't having the huge changeovers we saw from DX7 to DX8.x, or from DX8.x to DX9. There isn't going to be another DX version until Longhorn, which will be arriving in mid-to-late 2006 or early 2007, and that means no SIGNIFICANT advances in graphics tech until then. Sure, there's the inevitable new feature or new technology, as well as speed increases, but the BASIC way things are done hasn't changed since the first DX9 (i.e. shader) chips, such as the R300 and NV30. Of course, the architectures have improved, and in NVIDIA's case been completely revamped. But there's still basically the same limit on the number of instructions a shader can execute (though it has increased with the advent of SM3.0 and SM2.0a/2.0b). As I said, GPUs will of course get faster, and some kind of new technology will emerge, but as far as graphics maturation is concerned, developers will learn to better take advantage of the technology that's already there; there won't be anything significantly new that they'll have to learn.
Make sense?