Are you sure? Unless Apple created some magic that any other industry or academic engineers haven't figured out...
Off-topic a little bit, but this is exactly what happened with iOS and MobileSafari. I think it's for another discussion, but please feel free to point out how Apple implemented GPU acceleration on iOS in this instance. I haven't seen any industry or academic engineers figure that part out. At least at Google's HQ. They have partially figured it out at Microsoft for sure, but not quite.
Upsampling 44.1kHz to 96kHz typically is not a good idea; the target should be an integer multiple of 44.1kHz. All the sampling rate tells you is that the digital waveform can accurately reproduce frequencies up to half of it (the Nyquist frequency), hence the choice of 44.1kHz in the first place: 22.05kHz max real audio frequency, comfortably above the human hearing range while keeping to a reasonable data rate for 1980.
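Just to put numbers on that (a trivial Python sketch of the Nyquist relation, nothing more):

```python
# Nyquist: a sample rate of fs can represent audio frequencies up to fs / 2.
for fs_hz in (44_100, 88_200, 96_000, 176_400, 192_000):
    print(f"{fs_hz:>7} Hz sampling -> {fs_hz // 2} Hz max representable frequency")
```

So 44.1kHz covers everything up to 22.05kHz, already past the limit of human hearing.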
Upsampling, and correspondingly raising the bits per sample, can help with audio quality because less aggressive cutoff filters can then be used. The high-order low-pass filters needed to filter standard 44.1kHz audio can be detrimental to audio quality compared to resampling the stream at, say, 88.2kHz or 176.4kHz, which only requires interpolating every second sample, or three out of every four samples, from the original data. The bandwidth of that signal would then be much higher and less aggressive filters could be used. But you're still interpolating, and converting 44.1kHz to 96kHz is a tricky and not very transparent operation, unlike 88.2 or 176.4 would be. How do you interpolate at a ratio of 96/44.1 = 2.1768707... output samples per input sample? Actually, I wouldn't be surprised at all if the results were worse than just using aggressive filtering on standard 44.1kHz, 16b/sample CD-quality audio.
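To illustrate the ratio problem (a quick Python sketch; I'm not claiming any real resampler works exactly this way, just showing why the numbers are awkward):

```python
from fractions import Fraction

SRC = 44_100  # CD sample rate
for dst in (88_200, 176_400, 96_000):
    ratio = Fraction(dst, SRC)  # automatically reduced to lowest terms
    if ratio.denominator == 1:
        print(f"{SRC} -> {dst}: clean integer ratio of {ratio}")
    else:
        # a rational resampler would have to upsample by the numerator
        # and then downsample by the denominator
        print(f"{SRC} -> {dst}: awkward ratio {ratio.numerator}/{ratio.denominator}")
```

88.2kHz and 176.4kHz come out as clean 2x and 4x, while 96kHz reduces to 320/147: a rational resampler would have to upsample by 320 and downsample by 147, which is a much hairier operation.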
There is an 88.2kHz intermediate rate in between. I suspect that when set to 96kHz, it doesn't just convert everything to 96kHz. If it converted 44.1kHz material to 88.2kHz instead, then as you said, that would be the better choice.
In fact, it's limited to 96kHz only in Mac OSX. Under Windows, the driver support also goes up to 176.4kHz and 192kHz.
I have two theories for the missing higher rates. The first is that the chip really is capable of 192kHz, but Apple exposes only 96kHz as a way of saying it's actually running at 192kHz with heavy filtering. In that case, the 32-bit float processing capability shown under Mac is just there for show.
The second is that the 32-bit float capability is real, and it can actually convert using floating point instead of integers. In that case, any arbitrary ratio, like the 2.1768707... you mentioned, should work just fine. Single-precision floating point (32-bit) has a 24-bit significand, roughly 7 decimal digits of precision, so the precision should be quite high. Of course, this assumes the DSP is really 32-bit, which from what I've read is not too far-fetched.
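Here's roughly what that precision looks like in practice (pure Python, just rounding the conversion ratio through IEEE 754 binary32; obviously not a claim about what the chip actually does):

```python
import struct

def to_f32(x: float) -> float:
    # round-trip a Python float through 32-bit IEEE 754 single precision
    return struct.unpack('<f', struct.pack('<f', x))[0]

ratio = 96_000 / 44_100        # 2.1768707482993...
f32_ratio = to_f32(ratio)
# binary32 has a 24-bit significand, so about 7 decimal digits survive
print(ratio, f32_ratio, abs(ratio - f32_ratio))
```

The error from storing that ratio in single precision lands somewhere below a millionth, which is plenty for representing a resampling ratio.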
Not that any of this really matters that much, since it's going through a cheap audio I/O chip and a cheap output jack, but the point is that bigger numbers in the audio world do not necessarily mean better sound. Actually, I'm sure all the chips you mentioned above perform just fine. What really matters is the part after the DAC: the analog amplifier circuit. High-quality opamps and well-specced analog components are much more important, along with good noise shielding and optimal placement within the chassis.
And as mentioned, the Cirrus Logic 4206a is not a chip you see in just about every laptop out there, so its capability is not directly comparable to every Realtek chip. I wouldn't call myself an audiophile per se, but boosting the sampling rate to 96kHz under Mac OSX does give much higher clarity and separation to the sound going through my ATH-M50 headphones, and through my dad's ancient but still-kicking Pioneer setup.
I do realize that there are Windows laptops with the same capability. Heck, my dad's Vaio also does 192kHz. But it doesn't sound as nice to my ears. It could be just me, and do take note that I'm not saying Apple added any special sauce; that's what you and vbuggy decided. I simply stated that it sounded better than most PC laptops.
Perhaps it's misleading to say that 96kHz on Mac sounds better than 96kHz on Windows. For that, I apologize. But there is a difference, and whether you believe it or not is really up to you, I guess. The fact is that the audio chip is different, and I don't think you can just group it together with the usual laptop suspects.