Again, in a well-designed system that's not a problem, as the new version of X won't be installable until Y is available too. If the libraries themselves changed and introduced a compatibility problem, you can't avoid that even on Windows.
Maybe I'm not making myself clear; you sound like a pretty experienced Linux user. From what you are saying, yes, at the LEVEL of the OS things aren't typically running amok, but at times they do.
At the small application level too, I have seen this happen all the time; no offense, but it doesn't sound like you are using that much software in Linux.
Look, let's make sure we are talking about and including ALL software from A-Z that is available for Linux. I'm not just talking about the major mainstream here, but anything and everything that is out there for Linux.
Oh please tell me when that's actually happened to you. 99% of my apps are GTK and I have never had a problem with dependencies, incompatibilities or anything relating to library changes. There's no doubt it's possible, but it's also possible that you'll die in a fiery car crash tomorrow and that doesn't stop you from leaving your house, right?
I can tell you, just last month I had problems with dependencies in Gentoo, dealing with Acidrip.
That is the point of a distribution: they maintain the packages for you. They build and test them before they're uploaded to the servers for everyone else to use, and most have an intermediary server too, so that more people can test them before they're considered good and pushed out to everyone else.
Ok WAIT, hehe, this is where we fouled up in this conversation: not every distro out there can maintain every piece of software available for Linux. Some maintain more than others, but even the big names sometimes don't keep up with projects that are themselves considered big. So then end-users have to go out, find an RPM from somewhere else, and install that.
But time and time again I have seen packages, maintained even by distros as big as Ubuntu, that ran like crap.
Did you know (and they might still be doing it) that Ubuntu used to compile all their packages for the Pentium CPU? You know what that means for people that use AMD CPUs?
Really poor performance. For example, I have an AMD XP 3000+, which is plenty fast enough for MPlayer, but because Ubuntu did this, MPlayer would not run at all for me.
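To spell out the mechanism behind that complaint: a binary distro has to pick one compiler target for everybody, while a source-based install can tune for the CPU actually in the box. A sketch of the build-flag difference (the exact values here are illustrative and depended on the GCC version of that era):

```
# One-size-fits-all target a binary distro might ship:
# runs on any Pentium-class CPU, but uses none of the newer instruction sets.
CFLAGS="-O2 -march=i586"

# A build tuned for an Athlon XP, as a source-based install could use.
CFLAGS="-O2 -march=athlon-xp"
```

Per the GCC i386 option documentation of the time, `-march=athlon-xp` enabled MMX, 3DNow!, and SSE support, which is exactly what a generic Pentium build leaves on the table.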
Also, I remember last year MPlayer and transcode were lacking support in Ubuntu, meaning that to get things working you had to go outside the "stable" tree to download and use them, and say hello to more issues with unstable packages.
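For anyone who hasn't done this, going outside the supported set on Ubuntu of that era meant editing your APT sources by hand. A hypothetical /etc/apt/sources.list line (the release name "breezy" is just an example from that period; which components a given package needed varied):

```
# Enable the community-maintained and licensing-restricted components,
# where packages like MPlayer lived:
deb http://archive.ubuntu.com/ubuntu breezy universe multiverse
```

Once APT pulls from components like these, you are running packages with far less QA behind them, which is precisely the trade-off being complained about here.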
Once again, IME that almost never happens. That's called a regression, and if it does happen it's a bug and should be fixed before the package makes it to end users. The only thing that's even remotely problematic is the closed-source nVidia drivers, and that's because I go off on my own and don't use the prebuilt packages in my distribution; if I did, I wouldn't have to worry about them either.
We are talking bugs here across many things: hardware support, compile flags built into the software for certain functions, and hardware compatibility.
No developer can build a program to handle every likely situation with hardware, software, etc. that it will encounter on someone's box. This is the reality of software on any operating system.
Then I would say that things have changed since you ran that site; either that, or those particular libraries were crap and probably aren't being used any more.
Nothing has changed; bugs are bugs, and as long as people make mistakes in code and are not perfect, we will have problems. What makes you think we live in a bug-free world in Linux?
Actually, I would say the big names have the biggest chance for problems, especially things like the XFree86->Xorg transition and just about any current Linux kernel update, since development is moving so fast. And actually udev has been a bit of a problem child, but Linus has banged it into gregkh's head that backwards compatibility is important, and I doubt udev will be much of a problem any more. But because the Debian X maintainers took their time and did it right, I barely noticed the transition to Xorg. In fact, I had to go look at the package versions to believe that it had happened. And that's how it's supposed to work; it's nice to have competent people behind the scenes.
Big or little, it doesn't matter; they all face issues. I've seen it from the biggest to the smallest.
Look, I'm not pulling your leg here. I've used, and still use, Linux professionally, and my level of experience is extremely high.
I think you might be looking at this from one side, because all the things I am saying are correct and happen to people on a daily basis.
If you are a Linux user and a lover of the OS, you should hang out in some of the big channels on freenode (#debian, #gentoo, #ubuntu, just to name a few) and see all the problems people face that I have been describing.
I know this for a fact, because I have dealt with it as a software packager for Slackware for 7 years.
For the sake of argument: you and many others might never face these problems, or hardly ever see them, but there are those that will, and this is a fact.
So for all those that don't have problems, guess again: there are that many more that do, and always will. That's why they are called bugs; some get them, some don't.
I always seem to have to explain this concept of bugs to people. 
ALOHA