I don't know a lot of things, but I do know a little bit about operating systems.
Look, comparing the numbers of CERT advisories is pretty much useless, except that it makes for good propaganda. Most advisories for Linux are about stuff like buffer overflows and other things that COULD cause problems. And patches are usually supplied, or the advisories are about obsolete versions of services that should have been replaced long ago. If you notice, an advisory on Windows tends to be more serious, because for the admin community to become wise to the problem it usually means some crackers have already broken into servers using those exploits; hell would freeze over before MS actually admits to a problem. The closest MS comes to admitting a problem before it becomes an issue is to push out patch #600438231354 or whatever in the next round of updates. If there is a problem with a common version of OpenSSH that was released 3 years ago and it wasn't discovered until the end of this month, you know it's gonna be plastered everywhere by its own developers if nothing else.
Another reason comparing advisory counts is bad is the nature of development of a Linux OS. There is really no "native" application that comes with "Linux" like there is with Windows. In Linux everything is third party: if you use Debian you have something like 4 different FTP servers to choose from, 2 SSH servers, maybe 3 or 4 different types of web servers, etc. If you take a look at Red Hat, probably less than 3-5% of the code you get from a Red Hat distro was actually created by Red Hat. So to compare the number of vulnerabilities of Windows and Linux accurately, you would have to compare not only the stuff that comes on a Windows CD, but all the third-party software that would COMMONLY be used on Windows servers. And that's not even fair to Linux or Windows; it's just another reason why counting advisories alone is pointless.
Another bad argument is when people say closed source software is more secure because crackers can examine open source code for mistakes and can't do that with W2K, for example. Which is true, except that in the real world it isn't. With open source you have a couple of advantages, the major one of course being that the more "eyes" you have examining the code, the better off it will be. OpenBSD is a good example of this. You can actually make programs really hard to crack just by eliminating as many mistakes as possible. Oh, and Windows tends to lose its closed-source advantage because, well, the code's been leaked. Crackers have the code to actually look over, unlike us law-abiding citizens. Saying closed source is secure by nature is akin to saying that making guns illegal will make it harder to commit gun crime. It actually makes murder/violence/theft/rape (etc.) easier: it just reassures the criminal that their victims will be unarmed, and makes the illegal trafficking of undocumented weapons profitable.
I am not saying that by default the development of Linux leads to a more secure OS. All we have to point to is anecdotal and historical evidence that Linux/Unix is generally more secure than Windows. Which is, no doubt, true by that evidence. Whether it is due to the limited number of such servers out there or to the level of competence of the different types of admins is fairly pointless in that context.
What we really need is a conclusive long-term study. You would need to set up server tests to figure this out conclusively.
My example would be:
1. Several boxes set up, with a prize sent to whoever cracks each one first. You would have to monitor the number of attacks and the success rate, and you would have to set up different classes of boxes, with many combinations of the following: boxes set up to provide the maximum number of services to the Internet; boxes set up to provide domain services and such for LANs, but sitting behind "faulty" firewalls; autonomous systems that are expected to survive alone in the wild; an infrastructure where you have an "environment" based on a network of certain OSes. Set them up as default installations, installations by a "bad" admin, and installations by a "good" admin, etc. Mix and match that sort of thing.
2. Set up honeypots in the wild, don't tell anyone until the study is over, and see how people go about trying to compromise them.
And I really doubt anyone is going to do that study; it would be very expensive, and no company is THAT sure of its product to sponsor that sort of thing. It could backfire horribly.
Oh, and as far as Windows' "easy to use" nature goes: it's probably because most admins have been using MS products for at least 5 years or more. Stick a Mac user in front of them and laugh at how long it takes them to figure out the left-click thing. If an admin has been using Unix/Linux for 5 years and you sit them in front of a Windows interface for a couple of days, you ain't gonna hear about easy usage at all.
Maybe my grandpa couldn't use Linux, and all he can do is double-click the AT&T icon to get on the Internet on his Windows box. However, all his computer stories end up with him going, "Well, it froze up and all these windows were open, so I just go like this!" he proudly announces as he motions with 2 fingers on his left hand and one finger on his right hand on the table. "SO I did that a couple times and the damn thing just went black and blue, so I just kicked at the red light on the power-thingy and did a re-kick." Only damn thing he ever really learned to do well.