PCWorld magazine did a comprehensive "roundup" of 10 A-V programs in their March 2006 issue. They tested performance in a wide variety of categories, including detection rates for Wild List viruses and "zoo" threats, as well as heuristic detection with one-month-old and two-month-old signatures.
The programs tested were:
1) BitDefender 9 Standard Edition
2) McAfee Virus Scan 2006
3) Kaspersky A-V 5.0 (the new 6.0 version is now available)
4) F-Secure A-V 2006
5) Symantec (Norton) A-V 2006
6) Panda Titanium 2006 A-V plus anti-spyware
7) AntiVir Personal Edition Classic 6.32 (free)
8) Avast Home Edition 4.6 (free)
9) Trend Micro PC-cillin Internet Security 2006
10) Grisoft AVG Free Edition 7.1
I would have liked them to test NOD32, but for some reason they didn't.
To summarize their findings, the Top 4 performers overall were:
1) BitDefender
2) McAfee
3) Kaspersky
4) F-Secure
All 10 programs had 100% detection rates with Wild List viruses. With zoo threats, however, they ranged between a low of 76% (PC-cillin) and a high of 100% (Kaspersky). Of note, F-Secure had a 97% zoo threat detection rate, and BitDefender 9 had a 95% zoo threat detection rate. All of these are excellent. By contrast, Avast (86%), Panda (86%) and AVG (80%) all had notably poorer zoo threat detection rates. The free AntiVir, however, scored 95% with zoo threats.
The biggest difference among A-V programs these days seems to be in their heuristic abilities. The top four performers listed above all did fairly well with one-month-old signatures: BitDefender scored 56%, McAfee A-V 2006 scored 53%, Kaspersky scored 51%, and F-Secure scored 52%.

The remaining six programs all fared very poorly with heuristics w/one-month-old signatures. Scores ranged from a low of 6% (PC-cillin) to a high of 22% (Norton). The free AVG program scored 8%, the free Avast scored 9%, and the free AntiVir scored 11%.
In other words, the programs with low heuristic detection rates are nearly worthless when it comes to detecting new malware that the companies haven't created a signature for yet. BitDefender, McAfee, Kaspersky and F-Secure, however, are quite good at that, with each catching slightly more than half of new malware for which specific definitions hadn't been created yet.
(NOTE: While I have no data to back this up, I would guess that NOD32 would also score well with heuristics, as they are known for their excellence in that area.)
As far as heuristics with two-month-old signatures go, the detection rates predictably drop quite a bit. The top four again performed the best, with detection rates ranging from a low of 26% (Kaspersky) to a high of 38% (BitDefender). The rest of the programs performed extremely poorly in this area, with scores ranging from 3% (PC-cillin) to 16% (Panda Titanium). The three free programs ranged from 4% to 6% heuristic detection rates w/two-month-old signatures -- in other words, pretty worthless.
One thing that surprised me was the lack of testing for rootkits. Since rootkits have now become the malware-du-jour of the cretins who propagate this stuff, a program's ability to detect/repair rootkits should be a significant consideration. AFAIK, as of this writing there are only two A-V programs that scan for rootkits: F-Secure and Kaspersky. More will likely follow, but right now, those are the only two I know of. SpySweeper also claims to scan for rootkits, but it's an anti-spyware program, not a full-fledged A-V program.
As far as the free programs go, AVG had the poorest detection rates in PCWorld's test and AntiVir had the best (although even it scores poorly compared to the top programs).
As we all know, different tests by different labs always yield different results. So take these results for whatever you think they're worth. Reading the description of how they tested, I think their results here are likely pretty credible. In addition to the detection rates tested by PCWorld, there are other things to take into consideration as well: footprint, the ability to scan for rootkits, user-friendliness, scanning speed, response time for creation & release of new definitions, design/interface, tech support, and cost. If you're like me, you read a number of different test reports & reviews, then choose accordingly.
