Not all AV tools are created equal: Uproar from AV vendors kicks off round two

It appears that round two of Untangle's antivirus "fight club," which started with Verdict of "live" test: Not all antivirus tools are created equal, has begun.

McAfee officials and others are taking issue with the methodology of the test, in which various proprietary products were pitted against one another, as well as against the open-source ClamAV, at the recently concluded LinuxWorld Expo.

The main contention was that such a narrow test of just 35 viruses was simply unfair when compared to the hundreds of thousands of malware samples out in the wild.

According to eWeek, the test consisted of three sets of viruses:

The first batch was a basic test set that Morris [CTO of Untangle] described in a blog post as a universal test set. The second set was the "in-the-wild" test of viruses picked from Morris' mailbox, which he had received over the years in mass quantities, and the third group of viruses was submitted by users.

The other contention was that "there clearly was a problem with configuration," wrote David Harley, a security researcher. He added:

By the tester's own admission... it looks as though the products were tested pretty much "out of the box" without considering whether the conditions of the test would disadvantage specific default configurations.

In my previous posting on this "Fight Club," some of you folks asked for an independent and full-fledged source of test results. Well, it appears that AV-Test is the organization to look at. Under the maxim of "Independent, qualified, and fast," they bill themselves as the "leading company in the range of testing and analyzing antivirus software."

No test results appear to be published on its Web site — though thankfully, PC Mag has previously posted the results of a recent 'shootout' dated May 22, 2007, from AV-Test. In it, 29 antimalware products were tested against (check this out) 606,901 malware samples. Products were tuned to their most aggressive detection options.

Here are the top 10 results. I have nothing against ClamAV, but note that ClamAV does not appear in the top few or even top 10 this time round:

Program      # Detected   Detection %
WebWasher       605,846        99.83%
AVK 2007        604,255        99.56%
AntiVir         603,408        99.42%
F-Secure        594,333        97.93%
Symantec        593,355        97.77%
Kaspersky       592,606        97.64%
Fortinet        589,028        97.06%
Avast!          584,574        96.32%
AVG             583,541        96.15%
Rising          582,772        96.02%
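As a quick sanity check on the table above, the detection percentages can be recomputed from the raw detection counts and the 606,901-sample total. This is just a sketch; the product names and counts are taken straight from the published results:

```python
# Recompute each product's detection rate from the reported counts
# against the 606,901-sample malware set used by AV-Test.

TOTAL = 606_901  # size of the AV-Test malware sample set

results = {
    "WebWasher": 605_846,
    "AVK 2007": 604_255,
    "AntiVir": 603_408,
    "F-Secure": 594_333,
    "Symantec": 593_355,
    "Kaspersky": 592_606,
    "Fortinet": 589_028,
    "Avast!": 584_574,
    "AVG": 583_541,
    "Rising": 582_772,
}

for name, detected in results.items():
    pct = detected / TOTAL * 100  # detection rate as a percentage
    print(f"{name:<10} {detected:>8,}  {pct:.2f}%")
```

Running this reproduces the published figures to two decimal places, e.g. 593,355 / 606,901 works out to Symantec's 97.77%.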

Before you hop over to view the full results, do indulge me by taking the following poll. As noted by several TechRepublic members in my earlier post, many popular AV tools were not included as options in the poll, for which I am deeply apologetic. Even AVG Free edition, which I use on my personal laptop, was left out.

So as part of my penance, there are 20 options this time round. (That is the maximum I can set!)

Now, there appear to be many complaints about Symantec AV products not detecting viruses that were easily detected by many competing AV products.

This is honestly puzzling given its 97.77% detection rate in this pure detection test by an independent party. Its noted slowness aside, my gut feeling is that much malware actively scans for Symantec's AV scanner and somehow disables and/or neutralizes it.

What is your opinion of Symantec's (anecdotal) poor showing?




Paul Mah is a writer and blogger who lives in Singapore, where he has worked for a number of years in various capacities within the IT industry. Paul enjoys tinkering with tech gadgets, smartphones, and networking devices.
