Vendors question open source AntiVirus results

Results from a presentation focusing on the accuracy of AntiVirus software are at the centre of controversy concerning the methodology used for testing.

Since publishing the results of the AntiVirus fight club at the recent LinuxWorld conference, organisers from Untangle have been met with a storm of queries, criticisms and complaints about their methodology and the accuracy of the reports. The AntiVirus fight club has become a magnet for criticism from AntiVirus vendors and security researchers alike.

The list of stated problems begins with the absence of many users' anti-virus scanner of choice: popular products such as Avast!, NOD32, BitDefender, TrendMicro, Panda, and AVG were not tested. With so many popular anti-virus solutions left out, many are wondering how useful the fight club results are for determining the relative effectiveness of anti-virus solutions. One contributor commenting on the results on the Untangle Web site, SandyJ, said: "Based on what I have seen so far I have little faith in your work as a reasonable measure of AV software."

A second issue involves the remarkably poor results of the WatchGuard gateway, particularly as it uses the same engine as ClamAV, which was highly recommended by the organisers. The disparity led some observers to wonder whether WatchGuard's virus scanning module was enabled in the first place. A spokesperson from WatchGuard contacted Builder AU with the following response:

"WatchGuard contends that these results are not valid. We just can't understand how these test results came out the way they did. WatchGuard simply wouldn't be in business if these results were legit. Thankfully, however, the proof that they are not valid is right there in the results. WatchGuard uses ClamAV for their AntiViral engine. As you saw, ClamAV got a near perfect score in the test. In order to get the results that they did, ClamAV would have to be turned off in the WatchGuard product."

Many security professionals commenting on the Untangle blog criticised the test for its small sample size. The AntiVirus shootout tested virus scanners against only 18 different viruses, each pulled from the organisers' inboxes. For comparison, another virus shootout from PC Magazine tested virus scanners against more than 600,000 viruses, worms and trojans.

David Harley, from UK security consultancy firm Small Blue-Green World, said in a comment on the Untangle blog:

"Let's assume (purely for the sake of argument) that all your 18 samples are valid, ItW [In the World] viruses (clearly they're not, but bear with me) and that your methodology is perfect (ahem.) What have you proved? That some scanners catch more of your samples than others. How many viruses are ItW at this moment? According to the latest WildList, 525. What, on the basis of your test, can you tell us about how well each of those scanners performed on the other 507? Nothing whatsoever..."

Dirk Morris, the organiser of the fight club, did not respond to our queries; however, he posted the following follow-up on the competition:

"Based on some responses I received I think there is some confusion about the purpose of the demo. The fightclub was NOT a antivirus shootout... We didn't test zero-day viruses. It wasn't a coverage test (we only had a few samples). We didn't do a functionality comparison. There are many important aspects of an antivirus product that we didn't even touch or mention. The goal of the fightclub was to promote discussion around open and transparent testing."

The results of the fight club can be found on the Untangle blog. Ten AntiVirus applications were tested against three categories of samples: EICAR test files, viruses pulled from the organisers' e-mail inboxes, and samples submitted by observers.
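For readers unfamiliar with the EICAR category, the sketch below shows what such a check might look like in practice: it writes the industry-standard EICAR test string (a harmless file that compliant scanners are expected to flag) to disk and runs ClamAV's clamscan command over it. This is not Untangle's test harness; the file name, the use of Python, and the overall structure are illustrative assumptions only.

import subprocess

# The standard 68-character EICAR test string. It is not malicious, but
# anti-virus scanners are expected to report it as infected.
EICAR = (
    r"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
    r"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
)

def main() -> None:
    # Write the test string to disk exactly as specified (plain ASCII, no padding).
    # The file name "eicar.com" is an arbitrary choice for this sketch.
    with open("eicar.com", "w", newline="") as fh:
        fh.write(EICAR)

    # Invoke ClamAV's command-line scanner on the file.
    # clamscan exits with 1 when it finds a virus and 0 when the file is clean.
    result = subprocess.run(
        ["clamscan", "eicar.com"], capture_output=True, text=True
    )
    print(result.stdout)
    if result.returncode == 1:
        print("Scanner flagged the EICAR test file (the expected outcome).")
    else:
        print("Scanner did NOT flag the EICAR test file.")

if __name__ == "__main__":
    main()

A real shootout would repeat this kind of check across many samples and scanners, which is where critics argue that 18 samples is far too few to draw conclusions.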
