What should we draw from AV detection rate test findings?

January 13, 2012

Posted in: Network Security Trends

Testing desktop antivirus products has always been tricky: in a test lab, you are attempting to simulate all the types of malware in the real world and all the interesting and exciting ways they can be introduced onto a client system. Today, I think, even well-crafted tests can at best offer some basic guidance for enterprises, rather than clear best-of-breed results.

Thus, the results posted in the 2011 annual Summary Report from AV-Comparatives.org should be reviewed in that context. In fact, AV-Comparatives makes the clear point that enterprises need to evaluate products in-house and shop for the features and capabilities they deem most important for their organizations. I definitely recommend it as a resource if you are thinking about switching vendors or want to confirm the efficacy of your current vendor.

Kaspersky emerges as “Product of the year” based on the testing of products from 20 vendors, with “Advanced+” ratings across the board in all tests, which were conducted at different times throughout 2011. Among other things, AV-Comparatives tested detection rates (penalizing products that registered too many false positives) and the ability to detect new and unknown malware.

That’s a good place to pause. AV product detection rates for known malware have always been high, so it’s no surprise that the top performers in this test scored in the high 90s. The not-so-good news is that the top performers’ detection of new or unknown malware runs at perhaps 60% at best in the AV-Comparatives tests, taking false positives into account. Testing in this area is very difficult, and different labs using different methodologies produce different results. The common thread is that AV products generally miss a lot of malware.

In the good old days, this was not so much of an issue. Six, perhaps even seven years ago, when Ed Skoudis of InGuardians conducted a group review of business-caliber AV products (something like 15, I think) for Information Security magazine, we made a conscious decision not to rate them on the number of signatures. Not everyone agreed with this approach, but our feeling at the time was that AV signatures were something of a commodity. Everyone had about the same signatures, and the question of whether Vendor A got a signature six hours before Vendor B this time, or Vendor C got one first the next time, was more noise than substance.

So Ed focused on things like central management, ease of use, ease of updating, performance, etc. Interestingly, we were curious about spyware, which was just beginning to emerge as a serious security concern at that time, prompting a fistful of small consumer-oriented vendors to grow their tools quickly into business-grade products. Of the products tested, only Pest Patrol (many of us remember the creepy bugs crawling across our screens when we navigated to their website) detected most of the samples thrown at it.

The rest were generally pathetic, with detection rates of 20% or less. Traditional AV vendors scrambled to catch up to the new threats, but it took some time, and the spyware catchers thrived in the meantime. I recall talking to the CISO of a large pharmaceutical company who lamented having to put Webroot on his company’s PCs in addition to Symantec because he had no choice at that time.

That marked a sea change, far more than most of us imagined. Malware was getting far more varied, more insidious, and harder to detect, and, perhaps most important, was being delivered with criminal intent. Signature-based detection of known malware, while still necessary, has become less significant. Malware writers have gotten really clever at hiding and morphing their attacks, and the sheer number of unique samples, now in the tens of millions annually, is daunting if not overwhelming. AV vendors employ better heuristics, anomaly detection, host-based IPS and, increasingly, reputation-based (especially file-based reputation) detection methods in an effort to come close to keeping up. High detection percentages seem less impressive in the face of the hordes of malware.
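To make “file-based reputation” a bit more concrete, here is a minimal sketch of how a client-side reputation check might work: hash the file, then ask a cloud service whether that hash is known-good, known-bad, or unknown. The endpoint, request format, and verdict values below are hypothetical, invented purely for illustration; real vendors’ reputation services are proprietary and far more sophisticated.

```python
import hashlib
import json
import urllib.request

# Hypothetical reputation service endpoint -- not a real API.
REPUTATION_URL = "https://reputation.example.com/lookup"

def file_sha256(path):
    """Compute the SHA-256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def lookup_reputation(path):
    """Ask the (hypothetical) cloud service for a file's reputation.

    Returns one of "known_good", "known_bad", or "unknown".
    """
    req = urllib.request.Request(
        REPUTATION_URL,
        data=json.dumps({"sha256": file_sha256(path)}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("verdict", "unknown")

# A desktop agent might quarantine "known_bad" files immediately and
# pass "unknown" files on to heuristic or behavioral analysis.
```

The appeal of this approach is that the verdict comes from telemetry across millions of endpoints rather than from a signature pushed to each client, which is one way vendors try to cope with the flood of unique samples.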

Ironically, in a recent audio cast for SecureSMB online, I asked Ed Skoudis if there was really much difference among the established AV vendors as far as detection is concerned. He gave a bit of a sigh and said no, not really; they might leap-frog each other at times, but detection rates are not the bottom line in making your selection. So the lesson now, as it was then, is to look for the features that are most important for your organization and decide if they justify switching vendors. If one solution is cheaper, for example, or has more robust central management, is it worth the time and effort required to rip and replace? Is one AV suite more desirable because of how it fits with the vendor’s other security offerings deployed in your enterprise? Is performance on PCs (one of the tests conducted by AV-Comparatives) an important consideration for your end users’ productivity?

And, above all, bear in mind:


  • You need other detection tools, including network-based monitoring, analysis and IPS, working in concert with desktop AV.

  • YOU WILL BE BREACHED. Have the tools and procedures in place to detect the bad stuff that is already on your network and respond accordingly.


