October 18, 2011
The Holy Grail of AV Testing, and Why It Will Never Be Found
So, my expectations were fulfilled. My recent post on an AV performance test caused more than a bit of a stir. But that stir was not so much on the blog as in and around the anti-malware industry.
In short, it worked – the facts of the matter are now out in the open and being actively discussed. But that’s not all: let’s hope it won’t just stimulate discussion, but also bring about much-needed change in the way AV tests are done – change that is years overdue, and that I’ve been “campaigning” for for years.
So, how should AV be tested?
Well, first, to avoid insults, overreaction and misplaced criticism, let me just say that I’m not here to tell testers how to do their job in a certain way so that our products come out on top – to have them use some special recipe of ours that we know we’re better at than everyone else. No, I’m not doing that; and anyway, it’s rare that we don’t figure in the top three in various tests, so, like, why would I want to?
Second – what I’ll be talking about here isn’t something I’ve made up; it’s based on established industry standards – those of AMTSO (the Anti-Malware Testing Standards Organization), on whose board sit representatives of practically all the leading AV vendors, plus various authoritative experts.