One step forward, two steps back.
“Everything ought to happen slowly, and out of joint, so we don’t get above ourselves, so we remain miserable and confused”
I never thought I’d use this phrase when talking about the antivirus industry, but that’s what it’s come to. You know, not everything in this world progresses smoothly. Economic realities and the need for new customers often manage to lure even the best over to the dark side. This time, one of the best-known test labs in the AV industry – AV-TEST – has succumbed.
Comparative testing: A bit of background for the uninitiated
How do you go about picking the best of any particular product? And how do you know it’s the best? Well, you would probably start by looking at the results of comparative testing in a specialist magazine, or the online equivalent. I’m sure this is not news to you. The same goes for AV solutions – there are a number of test labs that evaluate and compare a huge variety of antivirus products and then publish the results.
Now, for some unknown reason (below I’ll try and guess why exactly) the renowned German test lab AV-TEST has quietly (there was no warning) modified its certification process. The changes mean that the certificates produced by the new rules are, to put it mildly, pretty useless for evaluating the merits of different AV products.
Yes, that’s right. I officially declare that AV-TEST certification of AV solutions for home users no longer allows product quality to be compared adequately. In other words, I strongly recommend not using their certificate listings as a guide when choosing a solution to protect your home PC. It would be natural to assume that two products with the same certification must be equal (or close to equal) in performance. Under AV-TEST’s new certification standards, the onus is on the user to carefully investigate the actual results of each individual test… they may find that a product that blocked 99.9% of attacks has the same “certification” as a product that blocked only 55%.
Now let’s take a closer look at what happened and why – or a crash course in interpreting AV-TEST results.
The formula for the ideal AV solution was thought up a long time ago. It goes something like this:
- 100% protection and 0% false positives.
- Zero impact on system resources.
- And no questions asked of the user.
- (And, if we want to get into the realms of fantasy, all that has to be provided absolutely free.)
Obviously, that ideal is unattainable, but we can at least aspire to come as close as possible to it and in particular to:
- Catch as many malicious programs as possible – and if something does get through, be able to treat the infection (and to install protection on an already infected computer).
- Minimize the risk of false positives – and if they do occur, get rid of them ASAP.
- “Our integral knows no limits,” claims a good friend of mine, and there are no limits to the work that goes into optimizing use of system memory, processor time, and the number and size of updates downloaded via the Internet. And of course, none of that should impact the level of security.
That all sounds very straightforward. But what does an average user do when he/she looks at dozens of antivirus products? Which is better, and why? Who can rank them in terms of how closely they come to the “ideal AV solution”? (And remember, all those products make convincing claims to being the best there is.)
So, who can we trust to tell us the truth? Independent testers, of course. And that includes AV-TEST.
A few years back the AV-TEST team created a very good method for testing products, and to earn a certificate, products needed to perform nearly flawlessly in each test category. Products were tested on three criteria:
- PROTECTION (prevention of infections),
- REPAIR (cleaning up existing infections),
- USABILITY (ease of use, performance, and number of false positives).
A certificate was issued, or not as the case may be, based on the results (the number of points accumulated). We supported this system and held it up as an example to the other testers in the “premier league” of comparative testing.
So what metamorphosis has taken place at AV-TEST? Why can we no longer trust its certification system?
First of all, the necessary criteria for obtaining a certificate have changed. The important REPAIR parameter for certification has been discarded. What’s the point of an AV solution that’s capable of detecting an infection but is incapable of treating it? (Just imagine at the dentist’s: “Oh, you’ve got tooth decay, but we won’t treat it – we can’t treat it!”) Not so long ago we found out that about 5% of all computers in the world with AV installed are infected! Every twentieth! Clearly, an AV solution’s ability to remove an active infection is of high importance to millions of people around the world.
To their credit, AV-TEST has promised to create a separate, more progressive REPAIR test… but it will no longer affect product certification results, and, most importantly, it will be optional! If security vendors have doubts about the quality of their product’s treatment capabilities, they can simply opt out to avoid a poor test result. At least we can watch who does participate in the test and how they perform, which in itself will be a good indicator of how well an AV solution can cure an infected computer.
Secondly, the threshold for certification has been lowered – you now only have to score 10 points out of a possible 18 to receive an “award”.
Thirdly, USABILITY now only refers to false positives. There’s a world of difference between this idea of usability and usability that takes performance into consideration. Given the current makeup of AV-TEST’s test criteria, false positives might just as well be included in the PROTECTION category to act as a counterbalance (something that most other test labs do anyway).
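To make the arithmetic of the change concrete, here is a toy sketch. The numbers are illustrative assumptions on my part (three categories scored 0–6 points each, 18 possible); only the 10-of-18 overall threshold comes from the new rules described above.

```python
# Toy model of the certification change. The per-category minimum used
# for the "old" rule is an assumption for illustration; the "new" rule
# is the 10-of-18 overall threshold described in the text.

def certified_old(scores, per_category_min=5):
    """Old-style rule: every category must come close to its 6-point max."""
    return all(s >= per_category_min for s in scores)

def certified_new(scores, total_min=10):
    """New-style rule: only the overall total out of 18 matters."""
    return sum(scores) >= total_min

strong = (6, 6, 6)  # near-perfect in every category
weak   = (2, 4, 4)  # poor protection, decent elsewhere

print(certified_new(strong), certified_new(weak))  # True True
print(certified_old(strong), certified_old(weak))  # True False
```

Under the new rule both products walk away with the same award, even though one of them would have failed the old per-category requirement outright.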
So, what are the consequences of this sudden change to AV-TEST’s certification process?
First and foremost, their certification will be seriously devalued. The number of test participants will increase – some AV vendors used to shy away from testing because they knew their level of protection had no chance of earning a certificate. Now just about everyone will have awards from AV-TEST, with only the most idle or inept being left out.
In fact, based on these lowered criteria, it’s reasonable to conclude that even a bare-bones AV program could achieve “certification” from AV-TEST. And why not? There’s no need to look for new malware – all you need to do is monitor the flow from public online multi-engine scanners like VirusTotal. There’s no need to analyze anything either – simply set up a multi-scanner and “detect” files that others have already detected (matching them by MD5 hash to make sure there are no false positives). Then design an interface, add a mini-updater, throw in a couple of Windows functions to simulate continuous protection, stick an icon in the system tray, wrap it all up in an installer and bingo! Send it off to AV-TEST and await your certificate!
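For illustration only, here is a hypothetical sketch of that hash-matching shortcut: an exact MD5 lookup against a blocklist harvested from other engines’ verdicts. It is not any real product’s code – just a demonstration of why the trick yields zero false positives while contributing no detection of its own.

```python
# Hypothetical sketch of the copycat "detection" shortcut: flag a file
# only if its exact MD5 hash already appears in a blocklist copied from
# other engines' verdicts. By construction this never misidentifies a
# clean file (no false positives) - and never catches anything the
# copied engines haven't seen first.

import hashlib

def md5_of(path):
    """Hash a file in chunks so large files aren't read into memory at once."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(path, known_bad_hashes):
    """"Detect" a file only if its exact hash was already flagged elsewhere."""
    return md5_of(path) in known_bad_hashes
```

Note the flip side of the zero-false-positive guarantee: change a single byte of a sample and its hash no longer matches, so such a “product” adds nothing to actual protection.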
Basically, the balance between evaluating security technology and “usability” has been lost. The individual test results are still valid and will continue to be followed closely by security industry wonks. But the lowered threshold for AV-TEST certification will, unfortunately, mean this certification is mostly useless to the average consumer trying to make an informed decision when purchasing an AV solution.
And there you have it… Now for the next part.
The question is: why the hell did AV-TEST do it? Why make things worse for themselves and others?
Little is known about the real reasons (AV-TEST has made almost no comment about the changes), but we can try to figure it out for ourselves. And first we need to look at the economics of the testing business.
Yes, testing is a business with its own economy. I understand that very well. Performing a good test is not just a matter of brains – it requires investment in infrastructure, office space and salaries. And just like many businesses, there is a correlation between quality and profit. Sometimes companies consciously lower the quality to increase profit. In the short term that approach can really pay off. But in the long term it leads to degradation and oblivion.
Could this be the case here? The hardest test – treating an infected system (REPAIR) – has been removed from the “mandatory program”. Now all you need to do is test products against a collection of malware and clean files and… voila!
Yes, the new procedure simplifies the testing process and will likely bring new clients to AV-TEST, who will gladly dish out “medals” so that every antivirus website can display one. But the test lab will lose its uniqueness, as well as the trust of the more technically minded AV vendors on which the entire industry relies.
I’m certainly not condemning the desire to earn more $$$$$! With the right priorities in place it is a good measure of a business’s success and the quality of its products and services. Of course, it helps to enhance quality even more and, to everyone’s benefit, earn even more :) The testing industry is no exception. Lots of companies in this sphere are going down the road of diversification, entering new test niches (AV-TEST has been involved in this too), digging not only deeper but also wider, and improving their professional karma. But maybe here, chasing a couple of extra cents will lead to a descent into the mainstream, devaluing their verdicts to the level of “certificates for all” at a fair price.
The next logical question is – why are our products still in the AV-TEST system?
First of all, it is our principled position that the more tests there are, the more objective the assessments. We are not afraid of anything. We have confidence in our technology and our level of protection, and if we have any complaints about one test or another, we say so directly and publicly.
Moreover, AV-TEST runs lots of other useful tests and certifications, including in the corporate and mobile product sectors. The treatment function is of less importance for corporate clients than it is for home users: even in small companies there is usually a sysadmin and a backup system, which together can beat an infection if an AV product can’t.
To sum up
- With this new certification process, we do NOT RECOMMEND users take the AV-TEST certification into consideration when selecting a solution to protect their home computers.
- However, we believe it is OK to take into consideration the results of the separate PROTECTION and PERFORMANCE categories and, of course, the promised REPAIR test. Again, we don’t disagree with the methodology of their individual tests…we are taking issue with what scores are required in these tests to achieve “certification.”
- It is in the interests of AV-TEST to listen to the views of AV industry experts in order to create a truly adequate certification system that helps users make an informed decision when choosing a product. This involves, for example, returning to AMTSO (Anti-Malware Testing Standards Organization) and discussing points of contention with the representatives of virtually all the leading vendors and industry experts.