Tag Archives: awards

Pleasant News from China.

Privyet all!

I’m lying low in MOW at the mo, but that doesn’t mean life comes to a standstill – far from it!

While I sit here in my office looking out the window at the falling snow, over in China, in the city of Wuzhen, the annual World Internet Conference is taking place (which I was at last year). And this year the organizers have decided to give awards to the best (in their opinion) cyber-projects. And guess who featured among the winners?!

Congratulations to all the project team members! Our solution for protecting industrial installations and critical infrastructure – KICS – won the award for ‘World Leading Internet Scientific and Technological Achievements’, alongside Tesla, IBM Watson and Alibaba!

The contest was entered by 500 companies, and we were in among the 15 winners – and the only one from the IT security field.

Best test scores – the fifth year running!

Quicker, more reliable, more techy, and of course the most modest…

… Yep, you guessed it, that’ll be us folks – YET AGAIN!

We’ve just been awarded Product of the Year once more by independent Austrian test lab AV-Comparatives. Scoring top @ AV-C is becoming a yearly January tradition: 2011, 2012, 2013, 2014, and now 2015! Hurray!


Now for a bit about how they determine the winner…

Read on: Five main criteria…


Yerevan lectures.

Ladies and gents!

I’m never one to blow too hard on the proverbial own trumpet, but I just have to play you this little bit of proverbial Miles-inspired jazz fusion.

Last week (November 17) I was awarded the Armenian President’s 2015 IT Award for Global Outstanding Contributions in the Field of Information Technology!


Read on: Lecturing hard…


AV boost: exorcising the system-straining ghost.

Around the turn of the century we released the LEAST successful version of our antivirus products – EVER! I don’t mind admitting it: it was a mega-fail – overall. Curiously, the version also happened to be mega-powerful too when it came to protection against malware, and had settings galore and all sorts of other bells and whistles. The one thing that let it down though was that it was large and slow and cumbersome, particularly when compared with our previous versions.

I could play the subjunctiveness game here and start asking obvious questions like ‘who was to blame?’, ‘what should have been done differently?’, etc., but I’m not going to do that (I’ll just mention in passing that we made some very serious HR decisions back then). I could play ‘what if’: who knows how different we as a company would be now if it wasn’t for that foul-up? Best though I think is to simply state how we realized we’d made a mistake, went back to the drawing board, and made sure our next version was way ahead of the competition on EVERYTHING. Indeed, it was the engine that pushed us into domination in global antivirus retail sales, where our share continues to grow.

That’s right, our post-fail new products were ahead of everybody else’s by miles, including on performance, aka efficiency, aka how much system resources get used up during a scan. But still that… stench of sluggishness pursued us for years. Well, frankly, the smelliness is still giving us some trouble today. Memories are long, and they often don’t listen to new facts :). Also, back then our competitors put a lot of effort into trolling us – and still try to do so. Perhaps that’s because there’s nothing else – real or current – to troll us for :).

Now though, here… time for some well-overdue spring cleaning. It’s time to clear up all the nonsense that’s accumulated over the years re our products’ efficiency once and for all…

Righty. Here are the results of recent antivirus product performance tests. Nothing but facts from a few respected testing labs – and it’s great food for thought. Have a look at the other vendors’ results, compare, and draw your own conclusions:

1. AVTest.org

I’ve said many times that if you want to get the truly objective picture, you need to look at the broadest possible range of tests from the longest possible historical perspective. There are notorious cases of certain vendors submitting ‘cranked up’ versions optimized for specific tests to test labs instead of the regular ‘working’ versions you get in the shops.

The guys from the Magdeburg lab have done one heck of a job in analyzing the results achieved by 23 antivirus products during the past year (01/2014 – 01/2015) to determine how much each product slowed the computer down.


No comment!

Read on: valuable advice on assessing test results…

Independent AV testing in 2014: interesting results!

At KL we’re always at it. Improving ourselves, that is. Our research, our development, our products, our partnerships, our… yes – all that. But for us all to keep improving – and in the right direction – we all need to work toward one overarching goal, or mission. Enter the mission statement…

Ours is saving the world from cyber-menaces of all types. But how well do we do this? After all, many, if not all, AV vendors have similar mission statements. So what we – and, more importantly, the user – need to know is precisely how well we perform in fulfilling our mission compared to all the rest…

To do this, various metrics are used. And one of the most important is the expert testing of the quality of products and technologies by different independent testing labs. It’s simple really: the better the result on this or that – or all – criteria, the better our tech is at combatting cyber-disease – to objectively better save the world :).

Thing is, out of all the hundreds of tests by the many independent testing centers around the world, which should be used? I mean, how can all the data be sorted and refined to leave hard, meaningful – and easy to understand and compare – results? There’s also the problem of there being not only hundreds of testing labs but also hundreds of AV vendors, so, again, how can it all be sieved – to separate the wheat from the chaff, and to then compare just the best wheat? There’s one more problem (it’s actually not that complex, I promise – you’ll see:) – that of biased or selective test results, which don’t give the full picture – the stuff of advertising and marketing since the year dot.

Well guess what. Some years back we devised the following simple formula for accessible, accurate, honest AV evaluation: the Top-3 Rating Matrix!

So how’s it work?

First, we need to make sure we include the results of all well-known and respected, fully independent test labs in their comparative anti-malware protection investigations over the given period of time.

Second, we need to include all the different types of tests of the chosen key testers – and on all participating vendors.

Third, we need to take into account (i) the total number of tests in which each vendor took part; (ii) the % of ‘gold medals’; and (iii) the % of top-3 places.

What we get is simplicity, transparency, meaningful sifting, and no skewed ‘test marketing’ (alas, there is such a thing). Of course it would be possible to add into the matrix another, say, 25,000 parameters – just for that extra 0.025% of objectivity, but that would only be for the satisfaction of technological narcissists and other geek-nerds, and we’d definitely lose the average user… and maybe the not-so-average one too.

To summarize: we take a specific period, take into account all the tests of all the best test labs (on all the main vendors), and don’t miss a thing (like poor results in this or that test) – and that goes for KL of course too.

All righty. Theory over. Now let’s apply that methodology to the real world; specifically – the real world in 2014.

First, a few tech details and disclaimers for those of the geeky-nerdy persuasion:

  • Considered in 2014 were the comparative studies of eight independent testing labs (with: years of experience, the requisite technological set-up (I saw some for myself), outstanding industry coverage – both of the vendors and of the different protective technologies – and full membership of AMTSO): AV-Comparatives, AV-Test, Anti-malware, Dennis Technology Labs, MRG EFFITAS, NSS Labs, PC Security Labs and Virus Bulletin. A detailed explanation of the methodology – in this video and in this document.
  • Only vendors taking part in 35% or more of the labs’ tests were taken into account. Otherwise it would be possible to get a ‘winner’ that did well in just a few tests, but which wouldn’t have done well consistently over many tests – if it had taken part in them (so here’s where we filter out the faux-test marketing).
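For the curious, the filtering-and-scoring logic described above can be sketched in a few lines of code. This is purely an illustrative reconstruction – the function name, the data shape and the tie-breaking order are my own assumptions, not the actual methodology implementation:

```python
# Hypothetical sketch of the Top-3 Rating Matrix described above.
# Input: a list of (lab, test_id, vendor, place) tuples, where place 1 = 'gold'.

def top3_matrix(results, min_participation=0.35):
    # Count the distinct tests held across all labs in the period.
    total_tests = len({(lab, test) for lab, test, _, _ in results})

    # Accumulate per-vendor participation, gold medals and top-3 finishes.
    stats = {}
    for lab, test, vendor, place in results:
        s = stats.setdefault(vendor, {"tests": set(), "gold": 0, "top3": 0})
        s["tests"].add((lab, test))
        if place == 1:
            s["gold"] += 1
        if place <= 3:
            s["top3"] += 1

    matrix = []
    for vendor, s in stats.items():
        participated = len(s["tests"])
        # Filter out vendors that cherry-picked only a few favorable tests.
        if participated / total_tests < min_participation:
            continue
        matrix.append({
            "vendor": vendor,
            "tests": participated,
            "gold_pct": 100.0 * s["gold"] / participated,
            "top3_pct": 100.0 * s["top3"] / participated,
        })

    # Rank by share of top-3 finishes, then by share of gold medals.
    return sorted(matrix, key=lambda r: (-r["top3_pct"], -r["gold_pct"]))
```

So a vendor that entered only one test and won it would be excluded by the participation filter, while a vendor that consistently placed in the top three across many tests rises to the top – which is exactly the anti-cherry-picking point made above.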

Soooo… analyzing the results of the tests in 2014, we get……..

….Drums roll….

….mouths are cupped….

….breath is bated….

……..we get this!:

Independent testing 2014: the results

Read on: Are all washing powder brands the same?…

Doctor Doctor.

բարեւ բոլորին!

// Not sure if Google translated ‘Hello everybody’ into Armenian correctly. This is just to flag that I was in this exotic (for most readers) country, as usual for a nice mix of business and pleasure – both covered below.

Last week I had the honor of receiving a prestigious academic award: the State Engineering University of Armenia conferred on me an honorary doctorate – ‘for an outstanding contribution in the field of information security’ – handed to me by the university’s rector.

KL/SEUA backgammon!

Hurray! And thank you!

This makes me a doctor in two countries! I’m now a ‘British-Armenian academic’, as some scoffed :) (my first doctorate was from Plymouth Uni).

Oops, beg your pardon – the above pic was a bit of fun. Here come the ‘proper’ photos…

Read on: proper photos and Armenian landscapes…

Magdeburg: AVant garde.

There’s a Russian saying that translates roughly as ‘live a century, you’ll be amazed for a century’. Meaning, I reckon, that just when you think you’ve seen it all, you find you haven’t. For me, this applied to the trip I made to the city of Magdeburg recently, for it did just that – amazed.

On the whole the place is a little dull and provincial (in my opinion, that is; but then again – I do live in Moscow most of the year :). There’s the river (the Elbe, but here it’s still quite meager), the impressive banks thereof, the equally impressive walls of the castle (restored) and the gothic cathedral. There’s not a great deal besides that. Apart from one feature that makes up for all that dullness…

In the center of the city there’s a totally incongruous large residential/commercial building known as the Green Citadel of Magdeburg. Just check out the colors, shapes and patterns! Have you seen anything quite like it?

The artist responsible for this architectural aberration is Friedensreich Hundertwasser, a Gaudi for the late 20th century. This is just one of the many buildings he transformed into a masterpiece across central Europe – in his totally original and mind-blowing style.

This Austrian was a true maverick, so I’m a fan for sure. He believed that folks shouldn’t live in box-like houses that are all the same, and that inhabitants should be encouraged to paint or in some other way change the walls around them. And that meant interior walls too. He was also into converting disused factories into avant garde pieces of art.

Enough words. Now for some pix:


More: What were we doing here in the first place?