Tag Archives: awards

Online conference – Chinese style (complete with pioneering-tech superstition).

Normally, my work schedule is made up of all sorts of meetings, press interviews, exhibition appearances and conference speeches all over the globe. Normally. Not this year, darn it!

Now, some of the events I get to are one-offs. Some are regular, recurring ones (mostly annual) that I get to only once in a while. And some recurring events I deem simply must-attend. One of my main must-attends every fall or early winter is the World Internet Conference in Wuzhen, organized by the Cyberspace Administration of China, which I’d participated in every year since 2015 (up to 2019, that is) – just a year after its inauguration. This year, alas – no traditional trip to eastern China; however, much like here at K, not being able to be present in person doesn’t mean a big and important event can’t still go on. Which is great news, as it means I could still get what I want to say across to the main players of the Chinese internet – state regulators, heads of provinces and regional development institutes, and the bosses of the Chinese big-tech companies – and all from a huge screen: perhaps the biggest I’ve ever seen!

Sure, it would have been nice to be there in person – to stroll around the quaint narrow cobbled streets of the ancient town (dating back to the Tang dynasty, apparently) and take a boat ride along its canals – which indeed some folks did manage to do, somehow. But I was playing it safe. Still, the plentiful in-person activity at the venue is at least cause for optimism during these remote-everything times.

But now for the main thing: about Wuzhen superstition…

Read on…

Patently great work.

Last month was a great month for K-intellectual property. So nice to get such good news to brighten up dull, damp, dreary March days.

But we’ve had other great months IP-wise of late too…

In September of last year, for the second year in a row we were included in the Derwent Top 100 Global Innovators listing, making us the first – and only – Russian company to get onto this meticulously researched list of the world’s 100 most innovative organizations! Hurray!

A few details about this top-100: every year the independent U.S. company Clarivate Analytics picks the world’s most innovative companies based on the quality of their patent portfolios. In particular, Clarivate selects its top-100 based on the following four criteria (a quick illustrative sketch follows the list):

  1. How successful a company is with its patent applications in actually being granted patents;
  2. How global a company’s innovations are;
  3. How often a company’s patents are cited elsewhere (in other companies’ patent applications); and
  4. The total number of patents a company has.
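
Clarivate doesn’t spell out its exact formula (at least not here), so, purely for illustration, here’s a toy sketch in Python of how a composite score over those four criteria might look. Everything in it – the equal weights, the assumption that each criterion is pre-normalized to a 0–1 scale, the function name – is a made-up placeholder, not Clarivate’s actual methodology:

    # Purely illustrative -- not Clarivate's real Derwent Top 100 scoring.
    # Assumes each criterion has already been normalized to a 0-1 scale;
    # the equal weighting is a placeholder assumption.
    def innovator_score(success: float, globalization: float,
                        influence: float, volume: float) -> float:
        """Toy composite of the four criteria (each in [0, 1])."""
        criteria = (success, globalization, influence, volume)
        return sum(criteria) / len(criteria)

    # E.g. a high grant-success rate, broadly global filings, well-cited
    # patents, and a mid-sized portfolio:
    print(innovator_score(0.9, 0.8, 0.7, 0.5))  # 0.725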

This year eight IT players made the list: Amazon, Facebook, Google, Microsoft, Oracle, Symantec, Tencent and us! Nice to be rubbing shoulders with such worthy contemporaries!

Now for an update on the numbers from our IP team, who never cease to amaze with their hard work and successful results: our patent practice was established back in 2005; since then our patent portfolio has grown from zero to 930+ patents obtained in Russia, the U.S., Europe, China and Japan! On top of that, we have more than 500 patent applications pending; we’ve won nine court cases, two are ongoing, and we’ve lost none!

In short, we continue to fight – and beat – patent trolls. Trolls – take note!

That’s all for today folks. See you again tomorrow!…

The extraordinary things I’ve done and seen – in the year of our Lord twenty-nineteen!

Hi folks, and – belatedly – Happy New Year!

Trust you all had happy, jolly, merry holidays. I sure did!…

All righty. Let’s get on… by looking back.

So, as per usual on these here blogpages of mine around this time of year, herewith, my round-up of last year: a summary of the facts and figures and countries and flights and tourisms and volcanoes and excursions and monasteries and walkabouts and treks and all the rest; woah – already for the sixth time (2014, 2015, 2016, 2017, 2018)!

But… why?

Well, self-indulgence plays a part, for sure (especially when it comes to things like my total number of flights and other stats:). However, I’m lucky enough to travel to and experience a lot of highly interesting places/stuff, which I’m fairly certain will be of interest to some of you, dear readers. And you might not have read every post of the year (there are a lot of them). Accordingly, surely a pint-sized review of the ‘greatest hits’ of the year (including ringing in the New Year in the Ecuadorian mountains in a hot-spring swimming pool some 3,600 meters above sea level!) will be worth something? I hope so, anyway. OK, so – rationale out of the way, let’s get on with this…

The just-mentioned hot springs in Ecuador:

Read on…


Reaping the fruits.

The year is coming to a close, and it’s only natural to sum up the various results of the last 12 months. So here’s a triple whammy of good news:

1) Our business security solution won the Gartner Peer Insights Customers’ Choice award in the Endpoint Security Solutions category for the third consecutive year. This year it scored 4.6 points out of five, based on 1,747 reviews from real users. Gartner Peer Insights is an independent platform where corporate customers can leave positive or negative feedback on the products they use and give them scores. Customers’ Choice winners are determined by the scores vendors receive from users, taking into account both the quality and the quantity of the feedback; Gartner ensures there are no bots or cheats. We’re constantly improving our products and refining our features, and winning this rating for the third consecutive year is the best proof that users see it and appreciate it.
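
As an aside: Gartner doesn’t publish the exact Customers’ Choice formula, but to illustrate the quality-plus-quantity idea, here’s a toy Bayesian-style average in Python. The neutral prior of 3.0 and its weight of 50 are made-up assumptions; the point is simply that a high mean score counts for more when it’s backed by lots of reviews:

    # Toy illustration only -- not Gartner's actual methodology.
    # A product's mean rating is shrunk toward a neutral prior; the fewer
    # reviews it has, the further it gets pulled, so quantity matters too.
    def weighted_score(mean_rating: float, n_reviews: int,
                       prior_mean: float = 3.0,
                       prior_weight: int = 50) -> float:
        return (mean_rating * n_reviews + prior_mean * prior_weight) / (
            n_reviews + prior_weight
        )

    print(weighted_score(4.6, 1747))  # ~4.56: many reviews, barely moves
    print(weighted_score(4.6, 10))    # ~3.27: same score, few reviews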

Read on…

Pleasant news from China.

Privyet all!

I’m lying low in MOW at the mo, but that doesn’t mean life comes to a standstill – far from it!

While I sit here in my office looking out the window at the falling snow, over in China, in the city of Wuzhen, the annual World Internet Conference is taking place (which I was at last year). And this year the organizers have decided to give awards to the best (in their opinion) cyber-projects. And guess who featured among the winners?!

Congratulations to everyone on the project! Our solution for protecting industrial installations and critical infrastructure – KICS – won the award for ‘World Leading Internet Scientific and Technological Achievements’, alongside Tesla, IBM Watson and Alibaba!

The contest was entered by 500 companies, and we were among the 15 winners – and the only one from the IT security field.

Best test scores – the fifth year running!

Quicker, more reliable, more techy, and of course the most modest…

… Yep, you guessed it, that’ll be us folks – YET AGAIN!

We’ve just been awarded Product of the Year once more by independent Austrian test lab AV-Comparatives. Scoring top @ AV-C is becoming a yearly January tradition: 2011, 2012, 2013, 2014, and now 2015! Hurray!

Now for a bit about how they determine the winner…

Read on: Five main criteria…

Yerevan lectures.

Ladies and gents!

I’m never one to blow too hard on my own proverbial trumpet, but I just have to play you this little bit of Miles-inspired jazz fusion.

Last week (November 17) I was awarded the Armenian President’s 2015 IT Award for Global Outstanding Contributions in the Field of Information Technology!

Read on: Lecturing hard…

AV boost: exorcising the system-straining ghost.

Around the turn of the century we released the LEAST successful version of our antivirus products – EVER! I don’t mind admitting it: it was a mega-fail, overall. Curiously, the version also happened to be mega-powerful when it came to protection against malware, and had settings galore and all sorts of other bells and whistles. The one thing that let it down was that it was large, slow and cumbersome – particularly when compared with our previous versions.

I could play the woulda-coulda-shoulda game here and start asking obvious questions like ‘who was to blame?’ and ‘what should have been done differently?’, but I’m not going to do that (I’ll just mention in passing that we made some very serious HR decisions back then). I could play ‘what if?’: who knows how different we as a company would be now if it weren’t for that foul-up? Best, I think, is to simply state how we realized we’d made a mistake, went back to the drawing board, and made sure our next version was way ahead of the competition on EVERYTHING. Indeed, it was the engine that pushed us to domination of global antivirus retail sales, where our share continues to grow.

That’s right, our post-fail new products were ahead of everybody else’s by miles, including on performance, aka efficiency, aka how much system resources get used up during a scan. But still that… stench of sluggishness pursued us for years. Frankly, the smelliness is still giving us some trouble even today. Memories are long, and they often don’t listen to new facts :). Also, back then our competitors put a lot of effort into trolling us about it – and still try to do so. Perhaps that’s because there’s nothing else – real or current – to troll us for :).

Now though, here… it’s time for some well-overdue spring cleaning: time to clear up all the nonsense that’s accumulated over the years re our products’ efficiency, once and for all…

Righty. Here are the results of recent antivirus product performance tests. Nothing but facts from a few respected testing labs – and it’s great food for thought. Have a look at the other vendors’ results, compare, and draw your own conclusions:

1. AV-Test.org

I’ve said many times that if you want to get the truly objective picture, you need to look at the broadest possible range of tests from the longest possible historical perspective. There are notorious cases of certain vendors submitting ‘cranked-up’ versions optimized for specific tests to test labs, instead of the regular ‘working’ versions you get in the shops.

The guys from the Magdeburg lab have done one heck of a job in analyzing the results achieved by 23 antivirus products during the past year (01/2014 – 01/2015) to determine how much each product slowed the computer down.

[Chart: AV-Test performance test results, 01/2014 – 01/2015]

No comment!

Read on: valuable advice on assessing test results…

Independent AV testing in 2014: interesting results!

At KL we’re always at it. Improving ourselves, that is. Our research, our development, our products, our partnerships, our… yes – all that. But for us all to keep improving – and in the right direction – we all need to work toward one overarching goal, or mission. Enter the mission statement…

Ours is saving the world from cyber-menaces of all types. But how well do we do this? After all, a lot of – if not all – AV vendors have similar mission statements. So what we and – more importantly – the user need to know is precisely how well we perform in fulfilling our mission compared to all the rest…

To do this, various metrics are used. And one of the most important is expert testing of the quality of products and technologies by different independent testing labs. It’s simple really: the better the result on this or that criterion – or all of them – the better our tech is at combating cyber-disease, and the better we objectively save the world :).

Thing is, out of all the hundreds of tests by the many independent testing centers around the world, which should be used? I mean, how can all the data be sorted and refined to leave hard, meaningful – and easy-to-understand-and-compare – results? There’s also the problem of there being not only hundreds of testing labs but also hundreds of AV vendors, so, again, how can it all be sieved – to separate the wheat from the chaff and then compare just the best wheat? There’s one more problem (it’s actually not that complex, I promise – you’ll see:) – that of biased or selective test results, which don’t give the full picture – the stuff of advertising and marketing since year dot.

Well, guess what. Some years back we devised the following simple formula for accessible, accurate, honest AV evaluation: the Top-3 Rating Matrix!

So how’s it work?

First, we need to make sure we include the results of the comparative anti-malware protection investigations of all the well-known, respected, fully independent test labs over the given period of time.

Second, we need to include all the different types of tests of the chosen key testers – and on all participating vendors.

Third, we need to take into account (i) the total number of tests in which each vendor took part; (ii) the % of ‘gold medals’; and (iii) the % of top-3 places.

What we get is simplicity, transparency, meaningful sifting, and no skewed ‘test marketing’ (alas, there is such a thing). Of course it would be possible to add into the matrix another, say, 25,000 parameters – just for that extra 0.025% of objectivity, but that would only be for the satisfaction of technological narcissists and other geek-nerds, and we’d definitely lose the average user… and maybe the not-so-average one too.

To summarize: we take a specific period, take into account all the tests of all the best test labs (on all the main vendors), and don’t miss a thing (like poor results in this or that test) – and that goes for KL of course too.
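
To make that recipe concrete, here’s a minimal sketch in Python of how such a matrix could be computed. The data layout (one record per test/vendor pair) and the 35% participation threshold reflect the description in this post and the disclaimers further down; the function and field names themselves are just illustrative assumptions:

    from collections import defaultdict

    def top3_matrix(results, min_participation=0.35):
        """Toy Top-3 Rating Matrix.

        results: iterable of (test_id, vendor, place) tuples, where place
        is the vendor's finishing position in that test (1 = 'gold medal').
        Returns {vendor: (tests_entered, pct_gold, pct_top3)}, keeping
        only vendors that entered at least min_participation of all tests.
        """
        all_tests = {test_id for test_id, _, _ in results}
        entered = defaultdict(int)
        golds = defaultdict(int)
        top3s = defaultdict(int)
        for test_id, vendor, place in results:
            entered[vendor] += 1
            if place == 1:
                golds[vendor] += 1
            if place <= 3:
                top3s[vendor] += 1
        return {
            v: (n, 100 * golds[v] / n, 100 * top3s[v] / n)
            for v, n in entered.items()
            if n / len(all_tests) >= min_participation
        }

    # Two tests, three vendors; with a 60% threshold, vendor C (which
    # entered only one of the two tests) gets filtered out:
    demo = [("t1", "A", 1), ("t1", "B", 2), ("t1", "C", 4),
            ("t2", "A", 3), ("t2", "B", 1)]
    print(top3_matrix(demo, min_participation=0.6))
    # {'A': (2, 50.0, 100.0), 'B': (2, 50.0, 100.0)}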

All righty. Theory over. Now let’s apply that methodology to the real world; specifically – the real world in 2014.

First, a few tech details and disclaimers for those of the geeky-nerdy persuasion:

  • Considered in 2014 were the comparative studies of eight independent testing labs (all with years of experience, the requisite technological set-up (I saw some of it for myself), outstanding industry coverage – both of the vendors and of the different protective technologies – and full membership of AMTSO): AV-Comparatives, AV-Test, Anti-malware, Dennis Technology Labs, MRG EFFITAS, NSS Labs, PC Security Labs and Virus Bulletin. A detailed explanation of the methodology – in this video and in this document.
  • Only vendors taking part in 35% or more of the labs’ tests were taken into account. Otherwise it would be possible to get a ‘winner’ that did well in just a few tests, but which wouldn’t have done well consistently over many tests – if it had taken part in them (so here’s where we filter out the faux-test marketing).

Soooo… analyzing the results of the tests in 2014, we get……..

….Drums roll….

….mouths are cupped….

….breath is bated….

……..we get this!:

Independent testing 2014: the results

Read on: Are all washing powder brands the same?…

Doctor Doctor.

բարեւ բոլորին!

// Not sure if Google translated ‘Hello everybody’ into Armenian correctly. This is just to flag that I was in this exotic (for most readers) country, as usual for a nice mix of business and pleasure – both covered below.

Last week I had the honor of receiving a prestigious academic award from the State Engineering University of Armenia: an honorary doctorate – ‘for an outstanding contribution in the field of information security’ – handed to me by the university’s rector.

KL/SEUA backgammon!

Hurray! And thank you!

This makes me a doctor in two countries! I’m now a ‘British-Armenian academic’, as some scoffed :) (my first doctorate was from Plymouth Uni).

Oops, beg your pardon – the above pic was a bit of fun. Here come the ‘proper’ photos…

Read on: proper photos and Armenian landscapes…