THE PRAGMATIST’S GUIDE TO AI CYBERSECURITY SKILLS

18/10/2019

Spencer Young, Regional Vice President EMEA at Imperva

 

Security teams – like the rest of the business world – are used to hearing about how AI and automation are going to change their jobs, hopefully for the better. Research from LinkedIn recently found that listings for specialist machine learning roles have increased sixfold over the last year. But the sheer number of new skills employees would need to master to become experts in emerging AI technology poses a major problem. Ask too much of cybersecurity professionals in terms of upskilling, and the overall balance of work actually tips in the wrong direction, leaving them overburdened.

But it’s nevertheless the case that AI will have a transformative impact on cybersecurity practice over the coming few years – so no-one can afford to ignore it. How can you achieve true balance?

The answer can be found in the tale of Goldilocks and the three bears: not too much training, not too little – just the right amount. AI training programmes should be carefully aligned to business need and individual responsibility, paring the mountain down into usable, and necessary, chunks. It’s the same principle that guides all technology change: don’t overburden your teams with masses of learning, but also don’t allow your company to slip behind the competition through fear of change.

That balance is particularly tough when it comes to AI. It’s been so heavily trailed over the last few years that the whole world seems to be holding its breath for the inevitable robot invasion. But in reality, we’re still a long way from truly independent artificial intelligence. Too often, when people talk about AI, they mean some variant of automation or machine learning – processes which still require a good deal of human input and oversight.

We’re still essentially at the level of human-guided computer programmes – but as the complexity of their functions increases, so does the amount of training required to run them. With that in mind, here are three key areas to consider when implementing AI and automation.

 

Which skills are essential?

Before you get started, dedicate plenty of time to working out which machine learning solutions are best matched to your security needs. Don’t flood your security team with new tools just because they sound good – do they address problems your company faces on a regular basis?

For example, automated email text analysis is likely to be useful for most companies, helping their filters spot suspicious messages with more accuracy. Response orchestration, on the other hand, may be less urgent for large security teams with plenty of manpower than for small ones without the bandwidth to manage every task.
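To give a rough sense of what the email text analysis mentioned above can look like under the hood, here is a minimal naive Bayes sketch in Python. The training messages, labels, and function names are invented for illustration – a production filter would use a large labelled corpus, richer features, and a proper ML library rather than this toy word-count approach:

```python
from collections import Counter
import math

# Toy training data: (message, label) pairs. A real system would be
# trained on thousands of labelled emails, not four hand-written lines.
TRAIN = [
    ("urgent verify your account password", "suspicious"),
    ("click here to claim your prize now", "suspicious"),
    ("meeting moved to 3pm see agenda", "legitimate"),
    ("quarterly report attached for review", "legitimate"),
]

def train(examples):
    """Count word frequencies per label for a naive Bayes scorer."""
    counts = {"suspicious": Counter(), "legitimate": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def score(counts, text):
    """Return the more likely label, using log-probabilities with
    add-one (Laplace) smoothing so unseen words don't zero out a score."""
    words = text.lower().split()
    vocab = set()
    for c in counts.values():
        vocab.update(c)
    best_label, best_logp = None, float("-inf")
    for label, c in counts.items():
        total = sum(c.values())
        logp = sum(
            math.log((c[w] + 1) / (total + len(vocab) + 1)) for w in words
        )
        if logp > best_logp:
            best_label, best_logp = label, logp
    return best_label

counts = train(TRAIN)
print(score(counts, "please verify your password now"))  # → suspicious
```

The point of the sketch is that even a simple statistical model needs curated training data and human judgment about labels and thresholds – exactly the kind of upskilling a team takes on when it adopts such a tool.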

By selecting only those solutions that address essential needs, you can keep upskilling requirements to a minimum, whilst ensuring you have the best possible defence in place.

 

Lead from the front

Once you’ve decided which AI solutions will benefit your business, it’s a good idea to establish new ‘skill heads’ – for example, to manage automated detection of insider threats – in tandem with upskilling existing staff. By providing a single point of contact for each new skill, you provide not only a friendly face to answer questions, but also a role model for others in the company to follow – a humanisation of the job in hand. ‘We’re rolling out a new system’ sounds like a lot of work; ‘Anna is going to show you how she’s been using this new tool’ is more manageable.

Giving one person responsibility for the uptake and rollout of the new skill also helps to ensure that any wrinkles are picked up quickly and dealt with, and that updates and changes to the system are quickly folded into the practice of the business. It also saves each team member from having to learn the whole system from scratch – with a single point of contact, the skill head can act as a one-stop shop for advice on that particular system.

 

Remember the reality of AI

A realistic view of how advanced AI technology currently is should always underpin your security training and purchasing. Businesses need to invest time and money without chasing trends. Working with an experienced provider can help – having a knowledgeable voice to advise you on which solutions are worth the time can make the difference between improving your security team’s day-to-day life and piling extra work onto them.

AI is set to be one of the most effective labour-saving, accuracy-improving additions to the cybersecurity toolbox. 2019 will see it move ever closer to the mainstream, while automation and machine learning become commonplace. Make sure that your team can make the most of these new technologies by being realistic about your training regime – those who pile too much on without support will find their investments have the opposite effect.

 

Finance Derivative is a global financial and business analysis magazine, published by FM.Publishing. It is a yearly print and online magazine providing broad coverage and analysis of the financial industry, international business and the global economy.
