In a separate report, ESG found that 18% of organizations believe their existing cybersecurity team can’t keep up with mounting threats, and 22% believe their security team is not large enough to protect their organization. Unsurprisingly, over half of these organizations reported suffering a breach within the prior two years in which their skills shortage was a contributing factor. And this volume of breaches caused by understaffed security teams is only likely to increase in the coming years, as the cybersecurity skills gap continues to grow toward 2 million unfilled jobs by the end of next year and the cost of cybercrime is projected to reach $6 trillion by 2021.
Much has been written about this skills shortage, but one big point has been missed: most organizations have been trying to solve their cybersecurity problems entirely by hiring new cybersecurity staff, often to perform work that machines can complete. If these organizations embraced machine-driven automation, analytics, and AI, which can deliver faster detection and response than any human-only team, they would substantially reduce their need to hire cybersecurity staff in large volumes and narrow much of the security skills gap we’re seeing, all while increasing the efficiency and effectiveness of their security services.
Some CISOs are waking up to this and beginning to look to AI as a mechanism that can supply the security skills the market cannot. Yet while AI will take on many of the jobs that currently sit unfilled, certain subtle elements of the cybersecurity skills gap will remain in play. In fact, while AI solves parts of the skills gap, its deployment may create new areas of skills shortage.
What Jobs AI Can Replace, and What Jobs AI Can’t
Before we dive into the skills shortage being created by AI, it’s essential to understand what role AI can actually play in cybersecurity, and what skills the technology can replicate.
While we utilize AI in every step of our comprehensive left-to-right-of-hack Managed Detection & Response program, we admit that AI plays a much greater role in some steps than in others. AI primarily allows us to monitor, analyze, and answer questions about massive volumes of data. Our AI platform, AI.saac, can analyze hundreds of terabytes of threat data at a time, allowing us to continuously hunt through every corner of our clients’ networks for anomalous behavior. AI can not only process this high volume of data; it can also correlate behavior throughout the network, uncovering previously unknown attack patterns. And while AI provides valuable assistance once a threat has been uncovered and must be responded to, it primarily supports detection activity: threat anticipation, triage, threat hunting, and incident and threat analysis and investigation.
In short, AI mostly automates repetitive cybersecurity activities revolving around data collection and scrubbing, primarily related to threat detection, while still relying on human experts to handle most of the higher-order tasks of cybersecurity. In this way, AI will help close the cybersecurity skills gap, but it will mostly fill the relatively simple roles that are easier to hire and train for, leaving a large gap in the more challenging, in-demand roles that require significant experience and insight.
To make matters worse, the widespread deployment of AI in cybersecurity is already creating a new skills gap of its own.
The AI-Specific Cybersecurity Skills Gap
AI platforms create new cybersecurity jobs of their own. Organizations need people to design, deploy, run, and continuously upgrade their AI platforms, and these skills are even rarer and more valuable than most of the existing cybersecurity skills organizations cannot find in the marketplace.
To begin with, there is a global shortage of general AI talent. Anyone hiring for AI roles on a cybersecurity platform has to compete with everyone else hiring AI specialists for any deployment at all, and much of the existing AI talent in the marketplace appears to be getting poached by autonomous vehicle companies and other “hotter” applications of cognitive computing. As Bloomberg notes, there may be only 200,000 to 300,000 AI researchers and practitioners in the world, with only roughly 22,000 PhD-level computer scientists qualified to work in AI. The result? As SecurityNow puts it, “large companies are shelling out $300,000 or more in salaries to engineers with modest AI experience and expertise.”
In sum, while AI will certainly fill certain roles within the overall cybersecurity skills gap, it won’t fill the most valuable roles that require real human experience and insight, and it will create a new skills gap for “cybersecurity AI experts.” For most organizations, the only way to close this gap is to find a security partner to fill it for them.