
Reasons for the Cybersecurity Talent Gap

Cybersecurity remains one of the hardest technology specialties in which to fill open roles.

According to CyberSeek, a project funded by the National Institute of Standards and Technology (NIST), around a million people are employed in cybersecurity roles in the US.

CyberSeek also estimates there are close to 600,000 US vacancies in the field, and it projects that those vacancies will grow sharply through 2025.

Globally, (ISC)², a nonprofit dedicated to the training and advancement of cybersecurity professionals, estimated there were 2.7 million open job requisitions at the end of 2021, 1.4 million of which were in Asia-Pacific.

The odd thing about HR for IT overall has always been its obsessive and unthinking focus on finding and hiring people to patch holes. Rather than buying the enterprise a new pair of ripstop pants, HR hires more and more people to mend the ever-increasing number of holes in the existing pair.

Instead of easing this problem, cloud computing has exacerbated it. Cloud is about handing compute, storage, and software stack responsibility off to anything-as-a-service (XaaS) providers. During the 2010s, enterprise employees found it easy to subscribe to software-as-a-service (SaaS) offerings, and subscribing to a SaaS product became a way to bypass the IT shop.

By now, SaaS as the default has become a fait accompli. SaaS management software provider BetterCloud estimated that SaaS’s share of total company software in use globally reached 70 percent by 2021. By 2025, 85 percent of the software organizations use could be in SaaS subscriptions. Statista estimated that the number of SaaS providers worldwide reached 25,000 in 2021.

What do these statistics, taken together, tell us? It isn’t that cybersecurity pros are doing a poor job. Instead, what’s happened is that IT system engineering as a whole has failed. Since the dawn of cloud computing, we have built a panoply of shared services. But those services aren’t designed to deliver and harness the power of scaled-out, meaningful, connected data across departments and enterprises. Instead, they’re just giving us more silos.

That’s old, application-centric architecture peddled as new. Data-centric architecture, the kind actually needed for the scalable, connected, contextualized data sharing that’s essential to better AI, is a lot harder. Good engineering is tough because you’re not just solving one small problem at a time.

As Master Algorithm author and computer science professor Pedro Domingos points out, “Good engineering is not about piling on the hacks. It’s about simplifying, simplifying, and then simplifying some more.” But instead of simplifying, most computing “advances,” at least in terms of the bigger AI picture, have made matters more complex.

Because we’ve focused merely on the smaller problems, the bigger, nastier problems of systems-level engineering, data disambiguation, and logical abstraction have spiraled out of control. Throwing more CS grads into cybersecurity roles won’t solve these bigger problems. Instead, we need all the systems, knowledge, and data engineering we can get so that we can reduce the size and complexity of the problem space that cybersecurity staffers must confront.


Simplified knowledge sharing: Rationalization and symbiotic design and development

Humans are complicated, and the knowledge they’re trying to share is complicated too, says Juan Sequeda, principal scientist at data.world. But that doesn’t mean the means of sharing that knowledge, once we’ve created it, can’t be as simple as possible. Knowledge sharing has to be simple to scale.

According to the Software Peter principle, the first warning sign that a software project has gone off the rails is that it has become too complicated for anyone to understand, including the people working on it. The result is that 85 percent of the team’s time is spent communicating with people rather than communicating with computers.

Contrast the Software Peter principle with Mythical Man-Month author Fred Brooks’ concept of conceptual integrity. A software design has conceptual integrity when the programming idioms themselves are rationalized. That means idioms are semantically connectable and consistent across code (think contextualized, discoverable microservices), so that the same idioms can work across projects. 

Another way to think about conceptual integrity is that it’s the symbiotic blending of design and development. Rationalization through design implies the ability to scale and reuse the means of development, which leads to a high software functionality/man-month ratio.

Knowledge sharing as blended logic and data sharing

Forty years later, design is most effectively blended with development inside semantic knowledge graphs. Declarations, rules, and constraints that live in the graph constitute most of the machine-readable logic needed for many different kinds of functionality. Only a small fraction of the logic must be written from scratch as executable code; the rest is called from the graph.
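To make that concrete, here’s a minimal sketch in Python using the open source rdflib and pySHACL libraries. The ex: vocabulary, the Employee class, and the clearance-level rule are hypothetical illustrations, not anything from a specific production graph; the point is that the constraint lives in the graph as data (a SHACL shape), while the only hand-written executable logic is a generic validation call.

```python
# Minimal sketch: the rule lives in the graph as a SHACL shape, not in
# application code. Requires: pip install rdflib pyshacl
# The ex: vocabulary, Employee class, and clearance rule are hypothetical.
from rdflib import Graph
from pyshacl import validate

shapes_ttl = """
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.org/> .

ex:EmployeeShape a sh:NodeShape ;
    sh:targetClass ex:Employee ;
    sh:property [
        sh:path ex:clearanceLevel ;   # every employee needs exactly one
        sh:datatype xsd:integer ;     # integer clearance level
        sh:minCount 1 ;
        sh:maxCount 1 ;
    ] .
"""

data_ttl = """
@prefix ex: <http://example.org/> .

ex:alice a ex:Employee ;
    ex:clearanceLevel 3 .

ex:bob a ex:Employee .    # no clearanceLevel, so validation should fail
"""

shapes = Graph().parse(data=shapes_ttl, format="turtle")
data = Graph().parse(data=data_ttl, format="turtle")

# The only hand-written executable logic is this generic call; the actual
# constraint is data in the shapes graph and can evolve without code changes.
conforms, _, report_text = validate(data, shacl_graph=shapes)
print(conforms)       # False: ex:bob violates the shape
print(report_text)
```

Change the shape, and the behavior changes; no application code has to be rewritten or redeployed.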

The data model, alongside the instance data that lives with it, evolves and grows organically with the help of reasoning and the melding of complementary graphs. These data and logic “organisms” are designed to link together easily and complement one another. The functionality/man-month ratio for well-designed knowledge graphs can be quite high. Blended design and development enable the evolution and scaling necessary for a living knowledge foundation for generalizable AI.
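As a rough illustration of that melding, the sketch below merges two hypothetical graphs, an HR graph and an IT policy graph that share the same example ex: vocabulary, and applies an off-the-shelf RDFS reasoner (the owlrl library) so that a query can surface a connection neither graph states on its own.

```python
# Minimal sketch of "melding complementary graphs": two independently
# maintained graphs share a hypothetical ex: vocabulary, so merging them
# and applying RDFS reasoning yields facts neither graph holds alone.
# Requires: pip install rdflib owlrl
from rdflib import Graph
from owlrl import DeductiveClosure, RDFS_Semantics

hr_ttl = """
@prefix ex:   <http://example.org/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

ex:SecurityAnalyst rdfs:subClassOf ex:Employee .
ex:carol a ex:SecurityAnalyst .
"""

it_ttl = """
@prefix ex: <http://example.org/> .

ex:Employee ex:mustComplete ex:PhishingTraining .
"""

g = Graph()
g.parse(data=hr_ttl, format="turtle")   # merge both sources into one graph
g.parse(data=it_ttl, format="turtle")

# Materialize RDFS entailments: ex:carol is inferred to be an ex:Employee.
DeductiveClosure(RDFS_Semantics).expand(g)

# This query spans both source graphs plus the inferred triple.
query = """
PREFIX ex: <http://example.org/>
SELECT ?person ?training WHERE {
    ?person a ex:Employee .
    ex:Employee ex:mustComplete ?training .
}
"""
for row in g.query(query):
    print(row.person, row.training)   # ex:carol ex:PhishingTraining
```

Neither source graph knows on its own that carol must complete phishing training; the merged, reasoned graph does. That is the organic growth the knowledge graph approach promises.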

But the longer we stay stuck in an application-centric world overwhelmed by complexity, the longer we consign cybersecurity professionals to treading water, when they could be on dry land, doing more than just surviving and perpetuating existing legacy systems.