We can't build the cybersecurity workforce on passion alone

Envisioning the transition of cybersecurity from a passion- and skill-driven activity to a conventional business profession.

I recently attended NSec, a major offensive security event in Montreal. I enjoyed myself. The event had free beer, a good DJ, and a bunch of security nerds willing to share their latest tinkering feats. This is hacking at its core: folks who love to figure out how stuff works and use it in unintended ways. It's the type of conference that ignites passion. It's also bleeding money.

It was my first time attending, and I went with a networking mindset. There were about 10 booths, many of them run by government agencies or government-backed startups. In other words: institutions that are not profit-motivated. None of the well-known security companies attended. The message struck me: there's no business in passion.

I can't think of a better illustration of how I've come to see the cybersecurity profession: passion is fun, but it isn't how we'll protect our infrastructure from devious actors at scale.

Longtime readers know my story: I got into security thanks to the Mr. Robot TV show: hacking for revolution at its finest! Nowadays, I write compliance reports and risk assessments for executives. In the future, I see this transition as the norm.

This is where I get provocative: I believe the future of security lies more in an accounting paradigm than in hacking. Slow and boring.


Pentesting is overrated

A pentest job posting will garner more than twice as many applications as an incident responder role. And half of the candidates for the incident response role want to pivot into pentesting.

My TikTok videos about tractor hacking and deepfake sextortion were among my biggest hits. Hacking is cool! These feats rely on rare skill sets that look like magic to most people. Over the years, hacking has permeated pop culture and grown its own aesthetic. Hoodies, anyone?

However, hacking suffers from a major flaw: it finds issues way too late in a product's lifecycle.

To me, pentesting should be a branch of compliance.

Let me explain: software building should follow guidelines, which produce auditable artifacts. These artifacts then get correlated. Compliance systems trigger alerts when guidelines are overstepped, leading to scrutiny. Pentests become the ultimate weapon against application builders who try to game the compliance checks.
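
To make that pipeline concrete, here is a minimal sketch in Python. The artifact fields, guideline thresholds, and service names are entirely made up for illustration; a real system would pull this data from CI/CD pipelines, SBOMs, or deployment manifests and map the rules to whatever framework the company adopts.

```python
from dataclasses import dataclass

# Hypothetical build artifact. In practice, fields like these could come from
# CI metadata, an SBOM, or a deployment manifest.
@dataclass
class BuildArtifact:
    service: str
    dependencies_scanned: bool
    secrets_in_repo: int
    days_since_last_pentest: int

def check_compliance(artifact: BuildArtifact) -> list:
    """Return an alert for every guideline this artifact oversteps."""
    alerts = []
    if not artifact.dependencies_scanned:
        alerts.append(f"{artifact.service}: dependencies were never scanned")
    if artifact.secrets_in_repo > 0:
        alerts.append(f"{artifact.service}: {artifact.secrets_in_repo} secret(s) committed to the repo")
    if artifact.days_since_last_pentest > 365:
        alerts.append(f"{artifact.service}: more than a year since the last pentest")
    return alerts

if __name__ == "__main__":
    artifacts = [
        BuildArtifact("payments-api", True, 0, 90),
        BuildArtifact("legacy-portal", False, 2, 800),
    ]
    for artifact in artifacts:
        for alert in check_compliance(artifact):
            print("ALERT:", alert)  # escalate to human scrutiny, or to a pentest
```

The point of the sketch: an alert, not a hacker's intuition, decides what deserves scrutiny.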

All companies, even small ones, must master their balance sheets and pay taxes. Accounting is a necessary hurdle. Wise accountants become business advisors. No decision gets made without making sure the rules are followed. Pop culture also portrays these tasks as boring white-collar jobs.

I understand that accounting being tied to money gives it more gravitas than cybersecurity will ever have. However, when I look at how accounting firms are gobbling up cybersecurity firms, I'm rooting for them.

Cybersecurity should not rely solely on the skills of a happy few. It should base itself on rules and, yes, red tape. I said it.


What does boring security mean in practice?

The cybersecurity of the future will look more like underwriting and accounting. Cyber insurance systems will connect to companies' infrastructures and assess risk against actuarial tables, fed by the automated compliance data collection I described above.

These systems will evaluate a given product's risks and, based on the company's tolerance level, decide whether additional scrutiny or resources must be allocated.

I also envision a security rating output, not dissimilar to the Common Criteria, but dynamic.
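
As a rough illustration, here is a toy sketch of how such a system could turn compliance data into a dynamic rating and a scrutiny decision. The weights, rating bands, and tolerance value are invented placeholders, not real actuarial figures.

```python
def risk_score(compliance_findings: int, unpatched_criticals: int, exposed_services: int) -> float:
    # Weighted sum standing in for whatever a real actuarial table would encode.
    return 2.0 * compliance_findings + 5.0 * unpatched_criticals + 3.0 * exposed_services

def rating(score: float) -> str:
    # Dynamic rating bands, recomputed every time new compliance data arrives.
    if score < 10:
        return "A"
    if score < 25:
        return "B"
    return "C"

def needs_extra_scrutiny(score: float, tolerance: float) -> bool:
    # The company's risk tolerance decides whether more resources get allocated.
    return score > tolerance

score = risk_score(compliance_findings=3, unpatched_criticals=2, exposed_services=1)
print(f"rating: {rating(score)}, extra scrutiny: {needs_extra_scrutiny(score, tolerance=15.0)}")
```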

Humans will determine business outcomes, report findings, and fine-tune the systems.


How will we hook people on cybersecurity, then?

Keeping with the accounting analogy, I believe young people should seek out this field because it offers good, steady office jobs. It's awfully boring, but it works.

Computer scientists and software engineers would remain crucial to build the artifact-gathering systems, but the analysis could be carried out by people who are not experts in the underlying computing systems.

Perhaps this is it: we wouldn't need to hook them, because entry would no longer require extreme skills.

Most security specialists, myself included, spend their weekends and evenings thinking about security matters. It's such a staple of the job that I warn my students: this discipline will trample them if they expect a simple 9-to-5 and want to think about anything else the rest of the time. That doesn't scale. We need to lower the barrier.


What about advances in technology?

Rapidly evolving threats and technologies do weaken my comparison with accounting. Yes, tax codes change annually. But they rarely force paradigm shifts like mobile devices, cloud computing, or large language models.

Advocating for a more standardized, rules-driven security means the rules must adapt. The aggregated compliance systems, the automated dynamic security ratings, and the adaptive actuarial tables I'm dreaming of cannot be built out of thin air. Today, it's impossible to quantify the risks of LLM incidents. Social engineering, as a branch of cyber fraud, does not rely on flawed systems: it's a human problem. Maybe it shouldn't be cybersecurity's worry anymore?

Technology moves fast, and cyber threat actors will always adapt more quickly than complex enterprise systems.

Does this mean this vision of boring security will collapse? Or is sacrificing passion for white-collarism still worthwhile despite these limitations? Time will tell.

What do you think? Tell us in the comments, or reply to this email to debate with me!