My lonely walk down the path to heresy started in 2008

As an information security professional, I’ve read countless articles about the so-called myths of cybersecurity, with each one confidently claiming to debunk the latest misconceptions. These articles fire off lists of five, seven, or ten myths like a machine gun at a carnival booth. For years, I took these lists to heart, ready to whip them out like a magic wand during any debate.

Some classics include “strong passwords are always safe,” “you can buy good security,” “compliance means security,” and my personal favorite, “things are getting more secure.” Those were the good old days, the simple days. But, as with everything, change is inevitable—even for those of us in information security.

A few months back, I found myself listening to a colleague rant about a company he was working with. The more he raged, the more my mind wandered. And then, a mischievous thought popped into my head. In my best innocent tone, I interrupted his tirade with, “Wow, that company sounds like a mess. When was their last financially devastating breach?”

He paused, clearly waiting for the punchline that never came. A look of confusion crept onto his face as he scanned the room, probably expecting someone to jump out and yell, “You’ve been punked!” But no one did. I simply sat there, waiting for an answer I already knew.

Spoiler alert: the answer was “never.” The company had never experienced a significant breach. No financial calamity, no major incident—just smooth sailing for years. And that, my friends, is where my heresy begins.

If this company had saved millions by skimping on security and still hadn’t suffered a catastrophic breach, were they wrong? The evidence didn’t exactly support my colleague’s dramatic claims. So, either management was playing 4D chess with cybersecurity or they were just cheap and lucky. Either way, they hadn’t taken a hit, and that’s tough to argue with.

Now, I know what some of you are thinking: “Just because they didn’t detect a breach doesn’t mean one didn’t happen.” Sure, that’s possible. But is it probable? In the world of cybersecurity, we often confuse “possible” with “probable.” We talk about potential threats and vulnerabilities with passion, but how often do we stop to think about their actual likelihood?

For years, fear, uncertainty, and doubt (FUD) have been the tools of the trade in our field. Call it risk management, call it qualitative risk assessment, or as one executive I knew once called it, “consulting the gypsy in the parking lot.” The fact remains: predicting the probability of an actual breach is harder than proving global warming to a climate change denier.

But let’s get back to reality for a moment. Sure, it’s possible that bad actors slipped into that company’s systems and stole sensitive information without anyone noticing. But it’s not very probable. If there was a significant breach that cost the company real money, wouldn’t someone—an accountant, a shareholder, anyone—have noticed by now?

Despite our best efforts, most of the threats we obsess over exist in a world where the probability of them ever materializing approaches zero. Yet we preach about them as if they’re lurking around every corner, waiting to strike. How often do we hear one of us say, “It’s possible, but not very probable”? Not nearly enough.

I’m not saying we should throw caution to the wind and ignore threats altogether. Some industries—like government and finance—absolutely need robust security measures. But for many organizations, the truth is a little less dramatic. Most breaches don’t lead to the catastrophic outcomes we fear, and sometimes, doing nothing is the most cost-effective solution.

Take a friend of mine, for example. He once detected a large-scale malware infection across hundreds of machines. He sounded the alarm and spent days trying to convince the desktop support team to take action. They didn’t. Two weeks later, the anti-virus vendors rolled out a signature that cleaned up the infection, and life went back to normal. My friend lost sleep, but in the end, his effort didn’t make a difference. The company suffered no significant financial loss, and everything carried on as usual.

The lesson here? Sometimes, fortune really does favor the foolish. The days of waiting for the next Slammer or Blaster worm to crash the network are over. The bad guys have learned that being noisy only gets their malware crushed faster. Nowadays, their goal is to stay under the radar and keep making money.

Now, before you grab your pitchforks, let me clarify: I’m not advocating that we abandon information security altogether. There are certainly cases where strong controls are necessary, especially when lives or large sums of money are at stake. But in most cases, we need to be realistic. The threats we face are often improbable, and the cost of implementing security measures should reflect that reality.

It’s easy to scare people into spending money on cybersecurity by emphasizing worst-case scenarios. But over time, that approach wears thin. Decision-makers grow desensitized, and soon, they stop listening. Instead, we need to focus on probability, not just possibility. And before recommending complex controls, we should make sure the cost of the threat justifies the expense.
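To put some numbers on that, the old-school annualized loss expectancy (ALE) calculation is one way to frame it: estimate what a single incident would cost, multiply by how often you honestly expect it to happen in a year, and compare the result with what the control costs to run. Here’s a minimal sketch in Python with purely made-up figures; the point is the comparison, not the inputs.

```python
# Minimal sketch of an annualized loss expectancy (ALE) comparison.
# Every number below is a made-up placeholder, not a real estimate.

def annualized_loss_expectancy(single_loss_expectancy: float,
                               annual_rate_of_occurrence: float) -> float:
    """ALE = SLE x ARO: the expected yearly loss from a given threat."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical threat: a breach costing $2,000,000, honestly expected
# about once every 20 years.
ale = annualized_loss_expectancy(2_000_000, 1 / 20)  # $100,000 per year

# Hypothetical control: $250,000 per year to buy and operate.
annual_control_cost = 250_000

if annual_control_cost > ale:
    print(f"The control costs ${annual_control_cost:,.0f}/yr to avert an "
          f"expected ${ale:,.0f}/yr loss, which is hard to justify on the numbers alone.")
else:
    print("The control pays for itself on expected loss alone.")
```

If the control costs more per year than the loss it’s supposed to prevent, “doing nothing” stops being heresy and starts being arithmetic.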

As information security professionals, we need to remember that business is about making money, not securing systems for the sake of security itself. Sure, security is important, but it’s not the end-all, be-all. At the end of the day, doing nothing might just save your company more money than anything else you could recommend.

So, next time you’re in a meeting, temper your assessments. Remember that most of the threats we talk about are improbable, and even if they do occur, the likelihood of a significant financial impact is often small.
