Recently in musings Category

I have a pet peeve. Ok, I have several, but nonetheless, we're going to talk about one of them today. That pet peeve is security professionals wasting time and energy pushing a "security culture" agenda. This practice of talking about "security culture" has arisen over the past few years. It's largely coming from security awareness circles, though that's not always the case (looking at you, anti-phishing vendors intent on selling products without the means and methodology to make them truly useful!).

I see three main problems with references to "security culture," not the least of which is that the practice perpetuates the bad old ways of days gone by.

I recently had the privilege of attending BJ Fogg's Behavior Design Boot Camp. For those unfamiliar with Fogg's work, he started out doing research on Persuasive Technology back in the 90s, work that has become the basis for most modern uses of technology to influence people (for example, the use of Facebook user data to influence the 2016 US Presidential Election). The focus of the boot camp was "behavior design," which was suggested to me by a friend who's a leading expert in modern, progressive security awareness program management.

Thinking about how best to apply this newfound knowledge, I've been mulling opportunities to apply Fogg's models and methods. Suddenly, it occurred to me: "Hey, you know what we really need? A new sub-field that combines all aspects of security behavior design, such as security awareness, anti-phishing, social engineering, and even UEBA." I concluded that maybe this sub-field would be called something like "behavioral security" and started doing searches on the topic.

Confessions of an InfoSec Burnout

Soul-crushing failure.

If asked, that is how I would describe the last 10 years of my career, since leaving AOL.

I made one mistake, one bad decision, and it's completely and thoroughly derailed my entire career. Worse, it's unclear if there's any path to recovery as failure piles on failure piles on failure.

Reflection on Working From Home

In a moment of introspection last night, it occurred to me that working from home tends to amplify any perceived slight or source of negativity. Most of my "human" interactions are online only, which, for this extrovert, means my energy is derived from whatever "interaction" I have online on Twitter, Facebook, email, Slack, etc.

Introspection on a Recent Downward Spiral

Alrighty... now that my RSA summary post is out of the way, let's get into a deeply personal post about how absolutely horrible of a week I had at RSA. Actually, that's not fair. The first half of the week was ok, but some truly horrible human beings targeted me (on social media) on Wednesday of that week, and it drove me straight down into a major depressive crash that left me reeling for days (well, frankly, through to today still).

I've written about my struggles with depression in the past, and so in the name of continued transparency and hope that my stories can help others, I wanted to relate this fairly recent tale.

If you can't understand or identify with this story, I'm sorry, but that's on you.

In the world of DevOps we often like to talk about rapid iteration in relation to shortened feedback cycles, and yet something frequently gets lost in translation. Specifically, just because failure is ok (because failure leads to learning), it does not mean that we shouldn't be thinking at all. And yet... it's all too common!

We all know there are problems with security. We all know that things aren't keeping pace, or improving measurably and meaningfully at a rate most of us would deem sufficient or acceptable. Yet all we seem to be doing is continuing to cast stones, castigate decision-makers, and pound the FUD drum. Why isn't anybody talking about addressing the core obstacles?

The Heart of DevOps Is Cooperation

I've been reading a lot lately about generative culture at the suggestion of my boss. Apparently this topic has been popping up and circulating frequently in DevOps circles in recent months, and seeing as I'm currently charged with doing "stuff" related to security and DevOps, it seemed like a good thing to research.

For those unfamiliar with generative culture, I recommend reading up on it. I found these pieces to be of particular value:


What's most interesting about generative culture is how well it maps onto the problems organizations face today with respect to security. That is, infosec spend is still viewed as overhead cost, infosec people are still viewed as obstacles (even when trying to play nicely with DevOps teams), and infosec tools continue to be undermined by the human element, which often sees security as an externality to its specific duties (even when it really oughtn't be).

Alone In This World

(pre-comment: if you've never personally dealt with depression, you may not understand)

One of the things I'm coming to find in this life is that, at the end of the day, we're all alone in this world. Even if we're surrounded by friends, they're external to our lives and will never be inside the darkest of places: our own heads.

We see this play out in many ways. Maybe it's the unintentional neglect of a friendship that highlights loneliness. Maybe it's the negativity of "friends" in reaction to new ideas. Maybe it's just that inner voice, reminding you of the darkness within. No matter how you cut it, we are all alone with ourselves.

Unless you've been offline in a remote land for the past month or so, you've undoubtedly heard that the 2016 VzB DBIR is out. As with every year, two things have happened: 1) the DBIR is now the basis of almost all infosec vendor marketing promos, and 2) data analysts are coming out of the woodwork to levy the same old criticisms and accusations that we hear every year.

At the end of the day, there are a few consistent takeaways. First, yes, the data is biased. All data is biased. That's life. Welcome to data analysis 101. There's no such thing as "pure objectivity," only "more or less subjective." Second, yes, the data is dirty. That's inevitable, especially at scale and coming from multiple sources. I think the bulk of the incident data is decent. Where things, as always, go off the rails is around the much-maligned vulnerability section (for example, read Dan Guido's criticism piece, which links to others as well). Third, for all the noise and drama and bickering and ad hominem attacks, my conclusions don't change. At. All.
