Design For Behavior, Not Awareness

October was National Cybersecurity Awareness Month. Since today is the last day, I figured now is as good a time as any to take a contrarian perspective on what undoubtedly many organizations just did over the past few weeks; namely, wasted a lot of time, money, and good will.

Most security awareness programs and practices are horrible BS. This extends out to include many practices heavily promoted by the likes of SANS, as well as the current state of "best" (aka, failing miserably) practices. We shouldn't, however, be surprised that it's all a bunch of nonsense. After all, awareness budgets are tiny, the people running these programs tend to be poorly trained and uneducated, and in general there's a ton of misunderstanding about the point of these programs (besides checking boxes).

To me, there are three kinds of security awareness and education objectives:
1) Communicating new practices
2) Addressing bad practices
3) Modifying behavior

The first two areas have little to do with behavior change; they're fundamentally about communication. Behavior design only comes into play when the secure choice isn't the easy choice, and thus you have to build a different engagement model. Only the third objective is primarily focused on true behavior change.

Awareness as Communication

The vast majority of so-called "security awareness" practices are merely focused on communication. They tell people "do this" or "do that" or, when done particularly poorly, "you're doing X wrong, idiots!" The problem is that, while communication is important and necessary, rarely are these projects approached from a behavior design perspective, which means nobody is thinking about effectiveness, let alone how to measure it.

Take, for example, communicating updated policies. Maybe your organization has decided to revise its password policy yet again (woe be to you!). You can undertake a communication campaign to let people know that the new policy goes into effect on a given date, and maybe even explain why the policy is changing. But that's about it. You're telling people something theoretically relevant to their jobs, but not much more. This task could be done just as easily by your HR or internal communications team as anyone else. What value is being added?

Moreover, the best part of this is that you're not trying to change a behavior, because your "awareness" practice doesn't have any bearing on it; technical controls do! The password policy is implemented in IAM configurations and enforced through technical controls. There's no need for cognition by personnel beyond "oh, yeah, I now have to construct my password according to new rules." It's not like you're generally giving people the chance to opt out of the new policy, and there's no real decision for them to make. As such, the entire point of your "awareness" is communicating information, but without any requirement for people to make better choices.
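To make that concrete, here's a minimal, hypothetical sketch of a password policy enforced purely in code (the rules and thresholds are invented for illustration, not a recommendation). The user makes no security decision; the control either accepts the password or rejects it:

```python
import re

# Hypothetical policy rules -- illustrative values only, not a recommendation.
POLICY = {
    "min_length": 12,
    "require_classes": 3,  # of: lowercase, uppercase, digit, symbol
}

def meets_policy(password: str) -> bool:
    """Return True if the password satisfies the (hypothetical) policy.

    The point: enforcement lives here, in the technical control, with no
    choice left to the user beyond complying.
    """
    if len(password) < POLICY["min_length"]:
        return False
    classes = [
        bool(re.search(r"[a-z]", password)),
        bool(re.search(r"[A-Z]", password)),
        bool(re.search(r"[0-9]", password)),
        bool(re.search(r"[^a-zA-Z0-9]", password)),
    ]
    return sum(classes) >= POLICY["require_classes"]
```

No amount of "awareness" changes this outcome; the only communication needed is telling people the rules changed.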

Awareness as Behavior Design

The real role of a security awareness and education program should be on designing for behavior change, then measuring the effectiveness of those behavior change initiatives. The most rudimentary example of this is the anti-phishing program. Unfortunately, anti-phishing programs also tend to be horrible examples because they're implemented completely wrong (e.g., failure to benchmark, failure to actually design for behavior change, failure to get desired positive results). Yes, behavior change is what we want, but we need to be judicious about what behaviors we're targeting and how we're to get there.

I've had a strong interest in security awareness throughout my career, including having built and delivered awareness training and education programs in numerous prior roles. However, it's only been the last few years that I've started to find, understand, and appreciate the underlying science and psychology that needs to be brought to bear on the topic. Most recently, I completed BJ Fogg's Boot Camp on behavior design, and that's the lens through which I now view most of these flaccid, ineffective, and frankly incompetent "awareness" programs. It's also what's led me to redefine "security awareness" as "behavioral infosec" in order to highlight the importance of applying better thinking and practices to the space.

Leveraging Fogg's models and methods, we learn that Behavior happens when three things come together: Motivation, Ability, and a Trigger (aka a prompt or cue). When designing for behavior change, we must then look at these three attributes together and figure out how to specifically address Motivation and Ability when applying/instigating a trigger. For example, if we need people to start following a better, preferred process that will help reduce risk to the organization, we must find a way to make it easy to do (Ability) or find ways to make them want to follow the new process (Motivation). Thus, when we tell them "follow this new process" (aka Trigger), they'll make the desired choice.
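As a caricature of Fogg's model (the real model is qualitative; the numeric scales and the "action line" threshold below are invented purely for illustration), the interaction of the three attributes might be sketched like this:

```python
def behavior_occurs(motivation: float, ability: float, prompted: bool,
                    action_line: float = 0.25) -> bool:
    """Toy sketch of Fogg's B = MAP: behavior happens when a Trigger
    (prompt) arrives while Motivation x Ability sits above the
    'action line'. Inputs are on an invented 0..1 scale.
    """
    if not prompted:
        return False  # no trigger, no behavior, regardless of the rest
    return motivation * ability >= action_line
```

Note what falls out of even this crude sketch: a hard task (low Ability) fails even when Motivation is high, which is exactly why making the secure choice the easy choice matters so much.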

In this regard, technical and administrative controls should be buttressed by behavior design whenever a choice must be made. Sadly, this isn't generally how security awareness programs view the space; they focus on communication (a type of Trigger) without much regard for also addressing Motivation or Ability. In fact, many security programs experience frustration and failure because what they're asking people to do is hard, which means the average person is not able to do what's asked. Put a different way, the secure choice must be the easy choice, or it's unlikely to be followed. Similarly, research has shown time and again that telling people why a new practice is desirable greatly increases their willingness to change (aka Motivation). Seat belt awareness programs are a great example of bringing together Motivation (particularly focused on negative outcomes of non-compliance, such as death or serious injury, as well as fines and penalties), Ability (it's easy to do), and Triggers to achieve a desired behavioral outcome.

Overall, it's imperative that we start applying behavior design thinking and principles to our security programs. Every time you ask someone to do something different, you must think about it in terms of Motivation and Ability and Trigger, and then evaluate and measure effectiveness. If something isn't working, rather than devolving to a blame game, instead look at these three attributes and determine if perhaps a different approach is needed. And, btw, this may not necessarily mean making your secure choice easier so much as making the insecure choice more difficult (for example, someone recently noted on Twitter that they simply added a wait() to their code to force deprecation over time).
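As a hedged sketch of that friction tactic (the function names, the delay, and the upload scenario below are all hypothetical), making a deprecated, less-secure path slower and noisier might look like:

```python
import time
import warnings

def legacy_upload(data: bytes) -> str:
    """Deprecated path: still works, but a warning plus an artificial
    delay make it progressively less attractive, nudging callers
    toward the supported path."""
    warnings.warn("legacy_upload is deprecated; use secure_upload",
                  DeprecationWarning, stacklevel=2)
    time.sleep(0.5)  # friction: the insecure choice is now the slow choice
    return _do_upload(data)

def secure_upload(data: bytes) -> str:
    """Preferred path: no added friction, so it's also the easy choice."""
    return _do_upload(data)

def _do_upload(data: bytes) -> str:
    return f"uploaded {len(data)} bytes"  # stand-in for real transport
```

In Fogg's terms, this lowers the Ability side of the undesired behavior rather than raising the Ability of the desired one; both moves shift which choice is easiest.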

Change Behavior, Change Org Culture

Another interesting aspect of this discussion on behavior design is this: organizational culture is the aggregate of behaviors and values. That is to say, when we can change behaviors, we are in fact changing org culture, too. The reverse, then, is also true. If we find bad aspects of org culture leading to insecure practices, we can trace those back to the respective behaviors, and then start designing for behavior change. In some cases, we may need to break the behaviors into chains of behaviors and tackle things more slowly over time, but looking at the world through this lens can be quite enlightening. Similarly, looking at the values ensconced within org culture also lets us better understand motivations. People generally want to perform their duties, and do a reasonably decent job at it. This is generally how performance is measured, and those duties and performance measures are typically aligned against outcomes and - ultimately - values.

One excellent lesson that DevOps has taught us (there are many) is that we absolutely can change how the org functions... BUT... it does require a shift in org culture, which means changing values and behaviors. These sorts of shifts can be done either top-down or bottom-up, but the reality is that top-down is much easier in many regards, whereas bottom-up requires that greater consensus and momentum be built to achieve a breakthrough.

DevOps itself is cultural in nature and focuses heavily on changing behaviors, ranging from how dev and ops function, to how we communicate and interact, and so on. Shortened feedback loops and creating space for experimentation are both behavioral, which is why so many orgs struggle with how to make them a reality (that is, it's not simply a matter of better tools). Security absolutely should be taking notes and applying lessons learned from the DevOps movement, including investing in understanding behavior design.

---
To wrap this up, here are three quick take-aways:

1) Reinvent "security awareness" to be "behavioral infosec" toward shifting to a behavior design approach. Behavior design looks at Motivation, Ability, and Triggers in effecting change.

2) Understand the difference between controls (technical and administrative) and behaviors. Resorting to basic communication may be adequate if you're implementing controls that take away choices. However, if a new control requires that the "right" choice be made, you must then apply behavior design to the project, or risk failure.

3) Go cross-functional and start learning lessons from other practice areas like DevOps and even HR. Understand that everything you're promoting must eventually tie back into org culture, whether it be through changes in behavior or values. Make sure you clearly understand what you're trying to accomplish, and then make a very deliberate plan for implementing changes while addressing all appropriate objectives.

Going forward, let's try to make "cybersecurity awareness month" about something more than tired lines and vapid pejoratives. It's time to reinvent this space as "behavioral infosec" toward achieving better, measurable outcomes.

About this Entry

This page contains a single entry by Ben Tomhave published on October 31, 2017 10:24 AM.