A Stroll Down Amnesia Lane

I was cleaning out some old boxes of "stuff" from days gone by and ran into a hard copy of a presentation that I delivered as part of the interview process at CERT/SEI in Pittsburgh back in 1998. At the time, I had been very hopeful to get a job at CERT as they were doing security work that I simply wasn't seeing in the private sector (at least, not in the Midwest). Alas, it didn't work out, but I digress...

What jumped out at me about this presentation is that, in 12+ years, nothing has changed! The same arguments I made back then about needing to be proactive with security, working to integrate it into all aspects of the business in order to make it implicit and inherent are still true today. Perhaps the most interesting bullet in those slides for me was one where I asked "why aren't we teaching calculus and computer science in elementary schools?" I don't think my audience grokked the question back then, and I'd be surprised if people would even get it today.

The purpose of that question was not necessarily to imply that 1st graders should be learning calculus, but rather that our mode of thinking has been - and continues to be - completely backwards. We're doing everything we can to abstract learners - people - further and further from the results of their actions. That is, rather than teaching kids how computers operate and the fundamentals behind the principles they see in action, we wave our hands about, mumble something about the "miracle" of technology, hand out graphing calculators, and gloss over the hard details.

From a political perspective, we see this mindset playing out in programs like No Child Left Behind (or, more recently, the revised AP exams), which put an inordinately heavy emphasis on high-stakes testing and, by extension, rote learning. If a topic can't be reduced to a quick-n-easy tool for performing a calculation, then it's unlikely to make it onto a standard exam, let alone be taught in the classroom. 15+ years ago, much of math education was about demonstrating proofs. Sure, we were perhaps hindered to a degree by overly rigid thinking, in that you had to learn the "right" proof (even when multiple proofs were acceptable), but we were nonetheless learning how to reason through a process, rather than just memorizing a procedure.

The same goes for computers and information security today. There is an ever-increasing gap (something Michael "Security Catalyst" Santarcangelo refers to as the "Human Paradox Gap") in the mind of the average person between their actions and the resulting consequences (whether negative, positive, or neutral). If we trace this gap back, we can quickly see the connection to the abstraction that occurs during the more formative educational periods. This is not just a comment on inquisitiveness and kids today. On the contrary, it speaks directly to the mentalities that adults hold, and how those mentalities get projected onto children. In our rush to make our own lives "easier" through technology that hides the underlying mechanics, we in turn promote an environment that is merely an abstraction of the underlying realities. To say that this is a wee bit disconcerting is an understatement.

Case in point: look at security policies today. Are these policies constructed based on a thorough understanding of the assets being protected, the threats to those assets, and the organization's resistance strength? Absolutely not! Policies today are a hodgepodge of compliance requirements and "best practice" statements, none of which bear much relevance to daily business operations. It's no wonder policies are oftentimes hated, ignored, or subverted by the business! Policies have rarely (if ever) been created in a manner that is "real" to the actual business stakeholders.

As such, we have an abstraction problem where the average user has no real vested interest in even understanding policies, let alone complying with them. Historically, the only real approach we've used to address this issue is the stick: beat people when they aren't toeing the line. Where's the incentive? As a former VP of mine used to say, "You're responsible first and foremost to your direct manager. If he/she isn't happy with your performance, then your performance review will be negative, impacting your future with the company." People are generally working to make their bosses happy, and security isn't often a factor in that relationship. Until we make infosec an inherent component of all duties, we'll simply not see a change.

Of course, one could then counter with a question about whether security should even be part of everyone's responsibilities. My answer is "yes, of course!" - but not because I'm some zealot. Instead, I point at historical performance in this regard. When security is the owned responsibility of another group, we immediately see the "somebody else's problem" (SEP) effect kick in. If you're responsible for meeting your boss's requirements, and those requirements do not include security, then you're not going to see any real interest in improved security.

How do we get past this point? Simply put, we need to get away from the rote-learning approach that security has become. Put aside the mindless "best practice" statements in policies and instead work toward a new goal: security performance metrics integrated into all job descriptions. Develop metrics around infosec or risk that are tied directly to performance reviews and, ultimately, compensation. When poor security practices start costing people money, maybe then that gap between actions and consequences will begin to narrow, if only because people will be highly motivated to better understand the underlying mechanics.
