Security, Development, Incentives, and Acceptable Use

James McGovern recently published an excellent post on how agile development has grown to become the antithesis of security (see "Agile is the antithesis to security..."). He argues, quite correctly, that agile development is really just shorthand for "really crappy coding practices." As Telic Thoughts discusses, security has to be built in via quality software engineering principles - the very thing that is missing from agile dev practices. Unfortunately, these days there is very little "engineering" involved in software development. To me, this seems to be a side effect of the evolution of the Internet. Most "applications" are web-based, written in some sort of scripting language, rarely compiled, and almost never optimized.

These practices, or lack thereof, contribute to a state of insecurity that plagues the enterprise. None of this should surprise us, however, because there's little incentive for businesses to implement secure practices (see Bruce Schneier's post "Perverse Security Incentives"). Businesses are incentivized to do what they ideally do best: make money. Anything that gets in the way of that purpose - including security - is often seen as a detractor, something to be ignored. Oftentimes we in the security industry make it very easy for this attitude to pervade the enterprise, putting us at a disadvantage.

When you get down to it, the trick is to find a way to position security as an enabler - a way to optimize the business. As Schneier notes in his post, Legal departments have been successfully making this case for years: if you reduce liability exposure, you reduce the risk of a significant financial loss. For security - and, really, we should say IT security - we need to accomplish the same task within our own domain. Reducing risk resulting from poor IT security practices should be seen as reducing the likelihood of financial loss. In fact, this is exactly what regulations like the Payment Card Industry Data Security Standard (PCI DSS) hope to accomplish (leaving aside who is making the changes and whose risk is being reduced).

Toward this end, one area where you can make ready changes to reduce your risk profile is through a combination of well-written (read: accessible to the average user) security policies and an active training and awareness program. In terms of policies, one of the best places to start is with acceptable usage policies (for an example, see the Think Smarter blog post "Acceptable Usage Policies"). As discussed in the Think Smarter piece - and a favorite axiom of mine - if you don't set expectations for performance and behavior, then you shouldn't be surprised when people don't meet them. Put another way, if you don't spell out who can do what with data, then people will get creative and do what they think is best, whether or not it's appropriate, in the best interest of the company, or done in a proper (secure) manner.

Just having policies, however, is not adequate. You can write policies until the end of time, but they won't mean a thing if they aren't written in understandable language, distributed to all personnel, kept in a place known to and accessible to all personnel, and reinforced through training and awareness initiatives (incidentally, as required by the PCI DSS).

First, on understandable, or accessible, language: what I'm talking about here is writing policies, standards, guidelines, etc., in plain English (or whatever your locale's native language might be), using vocabulary common to everyone. While it's easy to write broad, general, abstract policies, be careful not to leave them in that form; support them with more detailed guidance, including examples, that gets the point across. For instance, "computing resources must only be used for business-related purposes" is a fine start, but back it up with examples of good and bad behavior, such as: "email is not to be used for personal business, nor should sensitive materials be sent via email without being encrypted." Linking to the IT department at this point, or to approved tools for encryption, would make the statement even better.
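To make that kind of guidance concrete, a policy page could link to a simple approved workflow for encrypting a file before it is attached to an email. As a minimal sketch only - the original post names no specific tool, so the use of Python's `cryptography` library, the file name, and the key handling here are all illustrative assumptions - such a workflow might look like:

```python
# Illustrative sketch: symmetric encryption of a file before emailing it.
# Assumes the third-party `cryptography` package (pip install cryptography).
# Real key distribution and storage need their own policy and tooling.
from cryptography.fernet import Fernet


def encrypt_file(path: str, key: bytes) -> bytes:
    """Return the contents of `path` encrypted as a Fernet token."""
    with open(path, "rb") as f:
        return Fernet(key).encrypt(f.read())


def decrypt_bytes(token: bytes, key: bytes) -> bytes:
    """Recover the original plaintext from a Fernet token."""
    return Fernet(key).decrypt(token)


if __name__ == "__main__":
    key = Fernet.generate_key()          # in practice, issued/managed by IT
    with open("report.txt", "wb") as f:  # hypothetical sensitive file
        f.write(b"quarterly numbers")
    token = encrypt_file("report.txt", key)
    # The token, not the plaintext file, is what gets attached to the email.
    assert decrypt_bytes(token, key) == b"quarterly numbers"
```

The point isn't the particular library; it's that linking a policy statement to a concrete, sanctioned procedure like this removes the guesswork for the average user.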

Speaking of linking out, how do you publish your policies? Are they all in print somewhere? Or maybe published as Word or PDF documents? That might be fine for protecting the content from change, but are those documents really accessible? What about publishing them on a wiki or SharePoint site? If everyone knows to visit a given site, then that's where your policies and related documents should live. Most wikis, in particular, provide access controls to limit who can edit policies, and the text is automatically searchable. Using a tool like a wiki is a great way to improve the accessibility of your policies while providing an innovative way to cross-reference applicable tools and information.

Finally, training and awareness is an effective way to reduce the risks associated with bad practices by well-intentioned personnel. Training allows you to put clear examples in front of people of what they should and should not be doing. Awareness around basic topics - proper handling of data, the types of attacks that may be used against them directly, and even the basics of risk assessment and risk management - can be useful. The key, however, is keeping each training session lively, dynamic, and filled with quick, memorable bits on important topics. The best examples will come from your own organization, presented in a non-indicting manner (names excluded) to demonstrate that the threats are real for the organization.

In the end, if you put this all together - improved software engineering practices, understandable and accessible policies (including acceptable use policies), and security training and awareness - you should see a measurable reduction in overall risk to the enterprise. Best of all, none of these changes should represent a major cost increase, and, done well, they could even help reduce costs by improving general practices and efficiencies.

(cross-posted to http://www.t2pa.com/cores/security-and-privacy/practical-security)

About this Entry

This page contains a single entry by Ben Tomhave published on March 2, 2009 3:11 PM.
