Wednesday, April 21, 2010

Wall of Shame: Security Education and Awareness

"It’s difficult to get a man to understand something when his salary depends on him not understanding it." - Al Gore (shamelessly ripped from 'Beautiful Security', Mark Curphey's chapter)

I had this discussion with two other security professionals today about security education and why it fails. This is another common failing within security. I'm going to break this into two categories:

  1. What security guys do wrong.
  2. What everyone else does wrong.
I'm basing this on my own experiences, including my mistakes. Too many people in this field are too busy talking about how bloody rockstar awesome they are to tell people about all the things they wish they'd done differently. Well, not here. I've made more than my fair share of mistakes. I like to think I made them so other people won't have to.

What security guys do wrong
For starters, we're not prioritising well.

We try to target user behaviour because that's where the bulk of root causes stem from, right? It makes sense on a superficial level. Users are tricked by social engineering attempts, they run executables they shouldn't, they hold doors to secure areas open for total strangers, and so on. You get the idea. But is this a failing of the user, or a failure of the system that allows, if not perpetuates, this behaviour?

Good question. We'll get back to that another day though.

But right now, let's say it's our fault. We blame the (l)users because it's easier than admitting the truth. Our policies are crap because they're unenforceable. We don't hold people to account. We don't embed our controls into processes or otherwise automate things to make security as close to automatic as possible.

But what if we were judicious in our education and, say, started educating developers on secure programming practices: focused our efforts on helping them understand vulnerabilities, how those manifest in crappy code, and what the alternative ways of cutting that code look like? Sure, you won't solve user behaviour, but I bet your code will improve significantly. What if you focused on getting your administrators to harden their build environments? It won't stop stupid user behaviour either, but it will help harden your environments.
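To make the developer side concrete, here's a minimal sketch (my own illustration with a hypothetical users table, not from any particular codebase) of how a classic SQL injection manifests in sloppy code, and how a parameterised query removes it:

    import sqlite3

    def get_user_insecure(conn, username):
        # Vulnerable: attacker-controlled input is concatenated straight into SQL.
        # username = "' OR '1'='1" turns the WHERE clause into a tautology
        # and returns every row instead of one user.
        query = "SELECT id, username FROM users WHERE username = '" + username + "'"
        return conn.execute(query).fetchall()

    def get_user_secure(conn, username):
        # Safer: a parameterised query treats the input strictly as data,
        # so the injection string above simply matches no username.
        query = "SELECT id, username FROM users WHERE username = ?"
        return conn.execute(query, (username,)).fetchall()

This is the kind of teaching that sticks: show the broken pattern next to the drop-in fix, rather than handing out another policy memo.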

We need to start picking the battles we can win, one day at a time.

How can you tell if you're on the winning side? Here's a simple test - if you're relying on education and policy alone to make your users secure, with no controls to detect violations or enforce compliance, then you're doing it wrong.

I've spent massive amounts of time running education campaigns on the dos and don'ts of computer security. While I was proud of that work at the time, I can honestly say now that those efforts were misguided. There were far better targets for that time and energy.

What everyone else does wrong

This comes down to one simple axiom - people screw up because they don't know any better. Or perhaps don't want to know any better.

Let's talk about programmers (hey, they're easy to bash on). They're often flogged to churn out functioning code in short sprints, regardless of other factors, so it's quite natural that security doesn't make the cut. Meeting deadlines, stability, functionality, interoperability - these are the true principles of today's programmer. Can we blame them? No, not at all. They have the same uncaring masters we do. Security isn't prioritised, and much like the user, they aren't given proper alternatives. So, to summarise: they aren't educated on security matters, they're rushed, and they're rarely offered reasonable alternatives.

Let's digress for a sec and take a quick peek at four comp sci undergraduate degrees. I say comp sci because, for this purpose, I want to compare apples with apples. And these were picked off the top of my head, by the way:

RMIT, University of Melbourne, Monash, La Trobe.

Not one of them requires a compulsory security unit in any of their bachelor degrees! (There are some standout elective offerings, but let's be clear: none of them make a security unit mandatory.)

It's no wonder developers are churning out insecure code. We shouldn't be blaming them; we should be blaming the institutions responsible for qualifying them. As a profession we need to be more involved in the education sector and press to have these fundamental skill shortages addressed.

Similar sob stories exist for the business stakeholders out there, the project managers and the like. They're paid to get a job done, on time and on budget (or under, if they can!). Quality is often a negligible factor in whether they're deemed successful; security is barely a blip on the radar for many of them. It's hard to get them to care when they're paid not to. If you engage early, obtain buy-in and organisational support, and embed your own processes into the project so they add minimal burden - helping the PM deliver on time or early, and thus be successful - you might be able to win the battle.

The same goes for operations managers. They're there to keep the ship afloat, and uptime is more important than security. Unless you have an operations manager with an ounce of security savvy, it's hard to justify applying a patch if it means downtime on core production hosts. I've heard absolute horror stories of infrastructure operations managers who refused to apply patches for fear of causing downtime - the same patches whose absence let Conficker onto their network. I think a lot of us have heard similar war stories.

But yes, Mark Curphey (and Upton Sinclair) hit the nail on the head here.

So where to from here?

Honestly, this is a failing of our profession.

We talk about raising security education and awareness, and I'm a big believer in it. However, our efforts are misguided. Nobody tells us how or where to spend the time and energy, so we learn by trial and error. It's time to stop wasting our effort and start chalking up wins on the board.

We do know better. We have no excuses.

- J.

3 comments:

bimbimboumboum said...

Education's a difficult topic. At first, the idea that we can educate each and every programmer and user in the organisation about security is attractive. The rationale being: if they know about the risk, they will be careful, and hence the overall security of our organisation will improve.

However, in reality people don't always act that way. Who hasn't driven over the speed limit when in a hurry to get somewhere?

Of course, security training and education will provide some positive results, though those results will probably be hard to quantify. Also, given our constantly changing environment of people (i.e. people leaving the company and new programmers being hired), security education is very hard to scale and keep up to date.

Is there a lack of security training in the current Australian education system? Definitely. After all, university is all about copying the code of "the one who knows" in time to hand in your assignment. And you wouldn't believe how bad at programming even the lecturers are.

For me, security training then comes down to the individual: for some people it will work; others won't bother with it. And that's the very limit of security education.

We are once again facing the concept of externalities: the impact of a vulnerability being exploited does not directly affect the people who developed the vulnerable software.

The Sinclair quote then suggests that an improvement could come from an economic change: develop a hiring system where programmers with security skills are paid more than programmers without them. Developing security skills could also become a step forward in one's career progression.

Additionally, could a developer be held legally responsible if the piece of software he develops gets hacked? From the perspective of the people who use that software, I don't see why not. After all, if you have your car's tyres changed and have an accident a week later because of those very tyres, you could probably take legal action against the garage that did the work. Why can't a similar concept be applied to software? Developers would have to prove they've taken all necessary steps to provide a high level of assurance that the software will be hard to hack.

Jarrod said...

Education isn't intended to solve the security problem entirely, but it does mean one less excuse - which is the point I'm trying to make.

Fixing borked processes and economic decisions that push unnecessary risks is the key.

Interesting point you make about liability for developers. From what I hear, some people are looking at exactly that after some very public failures of key projects that trace back to software problems. I wouldn't be surprised if you see it coming. SANS and OWASP already have guides for embedding security responsibilities into software development contracts.

I've mentioned before the idea of embedding security into business processes so that it becomes automatic. Ultimately, this is a large problem that requires a multi-pronged approach to resolve.

But back to my original point - yes, education is critical. I've seen it done well and I know it can work, but like I said - you have to pick your battles. And if you're spread so thin you can't tackle it on multiple fronts, then picking the right battle becomes all the more crucial.

- J.

bimbimboumboum said...

I agree that education is an important part of ITsec. It certainly doesn't solve all the problems but it may help us avoid repeating them.


>> "you have to pick your battles. And if you're spread so thin you can't tackle it on multiple fronts, then picking the right battle becomes all the more crucial."

How do you pick the right battle?

Wouldn't it be better to say that it is more about fighting the battles you're good at winning? Implying that security is like any other discipline: if you're a great wrestler, don't try to become world champion at kickboxing.

Your company may need some kickboxing, but then they should hire some additional people instead of making their existing employees fight battles they aren't good at.