Sunday, December 11, 2011

Grey Hat Python Review

This review comes almost a year late; a mate of mine extracted a blood oath from me to write a review of this book. My apologies for taking so long.

- J.

"Grey Hat Python", written by Justin Seitz, a Senior Security Researcher at Immunity Inc., is a book which takes you through the process of reverse engineering using the Python language. Seitz covers automating simple tasks, writing fuzzers and debuggers, creating your own hooks (soft and hard), building sniffers and more. Does this book live up to its name? Overall yes, although not without some disappointments, and not all of them are the author's fault.

Firstly, the book suffers from a timing issue. It was released in 2009, around the same time as Windows 7 and new versions of Python (2.6 and 3.0 were released relatively close together). The first issue is substantial as the author assumes that the reader is working in a purely 32 bit environment. Now I know that a lot of developers do, but I think this is a dangerous assumption given that 64 bit operating systems were not new and, beginning with Windows 7 (which saw much wider adoption than Vista), were becoming a lot more prevalent. The second is arguably forgivable, given readers could use Python 2.6 without too much issue (although older Python functions might cause grief).

With that said, the book wastes no time deep diving into the process: the author explains how to set up your development environment, gives some Pythonic basics (e.g. explaining data types as they exist in Python when compared to C) and before long has you writing basic Python loops and delving into Assembly language and debugging basics. As you can imagine, this is not a book for programming noobs and you should have at least one language (the author assumes you have C at a minimum) under your belt before you try this book, preferably with some Python know-how. If you want a good book to learn Python, I'm a fan of Zed A. Shaw's "Learn Python The Hard Way", supplemented by the online Python doco for when you hit his deliberate mistakes and trick questions.
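As a taste of those basics, here's a minimal sketch of my own (not one of the book's listings) showing how Python's ctypes module mirrors C data types:

```python
import ctypes

# C-compatible primitives: ctypes maps Python values onto C storage
c_num = ctypes.c_int(-42)         # a C signed int
c_flt = ctypes.c_double(3.14)     # a C double
c_str = ctypes.c_char_p(b"spam")  # a C char* (bytes in Python)

print(c_num.value, c_flt.value, c_str.value)

# A C struct, as you'd declare one before calling into a native API
class POINT(ctypes.Structure):
    _fields_ = [("x", ctypes.c_long),
                ("y", ctypes.c_long)]

p = POINT(10, 20)
print(p.x, p.y)  # 10 20
```

If you've done any C, the mapping is immediately familiar, which is exactly the author's point.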

One criticism I read about this book elsewhere is that it is a massive pitch for the Immunity Debugger. Firstly, I think this is an unfair criticism. The tool is widely used by security researchers around the world. I admit it isn't the only one, but even in Dave Aitel's introduction it is made quite clear that Immunity have chosen to standardise on the same language and same tools to better facilitate teamwork, automation and code re-use. Knowing who Immunity are and their reputation within the industry, you cannot really argue with their results. If you take that on board, then you accept that you're learning from masters of their craft and that they are trying to instill in you their work practices and techniques, not just code. The author is aiming to teach you the process and how to optimise it. In that respect, I felt this is much more valuable and that the book certainly achieved its aim of giving you an overview of the basics of reverse engineering and security research.

Two areas I felt the book could have covered better were ASLR and 64 bit support, and how these affect exploit development. Firstly, I'm currently reading "The Art of Software Security Assessment", which predates this book and still touches on ASLR. Despite being published in 2009, this book specifically touches on DEP but doesn't even mention ASLR. I feel that completely skipping ASLR is a fairly unforgivable mistake. I realise this complicates the entire process, but even a single paragraph explaining "No, I'm not covering this and here's why" would have been better. If nothing else, the book could have stated "I will be touching upon these items in my sequel", promising to delve further into more advanced techniques and scenarios (ROP gadgets, heap spraying, sandbox bypasses, etc). Secondly, I felt the lack of any reference to 64 bit processors was also an oversight. I suspect the author began writing this book some time prior to the 2009 release, and this is what gives the book a somewhat dated feel.

Setting all that aside, the author's style is simple and easy to follow, and the book is logically structured. Each chapter seamlessly flows into the next and the coding examples seemed simple enough (I'd only gone through a few chapters, certainly not all of them, so I cannot state for certain that the book contains no errors). This book is quite light at 181 pages (ignoring the index) and definitely one of the lighter reads, although for the price ($39.95 USD) I do feel it is on the steeper side given the above issues. If you're relatively new to security research and reversing however (which is clearly the target audience), then it's probably still worth it. Being honest, these areas are not my forte so the book felt "just right" for me. Take that for what you will. If you spend all day reversing then you will find this book far too light. I should also probably point out that I didn't pay money for this book; it was a gift from our good friends at O'Reilly via my local OWASP chapter, so I have no right to complain. :)

Would I still recommend this book? Absolutely. I would also be among the first to buy an updated edition or a sequel. I certainly hope Justin Seitz writes another book (both a sequel and a new edition would be great!). He has a great writing style and I think has a promising future ahead of him as a technical writer. I hope he continues to write books on this topic and explores it more fully.

Overall rating? 4 out of 5.

Tuesday, November 29, 2011


“For me life is continuously being hungry. The meaning of life is not simply to exist, to survive, but to move ahead, to go up, to achieve, to conquer.” - Arnold Schwarzenegger

I recently returned from Ruxcon and realised how much I enjoyed the conference: making new friends, catching up with old ones, making new acquaintances, and the variety of talks which I found interesting. It always takes me a few days afterwards to really dwell on events and consider what I walked away with. This year, I walked away with a few things, chiefly a clarity of purpose that I don't think I've had in a long time. My attendance this year triggered an epiphany that made me acutely aware of where my growing frustrations with certain things in my life were coming from. E.g. what I am doing, what I want to be doing, what I would rather be learning, and balancing that with other personal commitments, financial goals, etc. I think at times I've also directed that frustration at the wrong people in my life. So, if you read this and realise you've been on the receiving end of me griefing you - sorry about that.

I'd not posted much since my Management vs Technical post, as this has consumed a good portion of my brainpower trying to figure out what route I was going to take. I've lost track of how many times I've been asked the question too. So I've made the call, although some might call it taking a third option. But more on this later.

Sometimes, the only clarity we get is the realisation of what we enjoy vs what we don't enjoy. Or maybe that's the point and always has been and I'm just a slow learner.

Anyway, here's what I've learned - especially in the past two years in no particular order:

  • I enjoy learning both technical and managerial work and will create my own opportunities to learn both areas as I see fit;
  • I have realised the joy of working with my hands again (so to speak) doing technical work (after too much time largely hands off);
  • I am, and always will be, a self-directed learner;
  • If I am ever going the managerial route it won't be for someone else's business (at least I don't see it today);
  • I enjoy talking security to non-security folks;
  • I enjoy reading technical security theory;
  • I really dislike drawing up security policies (why can't people just buy the ISO27001 and 27002 standards and start reading??);
  • I really dislike "light and fluffy" security work without a strong technical underpinning;
  • I really dislike talking security strategy with businesses that have no desire to be strategic or even take security seriously;
  • I really want to work alongside my friends doing really, really cool stuff;
  • I want my work to leave a lasting impact;
  • My attitude and career goals and aspirations really don't fit well into a corporate hierarchy (e.g. my sense of fashion on "casual" Fridays :);
  • I have strong anti-authoritarian tendencies - at least for rules that make no sense to me;
  • I really don't want to work 40hrs a week in an office away from my family (although I'll gladly do more if I can at least be near them);
  • I have business interests outside of infosec that I want to see through (eventually);
  • Infosec is a great industry, full of consistently interesting work, once you get past the cynicism of most people in it :).
There it is. Perhaps it's not much of a wish list, but it's enough to give me focus.

My blog for a long time has focused very heavily on what businesses need to do to make themselves more secure. I've arrived at the conclusion that by and large, it's not really rocket science (I did admit I am perhaps not the fastest learner). It just comes down to getting commitment and support from the business's executive management and cascading that down. Once that commitment exists, as long as the function is staffed by well-intentioned people who genuinely care about the business, things will improve. Perhaps not overnight and not without setbacks. Hell, it may not even resemble anything like best practice - but slowly and surely it will. Thankfully, I've been blessed to see this throughout my career enough times to know it to be true. If that commitment doesn't exist or cannot exist, then security is doomed to fail, and if you're working under that management chain, you should probably GTFO before it's too late. Thankfully consulting is a great way to study what you love and help others without being so constrained by those politics. So this lifestyle agrees with me - at least for now. :)

Anyway, thanks to those who helped me to find this clarity. You know who you are. My blog will start to change focus more and more towards topics which match my interests. I will still write for - I must admit the time I normally spend here is being invested more so in my articles there. Nonetheless I think my change in focus will lead to some equally interesting posts here.

- J.

Friday, September 2, 2011


Hi all

I've been severely negligent with my blogging lately. And this post is not one of my more detailed ones.

I've had an exceptionally busy few months - both outside of work and at work. At some points, you just gotta keep your head down, bum up and only surface when you need air. And that's exactly what I've been doing.

I've recently had the privilege and pleasure of writing for - and so some of my time has been spent submitting articles for them. Hopefully you will be reading more of them soon. I've also had my hands in a lot of pies outside of work in other non-security endeavours, which have been taking a lot of my time but are no less rewarding.

A lot has happened this year: the rise and fall of Lulzsec, Anonymous, Wikileaks and the ongoing Cablegate saga, changes in Australia with regards to ratifying the Convention on Cybercrime, and more. I know this phrase gets thrown around a lot, but the world really is changing faster than we are able to adapt. Events are unfolding before our eyes that require our attention and yet, so much is happening to take our eyes off the ball.

If I can post one random thought: if what you see troubles you enough to want to act on it, pick the one issue closest to you and run with it. You can't fight forest fires on multiple fronts. You can only fight one fire at a time. We all have our gifts and we need to be using them in ways that yield the best results for the least amount of effort.

- J.

Monday, May 30, 2011

Management vs Technical Career

 "Everybody's a genius. But if you judge a fish by its ability to climb a tree, it'll live its life believing it's stupid." - Albert Einstein

I think I held off from writing this post for a long time. But this discussion seems to keep cropping up wherever I go, sometimes in the most unlikely of places, so I felt it was probably time to post something on the subject (given the title of this blog). I think part of me wanted to deny that it was a binary decision. But ultimately, there is no escaping it. Over a long enough timeline, all people in IT, regardless of their discipline, have to make the call - will I take a managerial career path or a technical one?

For some people, the way forward with their career, their lives, whatever – is perfectly clear cut. These enviable people have a clarity of purpose that eludes many people, myself included. I wish I was one of those people, but sadly I am not.

After you work in IT long enough, you reach a point where you realise that a decision needs to be made about where you see yourself heading. I’m sure some people just plod along with their lives without a second thought, but I think anyone with a degree of foresight gives this some consideration. And as they say, a life without reflection is not a life worth living.

For those of us who travel this road of uncertainty, I can throw up a number of triggers I can say I’ve seen (either first-hand, or second-hand through observation of those around me) that prompt this evaluation:

  • Realising you are not as technical as you thought you may have been and evaluating the skills gap to get to where you want to be;
  • Realising you are the most organised person in your entire team (if not division) and capable of juggling six balls for long periods of time without dropping one, even if you actually enjoy the technical aspects of your job;
  • Realising that ongoing technical study is consuming a lot of your time and it's much harder with a girlfriend/wife/kids/etc or if you want to develop other areas of your life;
  • Realising you want more recognition for your work or more pay;
  • Turning 30.
This list is not exhaustive by the way, nor is it intended to be.

In trying to see what was out there on this subject, I found some interesting links people may find worth reading.

(The first link in particular I strongly recommend everyone should read - and I refer to it heavily throughout this post).

Now in some of the above, I’ve seen the economic rationalisations for why certain careers earn more than others. Robert Kiyosaki in “Rich Dad, Poor Dad” spoke of this with regard to what his rich dad taught him about sales, and it is explored in “Technical Ladder Vs Management Ladder - Which one is for me? Part 1”. I think it is absolutely accurate and fair to say that management folks should be paid more. Technical specialists may not agree, but to paraphrase something I once read (Kiyosaki, I believe), “the larger the problems you solve, the more you earn”. The simple fact is that business people, folks in management and sales, typically speaking, solve larger problems than technical folks.

Now I’m sure many will cry out and point to guys like Zuckerberg with Facebook, Gates with Microsoft, Dell with Dell Computers, Bezos with Amazon - but while these guys were technical, let’s be clear that they used their skills to solve a problem. Sometimes a big problem. In some cases, more than once. And ultimately, at some point, they became managers. Can you, as a techie, go out with a brilliant idea and make squillions? Absolutely, but the reality is that it is much harder than you think (having studied innovation at uni, I can assure you that it is - by a long shot). Also, even if you do become that guy, chances are you'll need to manage. The best instances I've seen, where the techie still gets to run amok, are where he/she becomes the CTO, recognises he/she does not want to do management and then builds a management layer around him/herself to take away the headaches - but these examples are few and far between (hint: I can readily think of two examples, and two alone).

Also, if you’re aiming to be a good techie, the reality is you can pretty much ignore anything to do with management and business, hone your skills until they're razor sharp, go to market as a contractor and make a salary at a level only senior executives can dream about (I had one friend tell me recently he is making five figures a week as a contractor, but let me tell you, he has no time to spend it).

What is the downside of working in tech? Well, the time investment required to stay current is actually quite steep. Technology is changing at a faster and faster rate. People who were generalists ten years ago are specialists today. What will happen in another ten? What happens when techies get into relationships, get married, have kids, have to tend to a full house, dirty nappies, screaming children, a nagging spouse, family obligations, etc? The constraints on your time will demand your attention be spent elsewhere, and they only grow with time. If you aren’t doing it for the love of the work, then you may find yourself looking for something a little easier.

But is it easier? The first survey indicates respondents from both tech and management backgrounds believe that techies have a better work/life balance. So there it is. Does that suggest then that techies looking for a role with a lower degree of upskilling should be looking to management at all? The survey suggests they shouldn’t. In my experience… I think it depends on the level of seniority you aspire to. Middle managers, certainly not. In which case you may want to find that sweet spot where you can take the brakes off a bit, get your tech on at work and still have a life outside of work. That could be a technical role; it doesn't have to be management. If you want a salary increase but want to do the same level of work, then a line manager role is pretty much what you're staring at. I don’t think this is necessarily as easy to find, however, as one might think. For every manager I know that aims to cruise, I know of five more that do more work outside of work than they ever did as techies. Then again, the oldest techs I know still spend considerable time on the tools, as much because they love the challenge as because they're surrounded by ineptitude and can't trust anyone else to do it, and will probably work themselves into an early grave. But, based on my anecdotal experience, I'll give techs the benefit of the doubt on the work/life balance equation.

So let’s assume that money isn’t really a consideration as to whether you should move onto the manager track or not (given that it is unlikely to be a deciding factor unless you plan on becoming C-level and drawing a bonus or equity payments on top of salary). Let’s further assume work/life balance isn't an issue either, with techies again pulling out in front. Why would you move to management?

Influence, according to the survey. 73% of managers felt that they had more pull over corporate decisions, whereas only 45% of technical staff felt the same. I’ve never been a manager directly, so I feel I can’t really say, but I can say I became a lot more interested in it after working under the sufferance of some really bad ones. Once you see the benchmark for exceedingly bad management, it can certainly inspire you to take that route, if for no other reason than to save you from someone else’s incompetence. That, to some extent, was why I started my MBA. Now while I enjoy the learning as I go through the degree (I’m a sucker for learning in any form), the most I can say I’ve gained so far is that I am now even more critical of bad management decisions than I ever was before. Although I'm sure all of us at some point think we could do a better job of something than those above us. :) I would agree with the above - wanting to have a greater say in how things are done is the greatest motivator for people moving into a management role.

Whether one should go technical or managerial, I don’t believe that you need to be perfect at one or another to make the decision. I do think, however, it requires a commitment. And therein lies the rub for those who are straddling the fence. 

The irony is, if you have decent skills, even if you are not the best in your field, but you are committed to a lifetime of self improvement and passionate about what you do, then taking the longer view, you will be fine. As the survey points out, technical experts are usually shielded from an economic downturn and rewarded in the upturn. Having seen this first hand, I would concur. Having said that, I’ve seen a lot of technical staff who were not at the top of their game culled when times were tough. I take the survey to be a caution to those who aren’t good at what they do to really lift their game, or failing that, channel their skills into an area where they will succeed. I have also heard stories of people who were not technically brilliant having skill beaten into them over time. I have heard this repeatedly from technical people (including technical managers) with over thirty years of industry experience and I really trust their judgement on this. This suggests that technical skill can be trained (even if critical thinking and analysis are harder to come by).

So, assuming tech skills are easier to acquire and maintain, you get better work/life balance and near equal pay (assuming you get a contract role), why else would you go for management, if influence on corporate decision making wasn’t a factor? Maybe you are just playing to your strengths. I have one friend who is a manager and making a good career of it, and for the longest time he wanted to move into a more technical role. I think it has taken him close to two years of full time management (and many years prior of 'delegated' management) for him to start to accept that he is really very, very good at what he does, even if it is not something he would have necessarily picked for himself. Is he happy? Hard to say. I don’t believe he is miserable, that’s for sure. He seems to have a good work/life balance however (for the most part).

The stats in the survey also suggest this is the case, with only 21% of managers stating that they would prefer to go back to a technical career. However, as a caution for those who go this path, “lack of technical experience” stemming from a premature jump to management, as well as “financial packages too established to risk”, were the main reasons cited why managers wouldn’t jump back. So if you’re thinking of a career in management, the survey recommends 11-15 years in a highly technical role before you switch, because it allows greater flexibility in switching back to a technical role during a harsh economic climate. Managers with less technical experience will not have this degree of flexibility. Also, start socking more money away if you move into a senior role and be sensible with your cash.

Some people, like my mate, are drawn to management because they are organised. Some because they are natural leaders. One mistake many make is to draw a parallel between the two when in fact they are not synonymous. It has been my experience that there are plenty of managers out there, but very few are leaders. Some people with these skillsets are drawn to management and are in fact suited to it. Should they do these jobs, however, if they still enjoy the technical aspects of their role? What if they excel in those areas? Should they then play to their strengths and do what they do well? What if they don’t like it?

I’ll be honest, I don’t have the answers on this. I don’t even know if anything I’ve said above can help but I certainly have some advice which I think will be helpful no matter where you are on this path. These are things I have found helped me to get some focus on my path:

1)      If you haven’t tried management, it might be worth trying it before you knock it.
According to the survey above, 48% of respondents with previous management experience were less likely to change jobs for a crack at a management position, as opposed to 66% of those without management experience. This suggests, statistically, that the grass is not as green on this one as it looks. So let’s say you try a management role only to find you utterly hate it. Awesome! Would you actually say the experience is wasted if it lets you make an informed decision on whether it is for you or not?

2)      Work in different roles (in different companies)
The same survey suggests job rotation is the key to a long career within an organisation, and I can certainly say I’ve seen that within Dimension Data. There are more people who have been here for 5 to 20 years than at any other place I’ve worked (and I’ve worked extensively for universities, which have amazingly long-tenured people). I get to work with people who have some amazing skillsets that you simply don’t find in other places. One guy I work with has worked as an architect, engineer, consultant, system admin and a programmer, and I don’t think I’ve ever met anyone with a more interesting technical skillset.

Even if you don’t often change jobs externally, job rotation within your existing employer can be a highly rewarding experience, and from a purely managerial view it actually makes sense. That said, I believe working in multiple organisations is something you need to do, hence I’ve added it to the above heading.

I believe you need to work technical roles in different environments to truly get a sense of your worth and whether or not that career ladder is for you, because you need to get a sense of where you stack up against your peers. Don’t listen to what one organisation, or one person, tells you about your skillset. It is very easy to become pigeonholed and to listen to what someone else tells you you are worth rather than what you believe you are worth. That's not just sound career advice though, that's life advice (unless you suffer a mental illness or the Dunning-Kruger effect).

3) Get leadership experience outside work
Two of the above links suggest leadership experience outside of work might be beneficial for technical people to get a taste of whether it is for them. The reason for doing it in an environment outside of work is that there is less career impact in case it goes wrong. I can’t say I’ve tried this, but it would be remiss of me not to include it as an option, and it does make sense.

I have one friend who has been co-founder and lead developer on a major project. That is a good example.

4)      Do what you love
I am a firm believer in the phrase “do what you love and the money will follow”. I have friends who have worked in some amazing careers. My friend Darren over at Stylus Monkey is a good example of this, and to this day I think of him as my role model (BTW follow him on Twitter and read his blog – his posts are relevant to anyone, irrespective of their role). Whatever you do, you must do with passion and Darren is the embodiment of living your life with passion. I admire him for having the balls to follow his dreams. Most people never do. You may take a few different roles to find something that stimulates you. But whatever it is, find it and run with it.

5)      Be the best at what you do
All the evidence suggests that during an economic downturn it is the star performers who are shielded. If you are passionate about what you do, strive to be the best at what you do, you will receive due compensation for what you love and those above you will look out for you.

Like I said, I don’t know if this helps anyone. But I think reading the surveys, reading experts who have walked down this line and talking to other people who have made this call and quizzing them over the how, when, where and why they made the call is always beneficial. Finally, irrespective of what side of the fence you fall on, I hope everyone reading this at least made the decision to be a leader in their field.


- J.

Sunday, May 22, 2011

VUPEN vs Google and the consequences for IT Security

I've been largely preoccupied with other work as of late (read: university assignments), and while I don't want to discuss AusCERT, I did want to touch on the VUPEN vs Google debate.

For anyone who missed this, simply put, a French security research firm claimed (with video footage) to have cracked the Google Chrome sandbox, allowing arbitrary code execution. What makes this newsworthy, however, is their refusal to disclose the details to the vendor; they provided the details only to the clients (mostly French government, law enforcement and military types) who paid their subscription. For what purpose, we can only guess.

For the overview, see below:

Now, I'm not getting into a moral debate surrounding disclosure, but what I wanted to highlight is that firms selling 0day vulnerabilities is nothing new. Endgame Systems was dragged into the limelight when HBGary's treasure trove of emails was leaked to the public, revealing a multi-million dollar contract for selling zero day exploits. This is perhaps the first instance where a previously known but underground industry practice was exposed very, very publicly. Based on the content and context of these leaked emails, we can only presume the exploits were developed for offensive (largely illegal) purposes.

What I found most interesting is the lack of political fallout for Endgame Systems. On the contrary, apart from a handful of negative news stories, most of the rage was directed at HBGary. But the point I'm trying to make is that Endgame Systems very carefully and deliberately did not want to draw any attention to these activities. Yet VUPEN is perhaps the first firm we've seen take an active stance in promoting their technical capability in the production of 0days. And to repeat myself, it is ironic that they are copping all the rage from the public for their disclosure while Endgame Systems did not.

This raises two interesting points of concern -
One, the relative hypocrisy of an industry that is willing to slam one company for openly acknowledging its capability, while another that seeks to hide the same capability goes relatively unscathed;
Two, and perhaps more importantly, could this represent a new era in which exploits will begin to be sold openly?

On the first point, I have my suspicions but it seems odd that it is perceived as socially acceptable for one country to have this capability and yet, not for another. It reminds me of the nuclear arms proliferation debate really.

On the second point, the publicity stunt that VUPEN pulled presents something we've not seen before (at least, not that I can recall - please correct me if anyone can think of a more public example).

The legalities of the security research in which vulnerabilities and exploits (sorry, "Proof of Concepts" :P) are created vary from country to country. This means that their "legitimacy" (perceived vs real) could have a very profound and transformative effect on the IT security landscape.

Will this force IPS vendors into bidding wars for 0days to update their signatures? Will this force software vendors to pay a form of "hush money"? Will governments seek to impose a tax on developers who create faulty software, or, perhaps a more likely outcome, rush legislation to ban vulnerability research and exploit development (further driving up the black market value and pushing the industry further into the dark)?

All food for thought, but I would not be surprised if this activity becomes increasingly commonplace.

So yeah, watch this space.

- J.

Wednesday, May 4, 2011

The Risk Management Lie

Information security's evolution as a process and an industry is really a mixed bag. On one hand, I’ve seen first hand the benefits of improved governance. This helps ensure people can’t make ad hoc changes to production environments; should those environments change outside of authorised change windows with no corresponding change record, the change was unauthorised.

On the other hand, we still don’t track risks well. We don’t REALLY understand them. We don’t classify them well. And for those who are able to do this even partially well, their Excel spreadsheets fall far short of being able to track chained exploits and how they can lead an enterprise to ruin.

The models we use to track IT security risks are – to my thinking – like soothsaying. It reminds me of witches from centuries past, sacrificing chickens and casting bones in an attempt to augur the future. For all our metrics, for all our “likelihood x impact = risk” crap, we may as well be doing the same.

Is it wrong to use these methods? Well, no. At the end of the day, something is better than nothing. If these methods at least give the organisation you’re working with some sort of awareness of the risk you are taking onboard by choosing not to remediate a finding, then that is a good thing.  Not undertaking a risk assessment is like making a call not to bother getting your car serviced since you were overdue three months ago and the car is still running fine now. Sure, it might SEEM fine, but that’s until the wheels fall off (so to speak).
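For all the criticism, the “likelihood x impact” model is at least trivial to sketch - which is partly why it persists. A minimal sketch in Python; the scales, ratings and thresholds here are entirely hypothetical, not from any particular standard:

```python
# Classic qualitative "likelihood x impact" risk scoring - the model
# criticised above. All scales and thresholds are illustrative only.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
IMPACT = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

def risk_score(likelihood: str, impact: str) -> int:
    """Score = likelihood x impact, on a 1-25 scale."""
    return LIKELIHOOD[likelihood] * IMPACT[impact]

def rating(score: int) -> str:
    """Map a numeric score onto a (made-up) qualitative rating band."""
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

score = risk_score("possible", "major")   # 3 * 4 = 12
print(score, rating(score))               # 12 high
```

Note how much judgement hides in those two lookup tables - the whole model is only as good as the guesses that feed it, which is precisely the problem.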

Three issues I see as being critical to the failure of risk management as a discipline (specific to information security):

1) Inability to track and measure vulnerabilities which can be chained
The inability of organisations to fully understand chained exploits and how they can be leveraged (and even security professionals might easily miss how someone is able to chain them, I might add) is one of the greatest limiting factors of risk management.

We can talk about a missing patch or an exposed, vulnerable application and explain the business impact if it is compromised. What we can’t do well is look at all the other vulnerabilities in the environment and explain how or why that singular event could be triggered by other risks inherent to the environment.
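One way to at least start reasoning about chaining is to model vulnerabilities as edges in a directed graph and enumerate attack paths. A rough sketch - the hosts and the chaining relationships below are invented purely for illustration:

```python
# Model chained exploits as a directed graph: an edge means "compromising
# A gives the attacker a foothold to exploit B". Enumerating all paths from
# the internet to a crown-jewel asset exposes chains a single-finding
# spreadsheet never shows. All hosts/vulns here are hypothetical.
from typing import Dict, List

# edges[node] = hosts reachable once `node` is compromised
edges: Dict[str, List[str]] = {
    "internet": ["web_server"],                  # e.g. SQLi in a public app
    "web_server": ["app_server"],                # e.g. reused service account
    "app_server": ["db_server", "file_share"],   # e.g. unpatched middleware
    "file_share": ["db_server"],                 # e.g. credentials in a config file
}

def attack_paths(src: str, dst: str, path=None) -> List[List[str]]:
    """Depth-first enumeration of all exploit chains from src to dst."""
    path = (path or []) + [src]
    if src == dst:
        return [path]
    chains = []
    for nxt in edges.get(src, []):
        if nxt not in path:  # avoid revisiting hosts (no cycles)
            chains.extend(attack_paths(nxt, dst, path))
    return chains

for chain in attack_paths("internet", "db_server"):
    print(" -> ".join(chain))
```

Even this toy graph yields two distinct chains to the database - the kind of aggregate exposure a row-per-finding risk register structurally cannot capture.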

2) Inability to accurately define and measure TRUE risk
I’ve seen discussions over which risk management method is better. I’ve seen FAIR advocates. I’ve seen ORM described as the de facto standard (Ostrich Risk Management, for the uninitiated). At the end of the day it’s all the same. We really don’t know. Risk management is really the art of sticking your finger in the air, applying models that do not translate well to IT risks and taking a quasi-intelligent guess.

This isn’t a field where, say, the incidence of cancer or natural disasters is tracked for insurance purposes. We don’t know how often SQL injection is exploited globally on a per-IP-address basis. We have even less data to narrow the field of inquiry down to a geographic region, let alone how often a single enterprise is hit (if you’re one of the few organisations that do track this sort of data, you have my respect). What about risk impact? When someone asks what the likelihood is that SQL injection will occur on a given server, what are you referring to? The fact that someone uses SQL injection to copy the data and on-sell it? Tamper with it and hope you won’t notice? Trash the server? Or use it as a bastion host to conduct further attacks? Do you draw up a risk assessment for each scenario or only one? Each example has a highly variable risk, depending on the business purpose of the application that relies on said database. Some organisations (e.g. agencies within government/military) may rely heavily on confidentiality. Others may rely on availability (e.g. banking). Others yet again, integrity (e.g. universities).

3) Ongoing treatment and management of risks
Ultimately I see very little that organisations and businesses can do to address the first two points, apart from being aware of the inherent limitations of the risk assessments we use today and trying to keep an open mind. This third point, though, is something you can all do today.

In many instances I’ve seen, risks are often stuffed into spreadsheets with signatures and never again touched. Hey the business accepted the risk – it’s a done deal, right? Well no, not exactly.

Risks can change. You need to review them. In the course of a year the environment may have changed radically - or perhaps only superficially, but in either case enough to introduce new risks which could alter the original risk score. This is where an opportunity exists to better understand how chained risks can be introduced into your environment.

Sometimes, all we can do is go back and review the risks that have been formally accepted. If you can go back and see a series of risks that could be chained, use those to propel your security program. If you need to, develop a proof of concept. Show the business what these risks mean and how they can be exploited. Explain who would want to target them that way and why – particularly if the risk warrants it. You don’t want to take this approach with every risk, but if you see that chaining presents a greater aggregate risk than what’s on your spreadsheet, then you have an obligation to speak up.

Overall, I wish I had a solution to risk management in an infosec world – but I don’t. I don’t like how the process is governed by auditors who see each vulnerability as a discrete risk without any perspective of the larger whole. All we can do is point out these limitations and try to work within them.

In closing –
Don’t assume risk management will solve your problems. You still need to find the problems first. You still need to understand them, capture them and take ownership of them. Even then, you need to be mindful that you may have overestimated or underestimated the scope of the problem. And this is why you need to constantly review them.

- J.

The best defence is a good offense

I recently read two articles that made me consider whether the goals of cyber security are shifting - or perhaps more precisely, whether they could shift.

The articles:

There's a heap of articles relating to China worth reading on Threatpost - in particular anything relating to Dillon Beresford's dubious "research" into China's security.

What is emerging here is a rather scary pattern - it would seem (at least based on the media at hand) that China is pushing an offensive security agenda not only as part of a national defensive strategy, but also as an economic policy for national benefit.

It's a no brainer that cyberwarfare offers truly asymmetric capabilities. Success is not based on which force has the larger army or more resources to throw at it, but often on who has the most skill and displays the greatest intent and capability in using it ("who dares wins" indeed). Economically, this is an awesome capability too. I read a report on innovation (by Cisco) awhile back (sadly no, I cannot find it, dammit), but one of the things it discussed was the known problem that certain countries (for illustrative purposes, yes, China was one of them) do not innovate as well as others, so they have a tendency to reverse engineer other products or acquire designs by any means from countries that have already done the innovation.

Now in an outsourcing model, a firm has already done the hard yakka on innovating - they just need to find a firm who can produce the good or service as cheaply as possible. However, if a firm is willing to steal those innovations from a competitor and beat them to market, that has the potential to kill the competition: money sunk into R&D that the victim was hoping to recoup on future sales that will now never happen.

What the first article is referring to is China's willingness to promote itself as a superpower and gain advantage through every means - basically stealing IP and economically crippling its competitors without firing a single shot. Or taking your enemy out pre-emptively, if you wanted.

The second article suggests that culturally they face significant challenges with defending their home systems. For example, the lack of peer review for their software leaves it potentially wide open to bugs. Equally, reporting them can create a loss of face (in more ways than one).

This just got me thinking - what if China's focus on offence is due to the realisation that they have defensive issues that aren't going away anytime soon? What if they were on the offence because it just made more sense - it's a lot easier to kick in someone else's door than it is to guard your own? Especially if you know that in doing so you're diverting resources (from an already taxed and resource-starved adversary) that might otherwise be spent attacking you.

Again, I don't want this to degenerate into an Anti-China post - that's certainly not the point. It is meant to be a discussion signalling a shift in cyber security strategy. Is it possible for nation states, even corporations, to eventually move away from a defensive strategy and rely purely on offensive techniques since they will yield more fruit (albeit at greater risk)?

The US - and many other countries - are more than aware that their cybersecurity capabilities are thin at best. Would it not be in everyone's best interests to switch to an offensive approach, when you consider that such an approach would yield a higher degree of success?

I haven't put too much thought into this, but I am curious what others might think on this.

- J.

Sunday, April 17, 2011

Advanced Persistent Nonsense

Lately the threat posed by APT has gained a lot of attention. As highlighted by my April Fool's Day post, there was the RSA incident, the HBGary incident (which was more "Persistent" than anything else), the Australian PM's laptop getting owned (which barely got more than a day's press), and more.

I wanted to really try and summarise what APT is and what is the real takeaway message for people out there.

Advanced Persistent Threats always have been there.
Always. Without exception. You may not have known them by this name. You may never have heard of it before. That doesn't mean these attacks weren't going on. The point being, they are not new. The idea that people have never been hit by a highly skilled, motivated attacker who was never going to stop until he gained access to your network and got what he wanted is insane.

Sure, the label is popular when talking about state actors, but it is incorrect to assume that if the attacker isn't state sponsored or backed, it isn't APT. That's not to say some aren't state sponsored, of course....

In any case, the point is, these kinds of attackers have been around since the dawn of the Internet.

You cannot stop APTs
To quote one of the Sourcefire VRT guys, Matt Olney, in what is probably the best definition of APT I've ever seen: 

    "There are people smarter than you, they have more resources than you, and they are coming for you. Good luck with that."

Firstly, we need to really revisit what we know of information security. If there's anything we've learned to date, it is that nothing can be made 100% secure. The best result one can seek is to delay, defer or hinder an attacker, or otherwise induce them into targeting an easier victim. If we accept that as an axiom, then we can further conclude that if the attacker is only after you, then you're pretty much screwed, given sufficient time and skill (or at least time enough to get skilled).

Yeah, good luck with that indeed.

Most people are worried about the wrong thing
Working in security - particularly consulting - you get to see a lot of different businesses and how they run and implement security. The sad reality is that most people do a poor job of it. I don't say that to insult anyone; it's just a simple statement of fact. Regardless of reason, too many businesses lack the basic controls. Too many can be taken down by a script kiddie with a fresh copy of Metasploit. Too many don't have a good handle on the basics, like patch management. Too many are still running around with IE6 as the default web browser.

And yet they want to focus on APT?

If you don't have the basic processes down - stuff like patch management, vulnerability management and proper logging and event management - if you don't pentest your environment holistically, as well as individual projects, from internal AND external threats, then there's really no point. You need to learn to walk before you run.

Also, when discussing APT we're talking about a hacker doomsday scenario where you are going to get owned and there is very little (if anything) you can do to stop it. In most cases, the best you can ever hope for is a shot at detecting the attack in progress. This is what RSA and Google were able to do. And how do you think they did it? Uhuh, that's right. See above. This is why I laugh at people bagging these businesses for getting owned. Everyone can get owned. When the shit hits the fan, how they deal and respond is key. The fact they could detect these attacks at all gives an indication of their level of security maturity. Could you say the same about your business? Uhuh, I didn't think so.

So in conclusion - let's get real with the problems we can solve. Don't worry about the hacker doomsday scenario that you cannot prevent. Focus on getting the basics down first. Once you get a handle on those processes, then we can have a chat about APT.

PS: You'll still get pwn3d though.

- J.

Friday, April 1, 2011

I give up

A number of events in past months have forced me to reconsider my position on a number of issues.

Foremost in my mind:

What do these all highlight:
- Ultimately, the blatant disregard that Australian citizens have towards their own privacy,
- Similar disregard by our government to protect its information assets,
- Gross misunderstanding over what SQL injection means (in spite of seeing the ownage of HBGary),
- The fundamentally flawed architecture we have come to rely on (SSL & CAs).

We (Australia) have no mandatory notification of breaches, no penalties for privacy breaches, and weak government interest, capability and skill in securing our own assets.

I am tired of banging my head on a wall and hoping that things will get better and that I can play any role in that future. I have blogged about it in the past and I hoped this was just a phase and I’d get past it. But I can’t and I’m over it.

A wise man once said to “follow your bliss”.  Once upon a time, I loved pillaging boxes and finding holes. Somewhere along the way, I lost my path. I betrayed my own values. I've got a set of programming books piling up in my library I’ve never touched because I’ve been so focused on architecture, strategy and business. None of this makes a lick of difference in the scheme of things and quite frankly, if the Australian population and our own government don’t give a shit, I don’t see why I should.

Sure, I believe you can focus on architecture, make sound decisions based on risk, take an intelligent approach to understanding your business and uplift your security. But at the end of the day, it’s a drop in the bucket. When people think SQL injection doesn’t mean much, and lazy administrators can’t be stuffed patching Windows boxes because it involves work (err... “downtime”), then nothing I do will ever really matter. What’s more, nobody seems intent on addressing these problems. There is no patch for human stupidity (or laziness, for that matter).

So I’ve decided to “follow my bliss”. I will be focusing on penetration testing, vulnerability research and programming solely from now on. I will get back to my roots – and I am the first to admit I have a lot of things to catch up on. But I will enjoy the journey at least. I will devote myself to finding the holes. Let the fixing go to other people who have the stamina and the patience to do it – I’m done with it.

To the pentesters I’ve hassled – you guys were right all along. I was wrong.
To the people who supported me – sorry guys, but I’ve seen the light.

Sorry y’all.

- J.

UPDATE: For those who missed the date of the post, yep I'm pulling your leg. Happy April Fools Day y'all.

Wednesday, March 16, 2011

I love WAFs and so should you.

Warning: I work for Dimension Data and we're a known reseller of Web Application Firewalls. If you think that alters my opinion on this subject, then I suggest you stop reading.

Lately I've seen some hatred thrown towards Web Application Firewalls. Some of it I think is misguided, some of it misunderstood. However - and perhaps more disturbingly - a great deal of it just strikes me as wrong. I am of the belief that much of the antipathy thrown towards these things comes from people who have not worked in enterprise environments where application development is a common part of IT. This means there are a lot of assumptions on the degree of difficulty in either patching an enterprise application or the speed in which a hotfix can be applied. This post is aimed at reviewing those assumptions and clearly separating fact from fiction.

Firstly, I would encourage everyone to read OWASP's Best Practise Guide: Use of Web Application Firewalls to ensure we're on an even understanding. This is a really good read and I think gives anyone with even a foundational understanding of web application security a good overview of their strengths and limitations.

One of the constant criticisms against WAFs is that they are essentially a blacklist model which applies pattern matching based on a known vulnerability ("known bad") which is always putting the defender on the back foot. Further criticisms highlight that the time spent configuring the WAF for that bug could be better spent fixing the application. I might have missed some so if I have, please fire away.
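To make that criticism concrete, the "known bad" model boils down to pattern matching along these lines. This is a deliberately naive sketch - real WAF rule sets are considerably more sophisticated, and the patterns and payloads here are purely illustrative:

```python
import re

# A deliberately naive "known bad" blacklist, as WAF critics characterise
# the approach. Real rule sets are far richer; these patterns are
# illustrative only.
BLACKLIST = [
    re.compile(r"(?i)\bunion\s+select\b"),   # classic SQL injection
    re.compile(r"(?i)<script\b"),            # reflected XSS
    re.compile(r"\.\./"),                    # directory traversal
]

def is_blocked(param: str) -> bool:
    """Return True if any blacklist rule matches the request parameter."""
    return any(rule.search(param) for rule in BLACKLIST)

print(is_blocked("id=1 UNION SELECT password FROM users"))  # True
print(is_blocked("id=1 UNION/**/SELECT password"))          # False - trivially evaded
```

The second payload sails straight past the rule, which is exactly the "always on the back foot" argument: the defender is matching yesterday's attack string, not the underlying flaw.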

To draw this conclusion you would have to be operating under a number of assumptions (so I'm taking a bit of a punt here, based on conversations I've had or my take on what these people are thinking - feel free to clarify if you think I'm mistaken):

  1. It is equally as quick to apply a fix to an application as it is to a WAF.
  2. The cost in time of implementing a hotfix on a WAF is equal to that of fixing the original defect in the code.
  3. Applying a hotfix in a WAF and in the code is "the same thing."

Now going back to the above link:
"The main aim in using a WAF is therefore securing the existing, often productive web applications, where the required changes within the application can no longer be implemented or can only be implemented with a disproportionately large amount of work. This applies to vulnerabilities in particular which have been revealed via a penetration test or even via analysis of the source code, and - especially in the short term - cannot be fixed within the application."

The emphasis above highlights the fundamental aim of the WAF I want to drill into everyone's head. They primarily exist as a temporary solution until the code can be fixed. That's it. Nothing more, nothing less.

Now to challenge the above assumptions -

1) Code changes are NOT always quick and easy to push through to Production.

If you disagree, then I KNOW you've never had to deal with developers, QAT, change control or project managers, or navigate any number of hurdles that impact a code release (especially during crunch time). Now I am the first to admit that what follows are some bullshit issues. Nonetheless, they are REAL issues, and if you want to improve security within enterprise applications then you need to realise that these questions will come up:
  • Is there a project code for the application developers and testers to charge their time against?
  • Does your hotfix take developers away from working on mission critical functionality for a given project?
  • Once the fix is applied who is doing the retesting and will the re-test delay the release to Production?
  • If you plan on cutting corners (regression testing is usually the first victim) because of the "urgency" or you "believe it won't impact the rest of the code" to get your hotfix into Production, who is accepting the risk?
  • What is the review period for all change requests to Production assets?
  • Is there a change embargo in place at present?
You have to understand that if a code change leads to a defect in existing functionality, you (as the "security guy") will cop the blame - not the developers. So it really is in your best interests to NOT cut corners on any of this.

Having said that, let's presume for a moment that you do not have to deal with any of these issues and I'm horribly, horribly wrong. That's fantastic for you, but then you most likely work in an environment that does not even pay lip service to ITIL, COBIT or any number of best practise frameworks for the management of IT infrastructure. This means you have no assurances at ALL that unauthorised code changes haven't taken place already.

Or put more simply, you probably have far bigger problems to worry about.

2) Time spent on patching an application is NOT equal to configuring a WAF.

For starters, there are simply fewer people involved. Typically you have the business owner for the application in question, maybe an engineer from Networks, as well as Security Operations to conduct the change and some Change Management folk to review the whole thing. In fact, I've seen this done in enterprise environments with fewer than 4 people involved. Notice how there are no development staff or application testers involved? Assume a code fix would add another two people from each of those teams (minus the Network engineer) - and that's being very generous. Normally the process takes much longer (see point 1) and involves many more people. This means that applying a fix at the WAF could, taking the above example, reduce the cost of fixing by 20% or, in large cases, as much as 75%. You can also argue security fixes can be applied as Emergency changes, vastly reducing the window from the change being raised to being implemented. Time to test and implement? Much quicker.

In practice, what will happen is that you will more than likely bundle up multiple bug fixes and roll them out in a single release. This means the testing cycle is much longer, as is the release and the actual cost. God forbid the defect is significant enough to warrant a major re-write of the entire code base - you can then take the above numbers I've given and throw them out the door. You're then in a position where you still have to prioritise your time and effort, and in all probability you'll run with a WAF rule in the short run, because the prospect of leaving that application without any form of defence until the code itself is fixed is something neither you nor the business owner wants to chance.

This leads to my next point.

3) Applying a hotfix on the WAF buys you time (if you're lucky). It does not solve the problem.

Does this mean you are better off applying hotfixes in the WAF than in the application? God no. You need to fix the root cause which is inherent in the application. But if we can accept we can get a temporary fix into Production on a WAF quicker than we can by applying a hotfix, then the use of a WAF becomes much more palatable. It is an acceptance that you are buying time for your application, not solving the problem.

If a virus somehow spreads on a network, what do you do? Normally you would have desktops in segmented zones to contain the spread - in essence, you are using the firewalls to contain it. This buys you time until you can determine the root cause and remediate. WAFs are no different: implementing a change on these appliances buys time for the defenders to fix the root cause.

We block ports because we don't want people accessing applications carte blanche. We accept that there is some network traffic we do not want. Why is it so hard to apply the same logic to our applications? Nobody refutes the fact that the root cause of a vulnerability must be fixed (unless they have rocks in their head). But we live in a world where not everything is black and white, and fixing problems is infinitely more complex than finding them. This is where the entire concept of "defense in depth" arose - the knowledge that we build layered defences to prevent, mitigate or slow down an attacker from compromising an application or system.

To not accept that a WAF plays a valid role in a defensive strategy is to not only deride the entire notion of a "defense in depth" strategy, but also to devalue the role of other security technologies we use every day. Everyone in infosec sneers when someone says "we have firewalls so we are safe", because we know that is false. But no-one in the field in their right mind would call the technology useless. Nobody would call airbags alone useless either - we recognise their value in road safety when combined with the other safety mechanisms in a car: brakes, seat belts, etc.

Additionally, if you are in a position where your application is a black box (you essentially have no access to the code), then the WAF might be one of the few defensive mitigations at your disposal. Does that mean you should leave an application you cannot patch or secure long term in your network? Ideally not. It certainly begs the question of how you found yourself in this situation in the first place - i.e. why is this not covered under your support/maintenance contract? And it does happen - typically with large commercial off-the-shelf (COTS) applications. In such an instance, your long term strategy might be to migrate away from that vendor and product so you can get rid of it entirely, in which case the use of a WAF is still a perfectly viable strategy, since fixing the code base isn't an option. This is arguably the worst case scenario you can find yourself in, too.

Sure - one could build an argument that if you were to build a secure box and application out of the gate, with no unnecessary services, and plug it into the Interweb, it would hold up. I could also assume no new attacks will ever be developed, no vulnerabilities will ever be found in that environment ever again, and that Palestine and Israel will come to a peace agreement in the next 12 months. In all practicality, there are some things we have to accept aren't going to happen in a hurry, no matter how much we'd like them to, and we work with what we've got.

Anyway, here's something to think about for the anti-WAF zealots out there.


Thursday, February 17, 2011

Economic benefit: Build vs Break

I have one friend who I swear, is trying to inflame me with my whole build vs break rhetoric. He knows who he is, so this post is for him.

Recent events in the news, finishing economics, and some other personal events have fired me up enough to forego my original post on WAFs (for now) and discuss some economic basics again. Mostly some random ideas I have been toying with, applying economic theory to common problems. I don't know if this will solve anything - some of these ideas are very much in their infancy - but perhaps by putting them out there, someone else might take the ball and run with it.

Basically, the economics of security are stuffed. I don't mean just "slightly broken" - I mean completely, utterly and, currently, irrevocably stuffed. To try and phrase it as an economist might: the marginal cost of fixing software exceeds the marginal benefit, no matter which way you slice it or dice it. I know this isn't revolutionary - David Rice covered it pretty well in his book "Geekonomics" (apparently - I haven't read it in its entirety yet). But from what I can tell, "building" (as I refer to it) is dead.

Yes that's right. Building is dying.

I've been asked (as recently as today, even) whether I think it's dying. I always say the same - no, it never will. I have always maintained that. But I guess I've been a lot more critical of my work lately and what I can do to improve what I do.
I think building has been dying for a long, long time - but none of us really paid any attention.

I'll try to illustrate with some examples:

On one side of the fence, black hats make uber money and get off with slap on the wrists:

This is one fraudster who perpetrated a $10 million USD heist on a scale unprecedented in human history - 280 cities, 2,100 ATMs, all within 12 hours. His punishment? A 2-year suspended sentence.

Entire towns loaded with cyber criminals driving Mercs:
The cops themselves acknowledge:
  “You arrest two of them and 20 new ones take their place,” he said. “We are two police officers, and they are 2,000.”
Of course, it doesn't stop there. It's now being reported that fake AV companies can make more profit than legit ones.

If you don't want to move into fraud - no problem. There's a huge black market for vulnerabilities, databases, malware, botnets, pwn3d hosts, etc. You name it. Just leave the moral conundrum at home, do your work, enjoy the craft and don't ask questions about who pays for your warez.

On the other side we have conference after conference after conference, celebrating security researchers whose primary objective is to break all security that is created. It used to be that the idea of breaking stuff was to find ways to innovate and make things better. Somewhere along the way that all got lost. How many good conferences are there with interesting ideas about building and creating? I can think of only one, and it's largely unsung to the best of my knowledge (yet looking at the lineup of some of the speakers, you know there are some legends in attendance). Is it any wonder we are making no progress?

If you want to make money building, your options are to open your work to the public (Open Source) and be a pauper with some recognition, or to build a product, sell it and commoditise it (WAFs, firewalls, etc), whereupon it just becomes Yet Another Product, which creates its own issues. If you want to make money breaking, however, there's plenty to be made. Just look at Mozilla, ZDI, iDefense, and so on. They'll all pay you to find the holes.

But you know what - let's assume you dismiss all that and decide to build stuff just for the love of it. Really, what's the point? Take a look at the tragedy that is the NSW Privacy Commissioner's findings into Vodafone. They don't even really take action, even when it's proven that a company acted negligently. Economically, you can applaud Vodafone's actions. They took the cheapest, lamest, most pathetic way out (changing passwords every 24 hours). Forget VPNs, forget two factor authentication. They did it El Cheapo and the Privacy Commissioner said "yep, good enough." As security professionals this is an utter disgrace, and our own efforts as an industry are actively undermined by government.

Unless the incentives are reversed - unless companies are fined for insecure software and vulnerability researchers are actively rewarded for finding bugs using the taxes collected from vendors - the drive to innovate, improve and truly create will never really take hold. This would disincentivise firms from producing bug-ridden software, entice legitimate security research and spur more spending in areas where it is truly needed - better APIs, better education, better business practises and processes, etc.

But, until that day comes, you are economically better off breaking. That's just a fact. You will probably have more fun. You will make more money. Get more recognition if that's what you want and worst case scenario, if you find yourself lining up for unemployment, you know that you'll never go hungry. Ever. Unless an asteroid hits earth, destroys the Internet and sends us all hurtling back to the Dark Ages but if that happens we have bigger fish to fry.

- J.

EDIT: As a postscript to this, I remember when I used to work in Network Abuse, there was a story from one of my team mates who was chatting with one of the big time spammers at the time as they had infiltrated some of their private forums. My team mate asked the spammer over chat one time "aren't you afraid of going to jail?" The guy replied "I am 21 years old, I have $2 million in cash, in garbage bags, buried where no-one will find it. Even if I go to jail, I'll serve a minimum of two years in jail in a white collar resort. I'll then get out maybe in 6 months with good behaviour, move to Mexico and retire." This is stretching back a bit now, but the principle still applies.

His (the spammer's) point was that the laws were not harsh enough for the punishment to outweigh the profit; the crime was worth the time. It's the same story with modern-day fraudsters. If you get caught in a Western country, you'll do big time. But move to some country in the Balkans, Russia or Romania and, given the levels of corruption and organised crime, chances are you'll be fine.

Tuesday, February 1, 2011

Why IT must be run as a business

I recently read this post on Richard Bejtlich's blog (yes, I am a bit behind the times) and it really rubbed me the wrong way. I am probably misinterpreting the post, but the way I read it, Richard was pointing out that the article made some salient points. When I read those points, I found they came either from people squabbling over semantics or from IT nerds who had been promoted into management roles and somehow thought they were unique, beautiful snowflakes, different from or more important than any other business function.

How is IT any more "special" than, say, marketing, sales or finance? It isn't.

I want to believe that the aim of the article was to say that IT should seek to be a trusted advisor to the business and serve to meet those ends, but to me it read like IT should be able to dictate terms to the business and demand what it wants.

Now, I've worked in some places with IT departments that have been described as a "post-apocalyptic backwards IT environment". And those places were paradise compared to some of the hell holes I've seen since. The worst ones I have ever seen are those locked into the mindset that they can dictate how, what, why, when and where the business can do what it wants. They dictate what laptops staff can use, what applications they can run, and so on.

Now don't get me wrong, I understand why this is necessary to some extent: ensuring a standardised operating environment, maximising pricing benefits, maximising process efficiency and so forth. But seriously, if IT is going to be the linchpin of the business, then dictating terms is the worst thing you can do.

C-level executives are keen to reduce cost and focus on numbers, and not for the sole purpose of "looking good" to the board or shareholders. They know that by reducing their marginal cost of production, they are able to produce goods at a lower opportunity cost than their competitors. That means they can potentially put competitors out of business on pricing alone, and that is just one tactic they can use to crush the competition. So any CIO looking to gain efficiency is going to look at reducing the size, complexity and operational overhead of their IT infrastructure, applications and staffing wherever they can. I know I would.

I remember meeting a client ages ago who is quite well known for shrinking their IT department to an almost infinitesimally small size. At the time, I thought the concept was appalling in itself. Knowing what I know now, having gone through my degree, I'm convinced the guy is actually a visionary.

These goals are a primary driver behind the booming enterprise architecture industry, which seeks to bridge IT and business by optimising business processes through efficient, robust, scalable and re-usable architectures. Any CIO or IT manager worth his salt who is not seeking to optimise and consolidate, and who cannot rationalise the cost benefits of doing so, is doing his company a disservice. Every IT manager who fights to retain infrastructure in-house, even when it is more expensive to do so, is also harming the longevity of the company by forcing it to spend money on an area that isn't a core competency.

And for all the bitching in the article that IT isn't focusing on innovation, I will tell you this: for every dollar your company spends retaining and managing IT assets, that is one less dollar your company spends doing really cool, innovative, exciting stuff that is core to your brand. And for every dollar you spend maintaining something that isn't core to what you do, that's potentially a dollar your competitor is gaining on you.

Now, before I'm stoned to death by my infosec peers, I'd argue that our role is to acknowledge that progressive, forward thinkers are out there, and that stopping the move to cloud-based technologies (IaaS, SaaS, etc.), outsourcing and the like is comparable to workers throwing sabots into textile looms back in the 19th century for fear of losing their jobs to automation.

Are we as an industry really that unevolved and immature? Why can't we look at our methods for ensuring that information assets are adequately secured as part of the migration and managed appropriately? And where they simply can't be, why can't we apply mitigating controls as best we can and make sure the residual risks are understood by all, so there is no misunderstanding?

I understand that internal chargebacks are not popular, and I understand the argument about the chilling effect they can have, but simply put, this is good economics. It proves the point that the business is wasting money on a service it can get from a third-party provider for less (assuming, of course, that the business is comparing apples with apples and not, say, a fully redundant SAN with a USB hard drive from Dick Smith).

If IT really wants to talk innovation and do really cool, exciting stuff, look at how you can get rid of those crappy legacy applications in your environment that are unpatched and unmanaged. Look at sloppy, inefficient business processes and see how you can improve communication, consolidate storage and better facilitate excellent customer service. The cost savings are a secondary benefit and should be obvious in the face of such synergies.

As security folks we have the potential to be the glue in these discussions, looking at ways we can protect the business. We can ensure that developers build applications using robust methodologies and guides, and that they leverage sound APIs. We can ensure provisions are included in contracts to enforce minimum standards of security, and even influence the choice of vendor and/or pricing. We have a lot more to offer the business than we often realise, but it really does come down to the approach. In that respect, I think the article hit the mark.

Unfortunately, working in infosec is not glamorous, and we get saddled with doing our jobs using what we have rather than what we'd like to have. To me this is what it is really about: making better use of what we have and looking at how we can help the business rather than hinder it.

In the future, businesses are not going to have monolithic IT shops. The future is going to involve outsourcing on a scale that you or I can hardly conceive of today. Enterprises will have all their applications, infrastructure and development outsourced. Other core functions will become increasingly outsourced too (I've already seen this happening). This enables businesses to become more agile and better focus on their core competencies. IT will become more ubiquitous and pervasive than we can imagine today. Information will fly around in so many directions, across so many devices, that our very notions of privacy and security will be constantly redefined against a threat landscape that will beggar belief.

Our role in this world as security professionals will be to constantly adapt and redefine those notions on the basis of the information exchanges the business needs and, perhaps more importantly, the speed at which we can do it. The world is not perfect, and neither is information security in practice. But if we can help businesses make informed decisions about risk, then our work is not in vain.

- J.