Tuesday, April 27, 2010

Application Security Underfunded

Imperva and WhiteHat just came out with a report on security spending and resource allocation (registration required). This report is a must-read for anyone who is in charge of security budgets.

The basic gist of the report is that application security is not getting its rightful share of the security spending pie. This is perhaps an unsurprising conclusion for a study sponsored by web application security vendors, but the real mystery is why the wider security industry is not talking more about this undeniable and perplexing spending imbalance. Simply put, most threats are web-based, but most security budgets are not. Why?

Here are a few reasons I can think of for the spending imbalance:
  1. Decision makers are unaware of the relative risks.
  2. Inertia.
  3. Legal and regulatory requirements overlook web app security.
  4. Perception that web application security cannot be solved by throwing money or resources at it.
All these factors feed into one another but there is one other factor at play that is internal to the security industry itself. By and large, the same security standards have traditionally been applied to an incredibly broad swath of companies. Rather than raising the standard for everyone, this approach has had the de facto effect of exempting certain companies from what they perceive to be irrelevant requirements. This in turn drags the entire market down to the lowest common denominator. By using the same hammer to hit all nails, the security industry has inadvertently generated a "security race to the bottom".

One Size Fits None


Some companies operate in highly regulated and highly sensitive environments where security is not up for debate. Let's call this the Fort Knox zone. In the Fort Knox zone, web application security is governed by detailed SLAs on remediating vulnerabilities and applying secure development processes. In this zone, the security of the web application is considered an inherent part of the finished product or service. Everybody thinks and breathes security. These are the big banks and the three-letter agencies, among others.

Then there's the Pragmatic zone, where security matters, but where business decisions are constantly being made to balance security against price, convenience, and functionality. Most businesses fit in the Pragmatic zone even though they might deal with sensitive data. Online health records are one example. For most people, the risk that a random hacker might find out their medical allergies pales in comparison to the risk that, in an emergency, a doctor might be unaware of those allergies. In the Pragmatic zone, security takes a back seat to functionality, but basic security remains highly desirable.

Finally there is the Whatever zone - a place where basically everything you use is at your own risk. This is the guy who runs a cool web service from his parents' basement that allows you to see when you and your buddies are both within stumbling distance of the same pub. In the Whatever zone, there is no guarantee - and often no mention - of security. It's not that security is trumped by other considerations. It simply was never really a consideration to begin with. And if you don't like it, don't use it.

The Failed Quest for the Esperanto of Security

Today's security industry speaks largely in the language of the Fort Knox zone. "Critical" and "severe" vulnerabilities are presented as something that must be fixed within as short a time as possible. But most businesses are actually graduates of the Whatever zone that are today in the Pragmatic zone. The shrill tone of vulnerability disclosures, coupled with their frequently monolithic approach, produces tone-deaf customers and businesses.

In other words, the real problem is not that there are so many insecure apps out there, but rather that as an industry we set a bar that is both unattainable and inappropriate for many applications. Consider the very recently published OWASP Top 10 web application security risks. Many companies and many security folks view this list as an all-or-nothing proposition (although OWASP makes clear that it isn't). There is no inherent reason that all web applications need to be immune to all these threats. It just takes too much effort with far too little return. And this isn't even counting the opportunity cost of fixing security vulnerabilities.

The specific metrics of the WhiteHat-Imperva report underscore why the absolute approach does not work. Take for example the 38% of respondents who believe that 20 hours of developer time are needed to fix a vulnerability. Regardless of whether this figure is perception or reality, there is no way that a small operation is going to budget 20 hours to fix a seemingly obscure vulnerability when that time could be used to fix a visible bug or build a new feature. The return for spending lots of extra money to truly lock down most apps is just not there - not in the customer recognition, not in improved regulatory compliance, and often not even in a reduction in damaging security incidents (or at least not in a way that is readily measurable for organizations with limited resources).

So it shouldn't really surprise us that vulnerabilities aren't getting fixed. In most companies if the website doesn’t actually work there is hell to pay. But if there is an unfixed vulnerability almost no one knows or cares.

The User Has Spoken (while logging in over http)

This user indifference runs deep. I never cease to be amazed by the number of early and even mid-stage start-ups that don't have login over https. From a security, or even a marketing, perspective secure login seems like a no-brainer – certificates are relatively easy to install, and it is one of the few security mechanisms – perhaps the only one – that almost every single end user is on the lookout for. So it is very telling that many start-ups do not consider secure login worth even the slight pain-in-the-ass that using certificates introduces.
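For a sense of how small that lift actually is, here is a minimal sketch (assuming a Python/Flask app with a hypothetical /login route – the names are illustrative, not drawn from any particular start-up) of pushing just the login page onto https once a certificate is installed on the server:

    # Minimal sketch: redirect any plain-http request for the login page to https.
    # Assumes a Flask app; the /login route is a hypothetical example.
    from flask import Flask, request, redirect

    app = Flask(__name__)

    @app.before_request
    def force_https_for_login():
        # Only bother with the login page - the rest of the site can stay on http
        # if the business decides that Pragmatic-zone trade-off is acceptable.
        if request.path == "/login" and not request.is_secure:
            return redirect(request.url.replace("http://", "https://", 1), code=301)

    @app.route("/login", methods=["GET", "POST"])
    def login():
        # Credential handling would go here; the point is that the form and the
        # POST now travel over TLS.
        return "login form"

That is roughly a dozen lines plus a certificate – and yet, for many of these companies, even that much is evidently not worth it.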

To security professionals this may seem jarring, but these start-ups know their business better than anyone else. They have figured out a well-known secret of today's Internet -

Much of the Internet is pretty much useless if you follow security rules.

OK, that’s a bit harsh. But conventional security wisdom does not jibe with having fun or even getting things done online. There are just too many things you miss out on online if you actually abide by all the security rules that the purists preach. (Here's a simple list for starters - storing passwords on your iPhone, storing passwords in your browser, giving up your passwords for application integration, simultaneously logging on to numerous applications, and the list goes on).

So as an end user, you basically have a choice – seriously handicap your use of the Internet, or take your chances with a half-hearted that's-what-everyone-does attempt to minimize risk (aka anti-virus). The vast majority of end-users have opted for the latter. Or put differently, end users are fundamentally happy with the Whatever zone of security.

The Low Bar from Home to Enterprise

Most companies start operations in much the same way - in the Whatever zone of security. They need to push something out fast and get to market with the bare minimum of features. And the barely-working mentality applies to security just as much as to anything else.

It's here that the seeds of the specific spending disparity described in the Imperva-WhiteHat study first come to light. Application security comes with real project risk costs. This is in stark contrast to network security – you can secure your network layer fairly easily without risking screwing up your app. Compare the pain of setting up a WAF with the relative ease of setting up a firewall. When a small company needs to choose how to answer the security checkbox that most customers will never look beyond, the choice is clear. And so the imbalance is born - Network security 1, Application Security 0.
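To make that WAF-versus-firewall asymmetry concrete, here is a toy sketch (hypothetical code, not taken from the report) of the kind of naive content inspection an application-layer filter has to do:

    # Toy sketch of a naive application-layer filter (a crude stand-in for
    # WAF-style inspection). The pattern is deliberately simplistic to show
    # how easily such rules produce false positives that break the app itself.
    import re

    SUSPICIOUS = re.compile(r"('|--|;|\bunion\b|\bselect\b)", re.IGNORECASE)

    class NaiveAppFilter:
        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            query = environ.get("QUERY_STRING", "")
            if SUSPICIOUS.search(query):
                # Blocks obvious SQL-injection probes... and also the customer
                # searching for "O'Brien" or "union membership".
                start_response("403 Forbidden", [("Content-Type", "text/plain")])
                return [b"Request blocked"]
            return self.app(environ, start_response)

A port-based firewall rule, by contrast, either lets traffic through or it doesn't – there is no legitimate feature for it to accidentally break, which is why the security checkbox gets answered at the network layer first.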

Of course start-ups start using the services of other start-ups, and before you know it you have a growing company within a relatively large enterprise ecosystem where everyone is using consumer-grade security without real threat analysis. It is at the transition to the enterprise level that, in theory, the threat analysis should mature and security measures should be fundamentally reassessed. By that point, though, the ship has gotten big and bulky and reversing course is no longer easy.

So often as companies go from the Whatever zone to the Pragmatic zone, they sweep app sec issues under the rug and hope (often correctly) that no one is going to notice or care. Today, too many enterprises are treating web application vulnerabilities as if they were still in the Whatever zone - and then if someone asks about security, they can proudly answer glad-you-asked, look-at-our-firewall. The details of the Imperva-WhiteHat report (and if you have made it this far in the post, you should really read the full report) reflect this - most security professionals report an internal culture that is either cavalier or helpless about web application vulnerabilities.

How is this going to change? If recent history is any guide, regulations and contracts will either break or reinforce the current security spending imbalance. The current trend is towards the latter. At the risk of sounding like a broken record, I'll mention again that even relatively recent pieces of legislation and standards (PCI, the Massachusetts data security regulation, and for that matter most RFPs) completely gloss over application security. For reasons that I don't fully understand, the PCI Prioritized Approach puts most network security issues ahead of application security issues. And now Washington State has adopted a PCI-based law. This certainly doesn't bode well for correcting the security spending imbalance any time soon.