Monday, November 9, 2009

Mass Security Regulation Gets Tech Priorities Wrong

The final version of a sweeping new data security regulation in Massachusetts was published last week. Some parts look pretty good. But other parts look like they are straight out of 1999.

Let's start with a bit of history, for the benefit of the 99.9999% of the population that does not spend its time following obscure state-level data security regulations. The Massachusetts regulation, known as 201 CMR 17.00, was introduced a couple of years ago to address a spate of breaches of personally identifiable information. The business community balked but the regulation survived. Since then it has undergone numerous revisions to address concerns that it imposes an undue burden on businesses.

The regulation has some fairly standard and common-sense requirements on the policy and procedural side. But it is on the technical level that the latest - and supposedly final - version of the regulation sounds woefully out of date. Reading through the text gives an awkward time-warp feeling, like a newly published technical manual talking about dial-up modems and floppy disks.

That 90s feeling starts with the title of the technical section - "Computer System Requirements". Hmmm... what about all the iPhones and netbooks and whatnot floating around the enterprise? And more critically, while securing computers is important, isn't securing servers even more important? A more inclusive title like "IT Systems Requirements" would definitely have made more sense.

So what are these "computer system" requirements, anyhow? The only purely technical requirements in the regulation cover anti-virus software, operating system security patches, firewalls, and encryption. If you are having bad flashbacks to the CISSP exam you took a decade ago, that is probably not a coincidence. Those are all important issues, but are they really at the heart of most technical data breaches in 2009? What about secure configurations? What about securing web applications and secure code development?
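To make the omission concrete: none of the four blessed measures does anything about what is arguably the most common web application attack of the moment, SQL injection. Here is a minimal sketch in Python, using a hypothetical users table in an in-memory SQLite database, of the kind of application-level bug the regulation never mentions:

```python
import sqlite3

# Hypothetical schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, ssn TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', '000-00-0000')")

def lookup_unsafe(name):
    # Vulnerable: attacker-controlled input is concatenated into the SQL.
    # A "name" like x' OR '1'='1 returns every record in the table,
    # anti-virus definitions and firewalls notwithstanding.
    return conn.execute(
        "SELECT * FROM users WHERE name = '" + name + "'").fetchall()

def lookup_safe(name):
    # Parameterized query: input is passed as data, never as SQL.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(lookup_unsafe("x' OR '1'='1"))  # leaks the whole table
print(lookup_safe("x' OR '1'='1"))    # returns nothing
```

The fix costs nothing and requires no product purchase, which is exactly why a checklist built around products will never surface it.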

So it seems that the security-apparatchik mentality of anti-virus programs, patches, firewalls, and encryption is unfortunately alive and well in the legislative branch. And of course those measures might be the best way to secure a home computer. But they simply do not reflect the reality that most enterprise data breaches that are not a result of plain stupidity occur because of insecure configurations and applications. Up-to-date virus definitions are usually neither here nor there.

Most large companies already know this. They have an internal risk function in place that prevents them from overspending on anachronistic security measures, except when required to do so by outdated regulations like 201 CMR 17.00. But for small and medium-sized businesses – including small shops that manage millions of sensitive records – regulations like 201 CMR 17.00 will drive security spending priorities. These companies are inadvertently being misled into believing that securing their environment means buying an anti-virus program and turning on auto-update.

The truth is of course very different. Installing anti-virus software is easy, but actually locking down an environment is incredibly difficult for smaller companies. That is because it requires reconfiguring other applications that no one in the organization really understands. It requires fiddling around with Unix and database permissions and PHP users in systems that no one normally touches. At the end of the day, it is hard to secure systems you do not understand, and most smaller companies do not understand the systems they run internally.
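For a sense of what that fiddling looks like in practice, here is a minimal sketch of the most basic possible permissions audit: finding files that any local user - including a compromised PHP process - can overwrite. The path is a placeholder, and real lockdown work also covers database grants, service accounts, and application users:

```python
import os
import stat

def world_writable(root):
    """Walk a directory tree and flag files any local user can modify."""
    findings = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.stat(path).st_mode
            except OSError:
                continue  # broken symlink, permission denied, etc.
            if mode & stat.S_IWOTH:  # writable by "other"
                findings.append(path)
    return findings

if __name__ == "__main__":
    # "/var/www" is illustrative; point it at the application tree you own.
    for path in world_writable("/var/www"):
        print("world-writable:", path)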

The legislation does not even begin to allude to this. From an actual risk perspective, you are better off with an out-of-date anti-virus program and a really locked-down internal environment than the other way around. You are also (sometimes) better off running an unpatched operating system with few services running than a patched one with a gazillion plugins and other third-party components. Whoever wrote 201 CMR 17.00 cannot be expected to know this, which is why, when the law gets technical, it just regurgitates the same old security one-liners found in CISSP prep courses.
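The point about services is easy to check for yourself. A quick sketch (the port list is illustrative, not exhaustive) that inventories what is actually listening on a machine - a rough proxy for its attack surface, patched or not:

```python
import socket

# Common service ports; extend for your own environment.
PORTS = {21: "ftp", 22: "ssh", 23: "telnet", 25: "smtp", 80: "http",
         110: "pop3", 143: "imap", 443: "https", 3306: "mysql",
         5432: "postgresql", 8080: "http-alt"}

def listening_services(host="127.0.0.1", timeout=0.5):
    """Return the subset of PORTS accepting TCP connections on host.

    Every open port is a service that can be attacked, whether or not
    the operating system underneath it is fully patched.
    """
    open_ports = {}
    for port, name in PORTS.items():
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(timeout)
        try:
            if sock.connect_ex((host, port)) == 0:
                open_ports[port] = name
        finally:
            sock.close()
    return open_ports

if __name__ == "__main__":
    for port, name in sorted(listening_services().items()):
        print("port %d (%s) is listening" % (port, name))
```

Run it against a typical small-business server and the result usually says more about real exposure than the patch level does.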

Interestingly, even the weak and outdated technical requirements come with a get-out-of-complying-free clause. The technical requirements are all prefaced with the bizarre exemption “to the extent technically feasible”. As we speak, they are building some sort of black-hole machine I understand nothing about underneath the French-Swiss border. So of course turning on a firewall or encrypting some data is “technically feasible”. I am not a lawyer, but I cannot see how anyone could argue that any of the requirements listed in 201 CMR 17.00 are not technically feasible. What we have here is a very ill-defined exemption that will make it difficult for companies to understand what the regulation actually requires of them.

It is a shame that the poorly written technical portion of 201 CMR 17.00 detracts from what is otherwise a well-written regulation. The sections on policy, training, and contractual language are important and will prompt some companies to get their data security house in order. It is only when the regulation tries to get even vaguely technical that it falters. I do not know whether "final version" really means "final version" this time around. But if there is room for one more revision before the March 1st compliance deadline, a few words on secure configurations and applications would go a long way toward improving the regulation.