Monday May 13, 2013

I Love Standards…There Are So Many Of Them

The title is not an original bon mot by me – it’s been said often, by others, and by many with more experience than I have in developing standards. It is with mixed emotions that I feel compelled to talk about a (generally good and certainly well-intentioned) standards organization: the US National Institute of Standards and Technology (NIST). I should state at the outset that I have a lot of respect for NIST. In the past, I have even urged a Congressional committee (House Science and Technology, if memory serves) to try to allocate more money to NIST for cybersecurity standards work. I’ve also met a number of people who work at NIST – some of whom have since left and brought their considerable talents to other government agencies. I ran into one of them recently and mentioned that I still wore a black armband years after he had left NIST, because he had done such great work there and I missed working with him. All that said, I’ve seen a few trends at NIST recently that are of concern.

When in Doubt, Hire a Consultant

I’ve talked in other blog entries about my concern that so much of NIST’s outwardly-visible work seems to be done not by NIST but by consultants. I’m not down on consultants for all purposes, mind you – what is having your tires rotated and your oil changed except “using a car consultant”? However, in the area of developing standards or policy guidance it is of concern, especially when, as has been the case recently, the number of consultants working on a NIST publication or draft document is greater than the number of NIST employees contributing to it. There are often business reasons to use consultants. But you cannot, should not, and must not “outsource” a core mission – or why are you doing it at all? This is true in spades for government agencies. Otherwise, there is an entire beltway’s worth of people just aching to tell you about a problem you didn’t know you had, propose a standard (or regulation) for it, write the standard/regulation, interpret it, and “certify” that Other Entities meet it. To use a song title, “Nice Work If You Can Get It.”* Some recent consultant-heavy efforts are all over the map, perhaps because there isn’t a NIST employee to say, “you say po-TAY-to, I say po-TAH-to, let’s call the whole thing off.”** (Or at least make sure the potato standard is Idaho russet – always a good choice.)

Another explanation – not intentionally sinister but definitely a possibility – is that consultants’ business models are often tied to repeat engagements.  A short, concise, narrowly-tailored and readily understandable standard isn’t going to generate as much business for them as a long, complex and “subject to interpretation – and five people will interpret this six different ways” – document. 

In short: I really don’t like reading a document like NISTIR 7622 (more on which below) where most of the people who developed it are consultants. NIST’s core mission is standards development: NIST needs to own their core mission and not farm it out.

Son of FISMA

I have no personal experience with the Federal Information Security Management Act of 2002 (FISMA) except the amount of complaining I hear about it second hand, which is considerable.  The gist of the complaints is that FISMA asks people to do a lot of stuff that looks earnestly security oriented, not all of which is equally important.

Why should we care? To quote myself (in an obnoxiously self-referential way): “time, money and (qualified security) people are always limited.” That is, the more security degenerates into a list of the 3000 things you Must Do To Appease the Audit Gods, the less real security we will have (really, who keeps track of 3000 Must Dos, much less does them? It sounds like a demented Girl Scout merit badge). And, in fact, the one thing you read about FISMA is that many government agencies aren’t actually compliant because they missed a bunch of FISMA checkboxes. Especially since knowledgeable resources (that is, good security people) are limited, it’s much better to do the important things well than to maintain the farce that you can check 3000 boxes, which certainly cannot all be equally important. (It’s not even clear how many of these requirements contribute to actual security as opposed to supporting the No Auditor Left Behind Act.)

If the scuttlebutt I hear is accurate, the only thing that could make FISMA worse is – you guessed it – adding more checkboxes. It is thus with considerable regret that I heard recently that NIST updated NIST Special Publication 800-53 (which NIST has produced as part of its statutory responsibilities under FISMA). The Revision 4 update included more requirements in the areas of supply chain risk management and software assurance and trustworthiness. Now why would I, a maven of assurance, object to this? Because (a) we already have actual standards around assurance, (b) having FISMA-specific requirements means that pretty much every piece of Commercial Off-the-Shelf (COTS) software will have to be designed and built to be FISMA compliant or COTS software/hardware vendors can’t sell into the Federal government, and (c) we don’t want a race by other governments to come up with competing standards, to the point where we’re checking not 3000 but 9000 or 12000 boxes and probably can’t come up with a single piece of COTS globally, let alone one that meets all 12000 requirements. (Another example is the set of supply chain/assurance requirements in the telecom sector in India, which include asking for details about country of origin and specific contractual terms that buyers anywhere in the supply chain are expected to use. An unintended result is that a vendor will need to disclose sensitive supply chain data (which itself may be a trade secret) and modify processes around global COTS to sell into one country.)

Some of the new NIST guidance is problematic for any COTS supplier. To provide one example, consider:

“The artifacts generated by these development activities (e.g., functional specifications, high-level/low-level designs, implementation representations [source code and hardware schematics], the results from static/dynamic testing and code analysis (emphasis mine)) can provide important evidence that the information systems (including the components that compose those systems) will be more reliable and trustworthy. Security evidence can also be generated from security testing conducted by independent, accredited, third-party assessment organizations (e.g., Common Criteria Testing Laboratories (emphasis mine), Cryptographic/Security Testing Laboratories, and other assessment activities by government and private sector organizations.)”

For a start, to the extent that components are COTS, such “static testing” is certainly not going to happen by a third party, nor will the results be provided to a customer. Once you allow random customers – especially governments – access to your source code or to static analysis results, you might as well gift wrap your code and send it to a country that engages in industrial espionage, because no vendor, having agreed to this for one government, will ever be able to say no to Nation States That Steal Stuff. (And static analysis results, to the extent some vulnerabilities are not fixed yet, just provide hackers a road map for how and where to break in.) Should vendors do static analysis themselves? Sure, and many do. It’s fair for customers to ask whether this is done, and how a supplier ensures that the worst stuff is fixed before the supplier ships product. But it is worth noting – again – that if these tools were easy to use and relatively error free, everyone would have been at a high level of tools usage maturity years ago. Using static analysis tools is like learning Classical Greek – very hard, indeed. (OK, Koine Greek isn’t too bad, but Homeric Greek or Linear B, fuhgeddaboutit.)
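To see why these tools are both useful and noisy, here is a deliberately toy “static analyzer” – a minimal sketch using Python’s standard `ast` module, not any vendor’s product – that flags calls to a crude deny-list. Note that it flags every such call, exploitable or not, which is exactly the kind of noise that makes real tools hard to use well:

```python
import ast

# Crude deny-list of function names a toy checker might flag.
DANGEROUS_CALLS = {"eval", "exec", "os.system"}

def flag_dangerous_calls(source: str):
    """Return (line_number, name) for each call to a deny-listed function."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            name = None
            if isinstance(node.func, ast.Name):            # e.g. eval(...)
                name = node.func.id
            elif (isinstance(node.func, ast.Attribute)
                  and isinstance(node.func.value, ast.Name)):  # e.g. os.system(...)
                name = f"{node.func.value.id}.{node.func.attr}"
            if name in DANGEROUS_CALLS:
                findings.append((node.lineno, name))
    return findings

sample = "import os\nos.system(cmd)\nresult = eval(expr)\n"
print(flag_dangerous_calls(sample))  # → [(2, 'os.system'), (3, 'eval')]
```

Even this ten-line checker illustrates the trade-off: it catches the classic suspects, but it cannot tell a genuinely dangerous `eval` from a harmless one – every finding still needs a skilled human to triage it.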

With reference to the Common Criteria (CC), the difficulty now is that vendors have a much harder time doing CC evaluations than in the past because of other forces narrowing CC evaluations into a small set of products that have Protection Profiles (PPs). The result has been – and will be for the foreseeable future – fewer evaluated products. The National Information Assurance Partnership (NIAP) – the US evaluation scheme – has ostensibly good reasons for its “narrowed/focused” CC directions. But it is more than a little ironic that the NIST 800-53 revision should mention CC evaluations as an assurance measure at a time when the pipeline of evaluated products is shrinking, in large part due to the directions taken by another government entity (NIAP). What is industry to make of this apparent contradiction? Besides corporate head scratching, that is.

There are other sections – many other sections – I could comment upon, but one sticks out as worthy of notice:

“Supply chain risk is part of the advanced persistent threat (APT).”

It’s bad enough that “supply chain risk” is such a vague term that it encompasses basically any and all risk of buying from a third party. (Including “buying a crummy product” which is not actually a supply chain-specific risk but a risk of buying any and all products.) Can bad guys try to corrupt the supply chain? Sure. Does that make any and all supply chain risks “part of APT?” Heck, no. We have enough hysteria about supply chain risk and APT without linking them together for Super-Hysteria. 

To sum up, I don’t disagree that customers in some cases – and for some, not all, applications – may wish for higher levels of assurance or have a heightened awareness of cyber-specific supply chain threats (e.g., counterfeiting and deliberate insertion of malware in code). However, incorporating supply chain provisions and assurance requirements into NIST 800-53 has the unintended effect of subjecting any and all COTS products sold to government agencies – which is all of them, as far as I know – to FISMA.

What if the state of Idaho decided that every piece of software had to attest to the fact that No Actual Moose were harmed during the production of this software and that any moose used in code production all had background checks? What if every other state enumerated specific assurance requirements and specific supply chain risk management practices? What if they conflict with each other, or with the NIST 800-53 requirements? I mean really, why are these specific requirements called out in NIST 800-53 at all? There really aren’t that many ways to build good software.  FISMA as interpreted by NIST 800-53 really, really shouldn’t roll its own.

IT Came from Outer Space – NISTIR 7622

I’ve already opined at length about how bad the NIST Interagency Report (NISTIR) 7622 is. I had 30 pages of comments on the first 80-page draft. The second draft only allowed comments of the Excel Spreadsheet form:  “Section A.b, change ‘must’ to ‘should,’ for the reason ‘because ‘must’ is impossible’” and so on. This format didn’t allow for wholesale comments such as “it’s unclear what problem this section is trying to solve and represents overreach, fuzzy definition and fuzzier thinking.”  NISTIR 7622 was and is so dreadful that an industry association signed a letter that said, in effect, NISTIR 7622 was not salvageable, couldn’t be edited to something that could work, and needed to be scrapped in toto.

I have used NISTIR 7622 multiple times as a negative example: most recently, to an audience of security practitioners as to why they need to be aware of what regulations are coming down the pike and speak up early and often.  I also used it in the context of a (humorous) paper I did at the recent RSA Conference with a colleague, the subject of which was described as “doubtless-well-intentioned legislation/regulation-that-has-seriously-unfortunate-yet-doubtless-unintended-consequences.”  That’s about as tactful as you can get.

Alas, Dracula does rise from the grave,*** because I thought I heard noises at a recent Department of Homeland Security event that NISTIR 7622 was going to move beyond “good advice” and morph into a special publication. (“Run for your lives, store up garlic and don’t go out after dark without a cross!”)  The current version of NISTIR 7622 – after two rounds of edits and heaven knows how many thousands of hours of scrutiny – is still unworkable, overscoped and completely unclear: you have a better chance of reading Linear B**** than understanding this document (and for those who don’t already know, Linear B is not merely “all Greek to me” – it’s actually all Greek to anybody).  Ergo, NISTIR 7622 needs to die the true death: the last thing anyone should do with it is make a special publication out of it. It’s doubling down on dreck. Make it stop. Now. Please.


The Cybersecurity Framework

The last section is, to be fair, not really about NIST per se. NIST has been tasked, by virtue of a recent White House Executive Order, with developing a framework for improving cybersecurity. As part of that tasking, NIST has published a Request For Information (RFI) seeking industry input on said framework. NIST has also scheduled several meetings to actively draw in thoughts and comments from those outside NIST. As a general rule, and NISTIR 7622 notwithstanding, NIST is very good at eliciting and incorporating feedback from a broad swath of stakeholders. It’s one of their strengths and one of the things I like about them. More importantly, I give major kudos to NIST and its Director Pat Gallagher for forcefully making the point that NIST would not interfere with IT design, development and manufacture, in the speech he gave when he kicked off NIST’s work on the Framework: “the Framework must be technology neutral and it must enable critical infrastructure sectors to benefit from a competitive [technology] market. (…) In other words, we will not be seeking to tell industry how to build your products or how to run your business.”

The RFI responses are posted publicly and are, well, all over the map.  What is concerning to me is the apparent desire of some respondents to have the government tell industry how to run their businesses. More specifically, how to build software, how to manage supply chain risk, and so forth. No, no, and no. (Maybe some of the respondents are consultants lobbying the government to require businesses to hire these consultants to comply with this or that mandate.)

For one thing, “security by design” concepts have already been working their way into development for a number of years: many companies are now staking their reputations on the security of their products and services. Market forces are working. Also, it’s a good time to remind people that more transparency is reasonable – for example, to enable purchasers to make better risk-based acquisition decisions – but when you buy COTS you don’t get to tell the provider how to build it. That’s called “custom code” or “custom development.” It’s just like my not getting to walk into <insert name of low-end clothing retailer here> and tell them that I expect my “standard off-the-shelf blue jeans” to be tailored, ex post facto, to me specifically, made of “organic, local and sustainable cotton” (leaving aside the fact that nobody grows cotton in Idaho), oh, and embroidered with not merely rhinestones but diamonds. The retailer’s response should be “pound sand/good luck with that.” It’s one thing to ask your vendor “tell me what you did to build security into this product” and “tell me how you help mitigate counterfeiting,” but something else for a non-manufacturing entity – the government – to dictate exactly how industry should build products and manage risk. Do we really want the government telling industry how to build products? Further, do we really want a US-specific set of requirements for how to build products for a global marketplace? What’s good for the (US) goose is good for the (European/Brazilian/Chinese/Russian/Indian/Korean/name your foreign country) gander.

An illustrative set of published responses to the NIST RFI – and my response to the response – follows:

1. “NIST should likewise recognize that Information Technology (IT) products and services play a critical role in addressing cybersecurity vulnerabilities, and their exclusion from the Framework will leave many critical issues unaddressed.”

Comment: COTS is general purpose software and not built for all threat environments. If I take my regular old longboard and attempt to surf Maverick’s on a 30 foot day and “eat it,” as I surely will, not merely because of my lack of preparation for 30-foot waves but because you need, as every surfer knows, a “rhino chaser” or “elephant gun” board for those conditions, is it the longboard shaper’s fault? Heck, no. No surfboard is designed for all surf conditions; neither is COTS designed for all threat environments. Are we going to insist on products designed for one-size-fits-all threat conditions? If so, we will all, collectively, “wipe out.” (Can’t surf small waves well on a rhino chaser. Can’t walk the board on one, either.)

Nobody agrees on what, precisely, constitutes critical infrastructure. Believe it or not, some governments appear to believe that social media should be part of critical national infrastructure. (Clearly, the World As We Know It will come to an end if I can’t post a picture of my dog Koa on Facebook.) And even if certain critical infrastructure functions – say, power generation – depend on COTS hardware and software, the surest way to weaken their security is to apply an inflexible and country-specific regulatory framework to that COTS hardware and software. We have an existing standard for the evaluation of COTS IT, it’s called the Common Criteria (see below): let’s use it rather than reinvent the digital wheel.  

2. “Software that is purchased or built by critical infrastructure operators should have a reasonable protective measures applied during the software development process.”

Comment: Thus introducing an entirely new and undefined term into the assurance lexicon: “protective measures.” I’ve worked in security – actually, the security of product development – for 20 years, and I have no idea what this means. Does it mean that every product should self-defend? I confess, I rather like the idea of applying the Marine Corps ethos – “every Marine a rifleman” – to commercial software: every product should understand when it is under attack, and every product should self-defend. It is a great concept, but we do not, as an industry, know how to do that – yet. Does “protective measures” mean “quality measures”? Does it mean “standard assurance measures”? Nobody knows. And any term that is this nebulous will be interpreted by every reader as Something Different.

3. “Ultimately, <Company X> believes that the public-private establishment of baseline security assurance standards for the ICT industry should cover all key components of the end-to-end lifecycle of ICT products, including R&D, product development, procurement, supply chain, pre-installation product evaluation, and trusted delivery/installation, and post-installation updates and servicing.”

Comment: I can already see the religious wars over tip-of-tree vs. waterfall vs. agile development methodologies. There is no single development methodology, and there is no single set of assurance practices that will work for every organization (for goodness’ sake, you can’t even find a single vulnerability analysis tool that works well against all code bases).

Too many in government and industry cannot express concerns or problem statements in simple, declarative sentences, if at all. They don’t, therefore, have any business attempting to standardize how all commercial products are built (what problem will this solve, exactly?). Also, if there is an argument for baseline assurance requirements, it certainly can’t be for everything – or are we arguing that “” is critical infrastructure and needs to be built to withstand hostile nation-state attacks that attempt to steal your brioche recipe, if not tips on how to get sugar to caramelize at altitude?

 4. “Application of this technique to the Common Criteria for Information Technology Security Evaluation revealed a number of defects in that standard.  The journal Information and Software Technology will soon publish an article describing our technique and some of the defects we found in the Common Criteria.”

Comment: Nobody ever claimed the Common Criteria was perfect. What it does have going for it is (a) it’s an ISO standard and (b) by virtue of the Common Criteria Recognition Arrangement (CCRA), evaluating once against the Common Criteria gains you recognition in 20-some other countries. Putting it differently, the quickest way to make security much, much worse is a Balkanization of assurance requirements. (Taking a horse and jumping through mauve, pink, and yellow hoops doesn’t make the horse any better, but it does enrich the hoop manufacturers, quite nicely.) In the security realm, doing the same thing four times doesn’t give you four times the security; it reduces security, as limited (skilled) resource goes to doing the same thing four different ways. If we want better security, improve the Common Criteria – and, by the way, major IT vendors and the Common Criteria national schemes – which come from each CCRA member country’s information assurance agency, like the NSA in the US – have been hard at work for the last few years applying their considerable security expertise and resources to do just that. Having state-by-state or country-by-country assurance requirements will make security worse – much, much worse.

 5. “…vendor adoption of industry standard security models.  In addition, we also believe that initiatives to motivate vendors to more uniformly adopt vulnerability and log data categorization, reporting and detection automation ecosystems will be a significant step in ensuring security tools can better detect, report and repair security vulnerabilities.”

Comment: There are so many flaws in this, one hardly knows where to start. There is an existing vulnerability “scoring” standard – namely, the Common Vulnerability Scoring System (CVSS)***** – though there are some challenges with it, such as the fact that the value of the data compromised should make a difference in the score: a “breach” of Aunt Gertrude’s Whiskey Sauce Recipe is not, ceteris paribus, as dire as a breach of Personally Identifiable Information (PII), if for no other reason than that a company can incur large fines for the latter, far exceeding Aunt Gertrude’s displeasure at the former. Even if she cuts you out of her will.
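For the curious, the CVSS v2 base score boils down to a short formula. The sketch below uses the metric values published in the CVSS v2 specification; notice that the inputs are purely technical (access vector, complexity, authentication, and confidentiality/integrity/availability impact) – the business value of the data compromised appears nowhere, which is precisely the limitation noted above:

```python
# CVSS v2 base score, using the metric values from the v2 specification.
AV = {"L": 0.395, "A": 0.646, "N": 1.0}    # Access Vector: Local/Adjacent/Network
AC = {"H": 0.35, "M": 0.61, "L": 0.71}     # Access Complexity: High/Medium/Low
AU = {"M": 0.45, "S": 0.56, "N": 0.704}    # Authentication: Multiple/Single/None
CIA = {"N": 0.0, "P": 0.275, "C": 0.66}    # C/I/A impact: None/Partial/Complete

def cvss2_base(av, ac, au, c, i, a):
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * AU[au]
    f = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f, 1)

# Remotely exploitable, no authentication, total compromise: the classic 10.0.
print(cvss2_base("N", "L", "N", "C", "C", "C"))  # → 10.0
# Same access vector, partial impact only.
print(cvss2_base("N", "L", "N", "P", "P", "P"))  # → 7.5
```

The whiskey-sauce server and the PII database, if they share the same vulnerability, get the same base score from this formula; CVSS relegates “what is the data worth?” to its separately computed environmental metrics, which most people never fill in.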

Also, there is work going on to standardize descriptions of product vulnerabilities (that is, the format and type). However, not all vendors release the exact same amount of information when they announce security vulnerabilities, and they should not be required to. Oracle believes that it is not necessary to release either exploit code or the exact type of vulnerability (e.g., buffer overflow, cross-site request forgery (CSRF), or the like), because this information does not help customers decide whether to apply a patch or not: it merely enables hackers to break into things faster. Standardize how you refer to particular advisory bulletin elements and make them machine readable? Sure. Insist on dictating business practices (e.g., how much information to release)? Heck, no. That’s between a vendor and its customer base. Lastly, security tools cannot, in general, “repair” security vulnerabilities – typically, only patch application can do that.
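To make the distinction concrete, here is a hypothetical machine-readable advisory entry (the field names and the CVE identifier are illustrative placeholders, not any published schema or real advisory). The point is what gets standardized – identifiers, scores, affected and fixed versions – versus what is deliberately absent: exploit code and the exact vulnerability class:

```python
import json

# Hypothetical advisory record: field names are illustrative, not a real schema.
advisory = {
    "id": "CVE-2013-0000",                 # placeholder identifier
    "cvss_v2_base_score": 7.5,             # standardized severity score
    "affected": {"product": "ExampleServer", "versions": ["1.0", "1.1"]},
    "fixed_in": "1.2",
    "remediation": "Apply the 1.2 patch set.",
    # Deliberately no "exploit" or "bug_class" fields.
}

# Round-trip through JSON: any consumer can parse the same fields the same way.
record = json.dumps(advisory, sort_keys=True)
parsed = json.loads(record)
print(parsed["fixed_in"])  # → 1.2
```

A format like this gives customers everything they need for a patch decision (what is affected, how bad it is, what to install) without handing attackers a road map.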

6. “All owners and operators of critical infrastructure face risk from the supply chain. Purchasing hardware and software potentially introduce security risk into the organization. Creation of a voluntary vendor certification program may help drive innovation and better security in the components that are essential to delivery of critical infrastructure services.”

Comment:  The insanity of the following comment astounds: “Purchasing hardware and software potentially introduce security risk into the organization.” News flash: all business involves “risk.” Not doing something is a risk. So, what else is new? Actually, attempting to build everything yourself also involves risk – not being able to find qualified people, the cost (and ability) to maintain a home-grown solution, and so forth. To quote myself again: “Only God created something from nothing: everyone else has a supply chain.”****** In short, everyone purchases something from outside their own organization. Making all purchases into A Supply Chain Risk as opposed to, say, a normal business risk is silly and counterproductive.  It also makes it far less likely that specific, targeted supply chain threats can be addressed at all if “buying something – anything – is a risk” is the threat definition.

At this point, I think I’ve said enough. Maybe too much. Again, I appreciate NIST as an organization, and, as I said above, the direction they have set for the Framework (not to $%*& with IT innovation) is really to their credit. I believe NIST needs to in-source more of their standards/policy development, because it is their core mission and because consultants have every incentive to create perpetual work for themselves (and none whatsoever to be precise and focused). NIST should adopt a less-is-more mantra vis-à-vis security. It is better to ask organizations to do a few critical things well than to ask them to do absolutely everything – with not enough resource (which is a collective industry problem and not one likely to be solved any time soon). Lastly, we need to remember that we are a proud nation of innovators. Governments generally don’t do well when they tell industry how to do their core mission – innovate – and, absent a truly compelling public policy argument for so doing, they shouldn’t try.

*”Nice Work If You Can Get It,” lyrics by Ira Gershwin, music by George Gershwin. Don’t you just love Gershwin?

** “Let’s Call The Whole Thing Off.” Another gem by George and Ira Gershwin.

*** Which reminds me – I really hate the expression “there are no silver bullets.” Of course there are silver bullets. How many vampires and werewolves do you see wandering around?

****Speaking of which, I just finished a fascinating if short read: The Man Who Deciphered Linear B: The Story of Michael Ventris.

*****CVSS is undergoing revision.

****** If you believe the account in Genesis, that is.



