The Supply Chain Problem
By user701213 on Apr 07, 2008
I recently participated in a Defense Science Board study that examined foreign influence over the supply chain of software. The study noted that, even as vendors need worldwide access to technological talent to enable them to create commercial software solutions benefiting the US Department of Defense, there is an increased risk that the supply chain of software may be compromised by adversaries, such as hostile nation states. Working on that task force brought supply chain issues front and center in my thinking for a number of months.
Supply chain security issues are on many people's minds these days. More and more regulations impact IT operations either directly or indirectly, such as the Sarbanes-Oxley Act, the Gramm-Leach-Bliley Act, the Health Insurance Portability and Accountability Act (HIPAA), various breach disclosure laws such as California SB1386, and information security laws like Minnesota's adoption of some of the payment card industry (PCI) standards. (And these are just the US laws.) Customers are being pressured to establish (from documentation to demonstration) that they are "more secure" and are in turn pressuring their supply chain - software vendors - to prove that the enterprise software they provide is secure. Vendors are being asked everything from "What features and functions do you have to help meet regulatory requirements?" to "How do you embed security within your software development lifecycle?" This is a good thing, and it is how markets are supposed to work.
In the vendor community, there is a low rumble of discontent about our supply chain's current lack of a "secure development lifecycle." I'm not talking about other software suppliers (for example, vendors who supply toolkits or components we embed) though at Oracle, we do vet these suppliers' security practices before we incorporate their technologies into our code.
What I mean by supply chain is the universities who supply CS graduates to IT vendors. There is no "secure development lifecycle" in the vast majority of universities' degree programs - that is, security is not "baked into" graduates of relevant programs (e.g., computer science) throughout their degree programs. And that is a problem, perhaps the problem plaguing the software industry. All the other security remediation taking place in the software supply chain (such as multiple security point solutions, vulnerability analysis services, and patch management offerings) largely stems from the fact that most software was neither designed nor built to be secure. And to that point, developers don't code software from the perspective of an attacker. Many believe security is a task for someone else ("it's behind the firewall so we don't have to worry!"); but their code is a target and will only be more of one in the future.
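A small hypothetical sketch of what "coding from the perspective of an attacker" means in practice (the function and directory names below are invented for illustration). A naive file-serving routine would simply open whatever path the user names; an attacker-minded one asks where a hostile input like `"../../etc/passwd"` would actually land:

```python
import os

def resolve_safe_path(base_dir, filename):
    """Resolve `filename` under `base_dir`, refusing requests that
    escape the base directory (e.g. via "../" path traversal)."""
    candidate = os.path.realpath(os.path.join(base_dir, filename))
    # A naive version would just open os.path.join(base_dir, filename);
    # a defensive one checks that the resolved path is still inside
    # the directory it was supposed to stay in.
    if candidate == base_dir or candidate.startswith(base_dir + os.sep):
        return candidate
    return None
```

The point is not this particular check but the habit of mind: every input is treated as potentially adversarial, not merely as potentially malformed.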
CS majors graduate from long, labor-intensive degree programs without, in most cases, knowing even first principles of secure coding and secure engineering practice. They are not stupid, but ignorant. They aren't being taught secure development practice because in many cases, their professors do not know it, or do not know the material well enough to teach it, or do not view it as a priority; I've heard a number of professors admit as much. Also, many professors are tenured and thus non-responsive to market forces. They don't have to change because they have the ultimate job security, which means that many can continue to teach Buggy Whip Design 101 instead of moving into the 21st century. I can say this because I spent the first 18 years of my life living in university towns: my dad was a department chair, associate dean, and then dean of the faculty. I think "tenured" was one of the first words I learned to spell.
Last year, I got fed up enough with Oracle having to train otherwise bright and capable CS grads in secure coding 101 that I sent letters to the top 10 or so universities we recruit from (my boss came up with the idea and someone on my team executed on it - teamwork is a wonderful thing). Specifically, we sent the letters to the chairmen of the department of computer science (or equivalent) and copied the deans of the schools with oversight of the CS departments. In the letter, we stated that Oracle expends significant resources training CS graduates in secure coding practices. We described the impact to us and to our customers of avoidable, preventable security defects, and why the insecurity of commercial software is a national security problem. We also pointed out that SANS has developed an assessment for secure coding practice. And we stated that in the future, Oracle would give preference in hiring to those universities that emphasize secure coding practices.
I am sorry to state that only one of those universities we wrote to responded to my letter (specifically, one department chairman responded), and the one that did (while stating that they did have courseware pertaining to security practice) wanted funding from Oracle to develop a more robust class. Having grown up at universities, I know very well that universities as a group tend to be really well endowed. In English, this means they have all the money in the world to do things like "teach better" - except that as a group, professors' fortunes rise or fall on getting money to Do More Research (quite often duplicating work that has already been done before, or done better).
While I appreciate the University of X's CS department chairman getting back to me (and the fact that they had at least some material on secure coding practice), I see no reason to pay them to do work they should be doing, anyway. In particular, paying a university to develop a class on secure coding that only they teach is not going to solve this problem. Nor - despite excellent intentions - are the NSA's Centers of Excellence in Information Assurance going to solve the problem.
We need a revolution - an upending of the way we think about security - and that means upsetting the supply chain of software developers. I suppose I am revolutionary-minded because I am finishing reading a book on the American Revolution (Liberty, by Thomas Fleming), but there is a point beyond which tinkering with existing structures of government is not enough. There is a principle at stake (like "taxation without representation is tyranny"). If the powers that be don't grasp the principle, the only choice is to "secede." Maybe the principle that I want universities to grasp is the one the Marine Corps has: "Every Marine is a rifleman." Every Marine can fight - they don't "outsource" rifle handling to others if they are attacked. (Imagine how different the IT space would be if every developer thought and coded defensively and every product could self-defend. I bet the average Marine gunny sergeant could whip universities into shape in about 16 weeks or less.)
Like some of the publications circulated by the Sons of Liberty in the buildup to the American Revolution, I found my "letter to universities" idea struck a responsive chord. A fellow vendor asked for a copy of the letter. Someone in a quasi-government organization (who was keenly interested in the assurance problem) wanted a copy of the letter to go back to universities to prove to them that their "customers" needed them to change. Two people armed with my letter is a start, but it's not enough to start a revolution.
Forthwith, I have taken the liberty (after expunging the name of the university to which it was originally addressed) of PDFing one of my letters to universities from last year, and publishing it on the Oracle web site at: http://www.oracle.com/security/docs/mary-ann-letter.pdf
In so doing, I consider this to be both an open letter to my fellow vendors, and an open letter to universities.
To the vendor community, just as customers are demanding more of us in security (and rightly so), we must demand more of our suppliers. It is inefficient and wasteful for each of us to train CS graduates in secure coding practice - yet Oracle and many other vendors expect secure coding practice as part of our development processes (and if we aren't doing that, then we need to do it). More to the point, the cultural transformation - that CS graduates are responsible for the security and safety of the code they write - must happen in universities. Take my letter, modify it as you will, and start holding university CS programs' feet to the fire to improve. To quote Ben Franklin after signing the Declaration of Independence: "We must all hang together, or most assuredly we will all hang separately."
Also, vendors, if you have secure coding class material, work with the organizations that are trying to fix the problem. SANS, for example, is working on material for faculty members to use in teaching secure coding practice (Oracle is participating in this). The Department of Homeland Security's Software Assurance Forum (next meeting in early May) has people working on a Common Body of Software Knowledge, as well as other training work. As I write this, I am working through the details of getting a tutorial Oracle developed on SQL injection prevention released to universities gratis. Those who have done it tell me that if you make secure coding courseware available, at least some CS professors will teach it.
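I can't reproduce the Oracle tutorial here, but its core lesson fits in a few lines. This stand-in uses Python's built-in sqlite3 module and an invented users table rather than anything Oracle-specific; the technique (bind variables instead of string concatenation) is the same everywhere:

```python
import sqlite3

# An invented, in-memory table for demonstration purposes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3kr1t')")

def find_user_vulnerable(name):
    # String concatenation: the classic injection hole. A payload like
    # "x' OR '1'='1" rewrites the WHERE clause and returns every row.
    query = "SELECT name FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Bind variables: the database treats `name` strictly as data,
    # never as SQL text, so the same payload matches nothing.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()
```

A developer who has internalized this once will reach for bind variables by reflex, which is exactly the kind of habit a curriculum should build.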
Vendors can also express their concerns to the Association for Computing Machinery (ACM), which helps define the model curricula against which CS degree programs are accredited. (Mahalo nui loa to Scott Charney of Microsoft, who did just that a couple of years ago and got a number of us in industry to sign the letter.) I note that the sooner we can get to a basic secure coding class everyone can use (phase 1), the harder it will be for ACM to refuse to change their curriculum recommendations, especially if enough vendors complain to them. Let's make it easy to say "yes" and hard to say "no."
To universities, I cannot but contrast the education of engineers with that of computer science majors. Engineers know that their work product must above all be safe, secure and reliable. They are trained to think this way (not to pawn off "safety" on "testers"), and their curricula build and reinforce the techniques and mindset of safe, secure and reliable products. (A civil engineer who ignores the principles of basic structures - a core course - in an upper level class is not going to graduate, and can't dismiss structures as a "legacy problem.")
Universities, you must start with a basic secure coding/secure development practice class that is required for all CS majors.* You must then revamp the fabric of every single class so that security becomes part and parcel of each class. If a student's "elegant technical solution" in an upper level class is hackable, the student shouldn't get a great grade: in fact, maybe hackable homework should be grounds for failure - kind of like a bridge design that would collapse under loading would get a failing grade in the Civil Engineering Department. I knew a professor at Stanford who routinely had his students "red team" and "blue team" each other's homework (and his class wasn't even a security class). I'd thank him if I could remember his name. Secure development practice needs to be embedded within the fabric of every class, not just in a single class that students file and forget.
Universities, think more broadly about the application of security to your classes. (I have learned more about this problem just since I sent the original letters.) For example, think about all the process engineers designing control systems for pharmaceutical companies, chemical plants, utilities, and more. Do you think that security is embedded within the fabric of each and every course that they take? No, it isn't. (True and scary story from a colleague about a guy who insisted that his PC - which had a control system interface on it - was not Internet accessible. Oh really, what is that instant messaging window doing open on your desktop?)
I also offer a personal anecdote about the difference between "taking a class" and immersing yourself in a language in support of my argument. Many readers (well, the 5 people who read my blog regularly, which includes my parents) know that I love the Hawaiian language. Something delightful happened when I moved beyond reading the Hawaiian language textbook and started making Hawaiian part of my daily life. I read the Bible in Hawaiian instead of English. I read Hawaiian-language books (like the story of Kamapua'a, the Hawaiian pig-god, and the Kumulipo - the Hawaiian creation chant). I sing along to Hawaiian songs. I found that once I moved beyond "conversational exercises" and immersed more of my life in the language, I started thinking in Hawaiian. (For example, I can form a sentence without stopping to think, "does that noun take an a-form possessive or an o-form possessive?"**) Immersion in a subject or language works because it changes the way you think. Single classes do not work - at least, they don't work if you want to develop fluency or change your mindset.
I am hopeful that working together, vendors and universities can help create a revolution from within, for the benefit of all.
If change is slow to happen, or there is resistance to change, vendors can also help create an impetus behind this effort by going to legislators - such as those who serve on the House of Representatives Science and Technology Committee - and asking them to consider tying research money (for example, funds dispensed through the National Science Foundation (NSF)) to computer science curricula reform. Perhaps universities' CS departments would have the time and motivation to fix their curricula if they weren't (and I am not making this up) wasting time and grant money on how to wave a cell phone in front of a professor's door to get access to the room. If all else fails, "money talks." The power of the purse can effect positive change (ask any kid whose allowance is withheld until he learns to clean up his messy room).
Since I am on a history kick anyway, I should point out that the US Federal Government has had a significant role in the development of the software industry. The government, especially the Defense Department, successfully used the "power of the purse" to rapidly develop the computer industry in its early stages, and can continue to use its positive influence to change the way universities develop curricula. So anybody who thinks that the entity handing out money (the government) shouldn't use that lever to help make us more secure (by insisting that the universities it funds fix a root cause of IT insecurity) needs a history refresher.
Universities are not evil, but they are generally not responsive to market forces, due to a) an endless source of research money often not tied to anything approaching pragmatic results and b) tenured faculty who face neither an impetus to change nor penalties if they do not. We as vendors should help them change through both the "carrot" of donating our time, expertise and support for changing the curricula - so that relevant degree programs build the "secure development lifecycle" into their graduates, just as we as vendors are expected to have one as suppliers - and the "stick" of using accreditation and funding (or funding cutoff) to help force needed change. When Great Britain refused to accede to the principle that "taxation without representation is tyranny," the colonies seceded. We did not get our independence from Great Britain by asking more nicely for it.
Our world is more technologically based than ever before. All customers rely on IT as infrastructure, and are being driven by regulation to insist on a "secure software supply chain." Producing secure software does indeed require a secure supply chain, not limited to but including university graduates whose computer-related degree programs have security principles and practices embedded within every element of their degree programs. Perhaps what I have said above is harsh, but I offer it as Tough Love. We simply - and collectively - must evolve to defensive mindsets delivering defensible code lest none of us survive in a hostile world.
"We must all hang together, or most assuredly we will all hang separately."
Disclaimer: Large portions of the above blog were originally written for an Oracle Magazine column I do regularly, "All Secure." The elegant journalistic term for "self-plagiarism" is "repurposing," and anyway, it's not plagiarism if you steal from yourself.
* I'd be remiss in not mentioning a few (among many) bright spots working on the supply chain problem at the university end: Gene Spafford at Purdue (always on anyone's bright spot list and has been for years), Samuel Redwine at James Madison University (who has labored long and mightily on a software security body of knowledge), and Neil Daswani at Stanford (who has published a book Foundations of Security: What Every Programmer Needs To Know available at http://tinyurl.com/33xs6g and who graciously sought me out to give me a copy). I am barely giving these fine gentlemen credit for a lot of hard work to improve university curricula in this area, and I know there are others who are also similarly engaged whom I have not credited. Thank you, all.
** If you really want to know, o-form possessives are used for things that are inalienable or are your birthright. Emotions, for example (like aloha - love), means of conveyance (like papa he'e nalu - surfboard), parents, gods, are all inalienable and thus take an o-form possessive: He makuahine maika'i ko'u. (I have a good mother.) Things that are alienable or that you acquire (spouse, children) take a-form possessives: He ipo 'olu'olu ka'u. (I have a nice sweetheart.) It was a big day in my life when I could start rattling off sentences without thinking about what kind of possessive to use.
For More Information:
Book of the Week:
Aircraft Carriers at War: A Personal Retrospective of Korea, Vietnam, and the Soviet Confrontation By Admiral James L. Holloway III, USN (Ret.)
ADM Holloway (disclaimer: a family friend, so I am justifiably prejudiced in his favor) has had an amazing career: an officer during WWII (present at the Battle of Surigao Strait, described in Last Stand of the Tin Can Sailors), he then qualified as a naval aviator, serving throughout the Korean and Vietnam Wars. He also served as Chief of Naval Operations. He is a fine leader, a fine person and a long-time contributor to naval history and thought. There has been so little written about the Cold War from a military perspective that this book is doubly welcome: written by a great leader and warrior who was there. (Hey, all the reviews are glowing - I am just gilding the lily.)
Another true hero has died: Jacob DeShazer, who was one of the Doolittle Raiders who "struck back" at Japan after Pearl Harbor by bombing Tokyo on April 18, 1942. (Japan subsequently decided to "finish" the Pacific fleet at Midway, where they lost the war.) DeShazer endured unbelievable hardships - torture and deprivation - as a POW of the Japanese but forgave his captors after becoming a Christian, and returned to Japan to serve as a missionary for 30-odd years. Rest in peace, faithful warrior.
The Defense Science Board Task Force Report on Mission Impact of Foreign Influence on DoD Software:
Web site for the House Science and Technology Committee (express yourself!):
The educational board of ACM (complain to them!) can be found at:
More on the Hawaiian language (including a-form and o-form possessives):
The SQL injection tutorial I mentioned (anyone can take it):
Last - but far from least - the SANS organization web site: