The Root of The Problem

Summer in Idaho is treasured all the more since it is all too brief. We had a long, cold spring - my lilacs were two months behind those of friends and family on the east coast - and some flowers that normally do well here never did poke their colorful heads out of the ground.

My personal gardening forays have been mixed: some things I planted from seed never came up, and others only just bloomed in August, much to my delight. I am trying to create order from chaos - more specifically, I want a lovely oasis of flowers in a rock garden I have admittedly neglected for several years. Nature abhors a vacuum, and thus she made a successful flanking maneuver to colonize my flowerbeds with sagebrush and grasses. I am way beyond "yanking and weed killer" and have traded in my trowel for heavier equipment. You need a shovel and a strong back to pull up a sagebrush, and as for the grass, I've had to remove the top three inches of soil in many places and move a number of rocks to get at the root system snaking under them.

I never appreciated the expression "getting at the root of the problem" until I dealt with invasive sagebrush and "grass-zilla." I have no choice but to do it: if I do not eradicate the root system, I will continue to battle these opportunistic biological interlopers one new sprout at a time. It is the same with software: if you do not figure out the - pun intended - root cause of a security vulnerability, but merely patch the symptoms, you will later have to clean up, one by one, the rest of the buggy snippets choking your code.

I have had professional experiences that mirror my rock garden. That is, there are "interloping and invasive" ideas that take hold with unbelievable tenacity, to the point that it is hard to eradicate them. The sagebrush and grass of the cybersecurity world are what I can only call the (myth of the) evil vendor cabal (var. multae crappycodae) and supply chain risk management (malwarum hysteriensis). Both have taken hold of otherwise rational human beings just as the pods took over people's minds in Invasion of the Body Snatchers.

In the course of my work, I attend a lot of meetings, seminars and the like on software assurance. The good news is that in the last couple of years, most of the vendors who attend these events (think of the big names in software and hardware) are doing pretty much the same sort of mom and secure apple pie things in software development. The bar, I can say pretty confidently, has been raised. This does not mean industry is perfect, nor does it mean that industry is "done" improving security. I would add that all of us know that building better code is good business: good for customers and good for us. It's also important for critical infrastructure. We get it.

However, to go to some of these meetings, you wouldn't think anything had changed. I have recently been astonished at the statements of opinion - without any facts to back them up - about the state of software development and the motives of those of us who do it, and even more disturbed at what I can only describe as outright hostility to industry in particular and capitalism in general. I suspect at least part of the reason for the hostility is the self-selecting nature of some of these meetings. That is, for some assurance-focused groups, vendors attend meetings only sporadically (because it's more productive to spend time improving your product than talking about it). That leaves the audience dominated by consultants, academics and policy makers. Each group, in its own way, wants to improve the state of affairs, and yet each, in its own way, has a vested interest in convincing other stakeholders that they - and only they - can fix the problem. Many of them have never actually built software or hardware or worked in industry - and it shows. Theory often crumbles upon the altar of actual practice.

What I have heard some of these professional theorists say is not only breathtakingly ironic but often more than a little hypocritical: for example, a tenured academic complaining that industry is "not responsive to the market." (See my earlier blog, "The Supply Chain Problem," on fixing the often-execrable cybersecurity education in most university programs and the deafening silence I got in response from the universities I sent letters to.) If you are tenured, you do not have to respond to market forces: you can teach the same thing for thirty years whether or not it is what the market needs or wants and whether or not you are any good at it. (What was that again about being nonresponsive to market forces?)

I am also both amused and annoyed at the hordes of third party consultants all providing a Greek chorus of "you can't trust your suppliers - let us vet them for you." Their purpose in the drama of assurance seems to be the following:


  • Create fear, uncertainty and doubt (FUD) in the market - "evil, money-grubbing vendors can't be trusted; good, noble consultants are needed to validate security"

  • Draft standards - under contract to the government - that create new, expensive third party software and hardware validation schemes

  • Become the "validator" of software after your recommendations to the government - the ones you wrote for them - have been accepted

Could there possibly be a clearer definition of "conflict of interest" than the above? Now, I do not blame anyone for trying to create a market - isn't that what capitalism is? - but trying to create a market for your services by demonizing capitalism is hilariously ironic. One wants to know, "quis custodiet ipsos custodes?" (Who watches the watchers, otherwise known as, "why should I trust consultants who, after all, exist to sell more consulting services?")

The antibusiness rhetoric got so bad once that I took advantage of a keynote I was delivering to remark - because I am nothing if not direct - that, contrary to popular belief, there is no actual Evil Vendor Cabal wherein major software and hardware suppliers collude to determine how we can collectively:


  • build worse products

  • charge more for them and

  • put our customers at increased risk of cyberattack.


It doesn't happen. And furthermore, I added, the government likes and has benefited from buying commercial software for many applications since it is feature rich, maintained regularly, generally very configurable, and runs on a lot of operating systems. "How well," I added, "did it work when government tried to build all these systems from scratch?" The answer is, the government does not have the people or the money to do that: they never did. But the same consultants who are creating FUD about commercial software would be happy to build custom software for everything at 20 times the price, whether or not there is a reason to build custom software.

"You are all in business to make a profit!" one person stated accusingly, as if that were a bad thing. "Yes," I said, "and because we are in business to make a profit, it is very much in our interest to build robust, secure software, because it is enormously expensive for us to fix defects - especially security defects - after we ship software, and we'd much rather spend the resources on building new features we can charge for, instead of on old problems we have to fix in many, many places. Furthermore, we run our own businesses on our own software so if there is horrible security, we are the first 'customer' to suffer. And lastly, if you build buggy, crappy software that performs poorly and is expensive to maintain, you will lose customers to competitors, who love to point at your deficiencies if customers have not already found them."

The second and more disturbingly tenacious idea - and I put this in the category of grass, since it seemingly will take a lot of grubbing in the dirt to eradicate it - is what is being called "supply chain risk," this year's hot boy band, judging from the amount of screaming, fainting and hysteria that surrounds it. And yet, if "it" is such a big deal, why oh why can't the people writing papers, draft standards and proposed legislation around "it" describe exactly what they are worried about? I have read multiple pieces of legislation and now a draft NIST standard on "supply chain risk management," and still there is no clear articulation of "what are you worried about?"

I generally have a high degree of regard for the National Institute of Standards and Technology (NIST). In the past, I've even advocated to get them more money for specific projects that I thought would be a very good use of taxpayer money. I am therefore highly disturbed that a draft standard on supply chain risk management, a problem supposedly critical to our national interests, appears to be authored by contractors and not by NIST. Specifically, two out of three people who worked on the draft are consultants, not NIST employees. (Disclaimer: I know both of them professionally and I am not impugning them personally.) There is no way to know whether the NIST employee who is listed on the standard substantially contributed to the draft or merely managed a contract that "outsourced" development of it.

As I noted earlier, there is an inherent problem in having a "standard" drafted by third parties who stand to benefit directly if it is implemented. Human nature being what it is, the temptation to create future business for oneself is insidiously hard to resist. Moreover, it is exceedingly difficult to resist one's own myopias about how to solve a problem and, let's face it, if you are a consultant, every problem looks like the solution is "hire a consultant." It would be exactly the same thing if, say, the federal government asked Oracle to draft a request for proposal that required a ... database. Does anybody think we could possibly be objective? Even if we tried to be open-minded, the set of requirements we would come up with would look suspiciously like Oracle, because that's what we are most familiar with.

Some will argue that this is a draft standard and will go through revisions, so the provenance of the ideas shouldn't matter. However, NIST's core mission is developing standards. If they are not capable of drafting standards themselves, then they should either get the resources to do so or not do it at all. Put differently, if you can't perform a core mission, why are you in business? If I may be a bit cheeky, there is a lesson from Good Old Capitalism here: you cannot be in all market segments (otherwise known as "you can't be all things to all people"). It's better to do a few things well than to try to do everything and end up doing many things badly. I might add, any business that tried to be in too many market segments in which it had no actual expertise would fail - quickly - because the market imposes that discipline.

Back to the heart of the hysteria: what, precisely, is meant by "supply chain risk"? At the root of all the agitation there appear to be two concerns, both of which are reasonable and legitimate to some degree. They are:


  • Counterfeiting

  • Malware


Taking the easier one first, "counterfeiting" in this context means "purchasing a piece of hardware FOO or software BAR where the product is not a bona fide FOO or BAR but a knockoff." (Note: this is not the case of buying a "solid gold Rolex" on the street corner for $10 when you know very well this is not a real Rolex - not at that price.) From the acquirer's viewpoint, the concern is that a counterfeit component will not perform as advertised (i.e., might fail at a critical juncture), or won't be supported/repaired/warranted by the manufacturer (since it is a fake product). It could also include a suspicion that instead of GoodFoo you are getting EvilKnockoffFOO, which does something very different - and malicious - from what it's supposed to do. More on that later.

From the manufacturer's standpoint, counterfeiting cuts into your revenue stream since someone is "free riding" on your brand, your advertising, maybe even your technology, and you are not getting paid for your work. Counterfeits may also damage your brand (when FakeFOO croaks under pressure instead of performing like the real product). Counterfeiting is the non-controversial part of supply chain concerns in that pretty much everybody agrees you should get what you pay for, and if you buy BigVendor's product FOO, version 5, you want to know you are actually getting FOO, version 5 (and not fake FOO). Note: I say "non-controversial," but when you have government customers buying products off eBay (deeply discounted) who are shocked - shocked I tell you! - to discover that they have bought fakes, you do want to say, "do you buy F-22s off eBay? No? Then what makes you think you can buy mission critical hardware off eBay? Buy From An Authorized Distributor, fool!"

The second area of supply chain risk hysteria is malware: specifically, the concern that someone, somewhere will Put Something Bad in code (such as a kill switch that would render the software or hardware inoperable at a critical juncture). Though it is rarely articulated outright, the fear is typically that An Evil Foreigner - not a Good American Programmer - will Put Something Bad in Code. (Of course, other countries have precisely the same concern, only in their articulation it is evil Americans who will Put Something Bad in Code.) The "foreign boogeymen" problem is at the heart of the supply chain risk hysteria and has led to the overreach of proposed solutions for it. (For example, the NIST draft wanted acquirers to be notified of changes to personnel involving "maintenance." Does this mean that every time a company hires a new developer to work on old code - and let's face it, almost everybody who works in development for an established company touches old code at some point - it has to send a letter to Uncle Sam with the name of the employee? Can you say "intrusive?")

So here is my take on the reality of the "malware" part of supply chain. It's a long explanation, and I stole it from a paper I did on supply chain issues for a group of legislators. I offer these ideas as points of clarification that I fervently hope will frame this discussion, before someone, in a burst of public service, creates an entirely new, expensive, vague "construct" of policy remedies for an unbounded problem. Back to my gardening analogy: if eradicating the roots of a plant is important and necessary to kill off a biological interloper, it is also true that some plants will not grow in all climates and in all soil no matter what you do: I cannot grow plumeria (outdoors) in Idaho no matter how hard I try and no matter how much I love it. Similarly, some of the proposed "solutions" to supply chain risk are not going to thrive, because of a failure to understand what is reasonable and feasible and will "grow" and what absolutely will not. I'll go farther than that - some of the proposed remedies - and much of what is proposed in the draft NIST standard - should be dosed with weed killer.

Constraint 1: In the general case - and certainly for multi-purpose infrastructure and applications software and hardware - there are no COTS products without global development and manufacturing.

Discussion: The explosion in COTS software and hardware of the past 20 years has occurred precisely because companies are able to gain access to global talent by developing products around the world. For example, a development effort may include personnel on a single "virtual team" who work across the United States and in the United Kingdom and India. COTS suppliers also need access to global resources to support their global customers. For example, COTS suppliers often offer 7x24 support in which responsibility for addressing a critical customer service request migrates around the globe, from support center to support center (often referred to as a "follow the sun" model). Furthermore, the more effective and available (that is, 7x24 and global) support is, the more likely problems will be reported and resolved quickly, to the benefit of all customers. Even smaller firms that produce specialized COTS products (e.g., cryptographic or security software) may use global talent to produce them.

Hardware suppliers are typically no longer "soup to nuts" manufacturers. That is, a hardware supplier may use a global supply network in which components - sourced from multiple entities worldwide - are assembled by another entity. Software is loaded onto the finished hardware in yet another manufacturing step. Global manufacturing and assembly help hardware suppliers focus on producing the elements for which they can best add value and keep overall manufacturing and distribution costs low. We take it for granted that we can buy serviceable and powerful personal computers for under $1000, but it was not that long ago that the computing power in the average PC was out of reach for all but highly capitalized entities and special purpose applications. Global manufacturing and distribution make this possible.

In summary, many organizations that would have deployed custom software and hardware in the past have now "bet the farm" on the use of COTS products because they are cheaper, more feature rich, and more supportable than custom software and hardware. As a result, COTS products are being embedded in many systems - or used in many deployment scenarios - that they were not necessarily designed for. Supply chain risk is by no means the only risk of deploying commercial products in non-commercial threat environments.

Constraint 2: It is not possible to prevent someone from putting something in code that is undetectable and potentially malicious, no matter how much you tighten geographic parameters.

Discussion: One of the main expressions of concern over supply chain risk is the "malware boogeyman," most often associated with the fear that a malicious employee with authorized access to code will put a backdoor or malware in code that is eventually sold to a critical infrastructure provider (e.g., financial services, utilities) or a defense or intelligence agency. Such code, it is feared, could enable an adversary to alter (i.e., change) data or exfiltrate data (e.g., remove copies of data surreptitiously) or make use of a planted "kill switch" to prevent the software or hardware from functioning. Typically, the fear is expressed as "a foreigner" could do this. However, it is unclear precisely what "foreigner" means in this context:


  • There are many H1B visa holders (and green card holders) who work for companies located in the United States. Are these "foreigners?"

  • There are US citizens who live in countries other than the US and work on code there. Are these "foreigners?" That is, is the fear of code corruption based on geography or national origin of the developer?

  • There are developers who are naturalized US citizens (or dual passport holders). Are these "foreigners?"

(Ironically, naturalized citizens and H1B visa holders are arguably more "vetted" than native-born Americans.) It is unclear whether the concern is geographic locale, the national origin of a developer, or overall development practice and the consistency with which it is applied worldwide.

COTS software, particularly infrastructure software (operating systems, databases, middleware) or packaged applications (customer relationship management (CRM), enterprise resource planning (ERP)), typically has multiple millions of lines of code (e.g., the Oracle database has about 70 million lines of code). Also typically, commercial software is in a near-constant state of development: there is always a new version under development or old versions undergoing maintenance. While there are automated tools on the market that can scan source code for exploitable security defects (so-called static analysis tools), such tools find only a portion of exploitable defects, and these are typically of the "coding error" variety. They do not find most design defects, and they would be unlikely to find deliberately introduced backdoors or malware.

Given the size of COTS code bases, the fact that they are in a near-constant state of flux, and the limits of automated tools, there is no way to absolutely prevent the insertion of bad code that would have unintended consequences and would not be detectable. (As a proof point, a security expert in command and control systems once hid "bad code" within a specific 100 lines of code and challenged code reviewers to find it, telling them exactly which 100 lines to examine. They couldn't. In other words, even if you know where to look, malware can be and often is undetectable.)
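To make this concrete, here is a minimal, purely illustrative Python sketch - the names, the "support" role and the scenario are my own invention for this example, not drawn from any real product or from the challenge described above - of how deliberately bad code can hide in plain sight:

    import hashlib

    # Hypothetical user store: username -> SHA-256 hash of that user's password.
    USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}

    def check_password(user, password):
        """Return True only if the supplied password matches the stored hash."""
        stored = USERS.get(user)
        return stored is not None and hashlib.sha256(password.encode()).hexdigest() == stored

    def is_authorized(user, password, role):
        # The "or" below should be "and." To a reviewer skimming thousands of
        # similar lines it reads like an honest slip; to a static analysis tool
        # it is perfectly valid code. Yet it lets anyone who claims the
        # "support" role in without a correct password.
        return check_password(user, password) or role == "support"

    # Wrong password, but the planted "or" waves the caller through anyway.
    print(is_authorized("alice", "not the password", "support"))  # prints: True

Whether that "or" was malice or a Friday-afternoon mistake is exactly what a reviewer cannot determine by staring at the code: the flaw is indistinguishable from an ordinary bug.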

Lastly, we are sticking our collective heads in the sand if we think that no American would ever put something deliberately bad in code. Most of the biggest intelligence leaks of the past were perpetrated by cleared American citizens (e.g., Aldrich Ames, Robert Hanssen and the Walker spy ring). But there are other reasons people could Do Bad Things To Code, such as being underpaid and disgruntled about it (why not stick a backdoor in code and threaten to shut down systems unless someone gives you a pay raise?).

Constraint 3: Commercial assurance is not "high assurance" and the commercial marketplace will not support high assurance software.

Discussion: Note that there are existing, internationally recognized assurance measures, such as the Common Criteria (ISO-15408), that validate that software addresses the specific (stated) threats it was designed to counter. The Common Criteria supports a sliding scale of assurance (i.e., levels 1 through 7) with different levels of software development rigor required at each level: the higher the assurance level, the more development rigor is required to substantiate it. Most commercial software can be evaluated up to Evaluation Assurance Level (EAL) 4 (which, under the Common Criteria Recognition Arrangement (CCRA), is also accepted by other countries that subscribe to the Common Criteria). Few commercial entities ask for or require "high assurance" software, and few if any government customers ask for it, either.

What is achievable and commercially feasible is for a supplier to have reasonable controls on access to source code during the development cycle and to make reasonable use of commercial tools and processes that will find routine "bad code" (such as exploitable coding errors that lead to security vulnerabilities). Such a "raise the bar" exercise is likely to have a deterrent effect to the extent that it removes the plausible deniability of a malefactor inserting a common coding error that leads to a security exploit. Using automated vulnerability-finding tools, in addition to improving code hygiene, makes it harder for someone to deliberately insert a backdoor masquerading as a common coding error, because the tools find many such coding errors. Thus, a malefactor may, at least, have to work harder.

That said, and per Constraint 1, the COTS marketplace will not support significantly higher software assurance levels, such as manual code review of 70 million lines of code or extensive third party "validation" of large bodies of code beyond existing mechanisms (i.e., the Common Criteria), nor will it support a "custom code" development model in which all developers are US citizens, any more than the marketplace will support US-only components and US-only assembly in hardware manufacturing. This was, in fact, a conclusion reached by the Defense Science Board in its report on foreign influence on the supply chain of software. And in fact, supply chain risk is not about the citizenship of developers or their geographic locale but about the lifecycle of software, how it can be corrupted, and taking reasonable and commercially feasible precautions to prevent code corruption.

Constraint 4: Any supply chain assurance exercise - whether improved assurance or improved disclosure - must be done under the auspices of a single global standard, such as the Common Criteria.

Discussion: Assurance-focused supply chain concerns should be addressed through international assurance standards (specifically, the Common Criteria). Were someone to institute a separate, expensive, non-international "supply chain assurance certification," not only would software assurance not improve, it would likely get worse, because the same resources that companies today spend on improving their products would be spent on secondary or tertiary "certifications" that are expensive, inconsistent and non-leverageable. In the worst case, a firm might have to produce different products for different geographic locales, which would further divert resources (and weaken security). A new "regulatory regime" - particularly one that largely overlaps with an existing scheme - would be expensive and would "crowd out" better uses of time, people, and money. To the extent some supply chain issues are not already addressed in Common Criteria evaluations, the Common Criteria could be modified to address them, using an existing structure that already speaks to assurance in the international realm.

Even in cases of "supply chain disclosure," any such disclosure requirement needs to ensure that the value of the information - to purchasers - is greater than the cost to suppliers of providing it. To that end, disclosure should be standardized, not customized. Even a large vendor would not be able to complete per-customer or per-industry questionnaires on supply chain risk for each release of each product it produces. The cost of completing such "per-customer, per-industry" questionnaires would be considerable, and far more so for small, niche vendors or innovative start-ups.

For example, a draft questionnaire developed by the Department of Homeland Security asked how many "foreigners" worked on each development project, for each phase of development (requirements, design, code, and test). A large product may comprise hundreds of projects, and collating how many "foreigners" worked on each of them provides little value (and says nothing about the assurance of the software development process) while being extremely expensive to collect. (The question was dropped from the final document.)

Constraint 5: There is no defect-free or even security defect-free software.

Discussion: While better commercial software is achievable, perfect software is not. This is because of a combination of generally poor "security education" in universities (most developers are not taught even basic secure development practices and have to be retrained by the companies that hire them), imperfect development practices, imperfect testing practices, and the fact that new classes of vulnerabilities are being discovered (and exploited) as enemies become more sophisticated. Better security education, better development practices and better testing will improve COTS (and non-COTS) software but will not eliminate all defects or even all security vulnerabilities - people make mistakes, and it's not possible to catch all of those mistakes.

As noted elsewhere, manual code inspection is infeasible over large code bases and is error prone. Automated vulnerability-finding tools are the only scalable solution for large code bases (to automate "error finding"), but even the best commercially available tools find perhaps 50% of the security defects that result from coding errors, and very few security design errors (e.g., an automated tool can't "detect" that a developer neglected to include key security functionality, like encrypting passwords or requiring a password at all).
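As an illustration of that distinction - a hedged sketch in Python; the table, the function names and the claims in the comments about what scanners typically flag are assumptions for this example, not statements about any particular tool - the two kinds of defect can look like this:

    import sqlite3

    def find_user(conn, username):
        # Coding error: SQL built by string formatting. Many static analysis
        # tools recognize this pattern and will flag it as a SQL injection risk.
        query = "SELECT id, password FROM users WHERE name = '%s'" % username
        return conn.execute(query).fetchall()

    def create_user(conn, username, password):
        # Design error: the password is stored in plaintext. The code is
        # syntactically and semantically "clean," so a scanner has no way to
        # know that a hashing (or any protection) step was supposed to exist.
        conn.execute("INSERT INTO users (name, password) VALUES (?, ?)",
                     (username, password))

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, password TEXT)")
    create_user(conn, "alice", "plaintext-password")
    print(find_user(conn, "alice"))  # the password comes back in the clear

The first defect is mechanical and findable; the second requires knowing what the design should have been, which is why tools miss it.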

Lastly, no commercial software ships with "zero defects." Most organizations ship production software only after a phase-in period (so-called alpha and beta testing) in which a small, select group of production customers uses the software and provides feedback, and the vendor fixes the most critical defects. In other words, there is typically a "cut-off": less serious defects are not necessarily fixed before the product becomes generally available to all customers.

It is reasonable and achievable for a company to have enough rigor in its development practice to actively look for security defects (using commercial automated tools), triage them (e.g., by assigning a Common Vulnerability Scoring System (CVSS) score) and, for example, fix all issues above a particular severity. That said, it is a certainty that some defects will still be discovered after the product has shipped, and some of these will be security vulnerabilities.
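For what such triage might look like in practice, here is a small Python sketch; the findings, the scores and the 7.0 cut-off are invented for illustration and are not anyone's actual policy:

    # Hypothetical output of an automated scan: (finding, CVSS base score).
    findings = [
        ("buffer overflow in file parser", 9.8),
        ("SQL injection in admin console", 8.1),
        ("verbose error message leaks install path", 4.3),
        ("missing cache-control header", 2.6),
    ]

    SEVERITY_CUTOFF = 7.0  # illustrative "fix before shipping" threshold

    must_fix = [f for f in findings if f[1] >= SEVERITY_CUTOFF]
    deferred = [f for f in findings if f[1] < SEVERITY_CUTOFF]

    print("Fix before shipping:", must_fix)
    print("Track for a later patch set:", deferred)

The code is trivial; the discipline - a recognized scoring standard, a stated threshold, consistent application - is the part that constitutes rigor.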

There is a reasonableness test here we all understand. Commercial software is designed for commercial purposes and with commercial assurance levels. "Commercial software" is not necessarily military grade any more than a commercial vehicle - a Chevy Suburban, for example - is expected to perform like an M1 Abrams tank. Wanting commercial software to have been built (retroactively) using theoretically perfect but highly impractical development models (and by cleared US citizens in a secured facility, no less) might sound like Nirvana to a confluence of assurance agitators - but it is neither reasonable nor feasible and it is most emphatically not commercial software.

Book(s) of the Month

Strong Men Armed: The United States Marines vs. Japan by Robert Leckie

Robert Leckie was a Marine who served in WWII in the Pacific theater and was also a prolific writer, much of his work military history (another of his books, Helmet for My Pillow, was a basis for HBO's The Pacific). As much as I have read about the Pacific War - and I've read a lot - I continue to be inspired and humbled by the accounts of those who fought it and what they were up against: a fanatical, ideologically-inspired and persistent foe who would happily commit suicide if he were able to take out many of "the American enemy." The Marines were on the front lines of much of that war and indeed, so many battles were the Marines' to fight and win. What I liked about this book was that it did not merely recap which battles were fought when, where and by which Marine division led by what officer, but that it delves into the individuals in each battle. You learn why Joe Foss received the Congressional Medal of Honor, and for what (shooting down 23 Japanese planes over Guadalcanal), for example. History is made by warriors, and everyone - not just the US Marines - should know who our heroes are. (On a personal note, I was also thrilled to read, on page 271 of my edition, several paragraphs about the exploits of Lt. Col. Henry Buse, USMC, on New Britain. I later knew him as General Henry Buse, a family friend. Rest in peace, faithful warrior.)

I'm Staying with My Boys: The Heroic Life of Sgt. John Basilone, USMC by Jim Proser

One of many things to love about the US Marine Corps is that they know their heroes: any Marine knows who John Basilone is and why his name is held in honor. This book - unusually, told in the first person - is nonetheless not an autobiography but a biography of Sgt. "Manila" John Basilone, who received the Congressional Medal of Honor for his actions at Lunga Ridge on Guadalcanal. He could have sat out the rest of the war selling war bonds but elected to return to the front, where he was killed on the first day of the battle for Iwo Jima. In a world where mediocrity and the manufactured 15 minutes of fame are celebrated, this is what a real hero - and someone who is worthy of remembrance - looks like. He is reported to have said upon receiving the medal: "Only part of this medal belongs to me. Pieces of it belong to the boys who are still on Guadalcanal. It was rough as hell down there."

The citation for John Basilone's Congressional Medal of Honor:

" For extraordinary heroism and conspicuous gallantry in action against enemy Japanese forces, above and beyond the call of duty, while serving with the 1st Battalion, 7th Marines, 1st Marine Division in the Lunga Area. Guadalcanal, Solomon Islands, on 24 and 25 October 1942. While the enemy was hammering at the Marines' defensive positions, Sgt. Basilone, in charge of 2 sections of heavy machineguns, fought valiantly to check the savage and determined assault. In a fierce frontal attack with the Japanese blasting his guns with grenades and mortar fire, one of Sgt. Basilone's sections, with its gun crews, was put out of action, leaving only 2 men able to carry on. Moving an extra gun into position, he placed it in action, then, under continual fire, repaired another and personally manned it, gallantly holding his line until replacements arrived. A little later, with ammunition critically low and the supply lines cut off, Sgt. Basilone, at great risk of his life and in the face of continued enemy attack, battled his way through hostile lines with urgently needed shells for his gunners, thereby contributing in large measure to the virtual annihilation of a Japanese regiment. His great personal valor and courageous initiative were in keeping with the highest traditions of the U.S. Naval Service."

Other Links

More than you ever wanted to know about sagebrush:

Comments:

Nice blog. On another subject: FYI My sister, Jane Knoepp Bell, referred me to your blog. She and I recently traveled to Pittsburgh to see the graves of the former Knoepps. There is a Louis Knoepp (1845-1895) who is our great great uncle and your great great great uncle, who is buried next to a 30 foot monument there with his name on it. You can go to findagrave.com and search for a grave in Pittsburgh under the surname Knoepp and click on the name "Louis Knoepp" name to see it. I am friends with your father, Bruce,who is my first cousin through Louis G Knoepp's two children, Alma Knoepp and Louis F.Knoepp. Best, Louis

Posted by Louis F Knoepp, Jr on September 05, 2010 at 11:20 AM PDT #

Good stuff. The list was shorter than I expected, but it was based in solid facts not hype. If you like fitness or martial arts stuff, check out some of my articles. Thanks!

Posted by Nathanael Willims on September 13, 2010 at 02:40 PM PDT #

this is totally off topic. in doing my family tree, I came across your name (a couple times throughout it). I'm hoping this does not get posted, but rather, that you would email me back. Are you from hawaii or, is anyone in your lineage from there? All my mother's side is davidsons which is why I ask. Sincerely, Penny Uilani Carroll Kerns jaide2@verizon.net

Posted by Penny on October 02, 2010 at 01:38 PM PDT #

Re: The Root of The Problem Excellent posting Mary Ann. Very well articulated and quite intellectual ! Oh, what a calm/soothing rational mind addressing visceral fears and worries ! This was pure fresh air. Best Regards, Rajeev

Posted by Rajeev Prabhakar on November 12, 2010 at 11:28 PM PST #
