Obamacare and Healthcare.gov: How We Got Here

December 2, 2013


Stephen Covey was fond of saying, “You teach what you are.” Regardless of what platitudes you speak, all you really teach is what you actually live and practice. It should not be surprising, then, that the same political and philosophical mindset that produced the Affordable Care Act and pushed it through Congress in spite of fierce opposition and public disapproval would then seek to implement it with a vast, complex IT system that reflects much of the same hubris and many of the same fundamental flaws.

Back in 2009, I downloaded and read (I didn’t say comprehended, merely said read) the entirety of HR 3200, which would ultimately evolve into the Affordable Care Act, aka Obamacare. I then wrote two posts analyzing it from a systems design perspective. I have gone back (and linked back) to these posts a few times in the past few months, because they not only contain many predictions that have come true over the past four years, they also anticipate the core technical problems in the Healthcare.gov implementation project. In other words, the Healthcare.gov project is struggling for many of the same reasons that Obamacare as a Federal program is struggling.

The first post talks about the overall incomprehensibility of the actual text of HR 3200, while the second post talks about the conceptual flaws in such a broad “big-bang” approach to healthcare reform. Below, I have combined both posts into one, with some minor editing, then appended an afterword.

HR 3200 [aka Obamacare] from a systems design perspective

[start of combined posts from 2009]

On the occasions where I have reviewed the actual text of major legislation, I have been struck by the parallels between legislation and software, particularly in terms of the pitfalls and issues with architecture, design, implementation, testing, and deployment. Some of the tradeoffs are even the same, such as trading off the risk of “analysis paralysis” (never moving beyond the research and analysis phase) against the risks of unintended consequences from rushing ill-formed software into production. Yet another similarity is that both software and legislation tend to leverage off of, interact with, call upon, extend, and/or replace existing software and legislation. Finally, the more complex a given system or piece of legislation is, the less likely that it will achieve the original intent.

But there are some critical differences that make legislation design both harder and higher-risk than systems design.

Software vs. Legislation

First, software is designed for a target or reference system; you can in theory predict or constrain its behavior, and its behavior is largely repeatable.

Legislation, by contrast, is executed by humans, with wide latitude for interpretation and implementation, as well as misunderstandings, disagreements on meaning, and on-the-fly modifications.

Second, software typically passes through several layers of independent (non-human) syntactic, semantic, and integration checking before deployment (though plenty of defects can and do slip through).

Legislation, by contrast, is written in a natural (human) language, with all its gaps, faults, and ambiguities, and with nothing to force error checking in syntax, semantics, and integration; there’s no way of “compiling“, “linking” and doing a test run of the legislation in a limited environment before it becomes the (largely irrevocable) law of the land.
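The point about automated checking can be sketched in code. The snippet below is a minimal illustration (the variable names are invented for the example, not drawn from any real system): even a dynamic language like Python rejects ill-formed source before a single line ever runs, a gate for which legislation has no analogue.

```python
# A minimal sketch of the checking software gets "for free": the
# toolchain rejects ill-formed input before it can ever take effect.
# (The "provision" strings below are invented for illustration.)

def check_source(source: str) -> bool:
    """Return True if `source` is syntactically valid Python."""
    try:
        # compile() parses without executing -- a pure syntax check.
        compile(source, "<draft>", "exec")
        return True
    except SyntaxError:
        return False

# A well-formed "provision" passes the syntax check...
assert check_source("benefit = premium * 0.8") is True
# ...while a garbled one is rejected before deployment.
assert check_source("benefit = premium * *") is False
```

Note that this is only the first gate: linking, integration, and runtime testing add further layers, each of which legislation also lacks.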

Third, because of the previous two factors, two or more software engineers can typically reach professional agreement on what a given section of source code will do; if they continue to disagree, there are standard tools and methods by which they can objectively demonstrate how the software will behave, either exactly or within general limits.

By contrast, and due to the corresponding factors with legislation, two or more people (legislators, executives, judges, and citizens) can interpret a given section of legislation quite differently, and each may well have a defensible position, due to the potentially wide latitude of and arena for interpretation.

Some Design Flaws of HR 3200

This all comes to mind as I have been reviewing HR 3200, aka the House bill on health care reform. While I am neither a legislator nor a lawyer (though I have worked closely with lawyers for a decade), I am a professional software architect/engineer, and a professional writer, who has worked in the IT field for 35 years. From that point of view, I believe HR 3200 will exhibit profound problems and unintended (or unclaimed) consequences if passed. Here are some of the reasons why.

To begin with, HR 3200 suffers from all the problems listed above with legislation. It is written in English, and complex, obscure, jargon-laden English at that. Many of the sections are imprecise and/or incomplete, leaving large amounts of interpretation and implementation to unelected humans. Many of the objections to HR 3200 come from this very problem, including the concern that the ambiguity is deliberate and intended to open doors to politically unpalatable consequences.

HR 3200 is also massive and very complex — over 1000 pages in printed form, with hundreds of sections. Its sheer length alone makes it difficult to understand and interpret, but (as indicated below) there are other factors that make overall comprehension nearly impossible. That length also makes after-the-fact revocation or even modification extremely difficult.

Much of HR 3200 makes piecemeal modifications to existing legislation, often with little explanation as to intent and consequences. So, for example:

SEC. 1148. DURABLE MEDICAL EQUIPMENT PROGRAM IMPROVEMENTS.

(c) Treatment of Current Accreditation Applications- Section 1834(a)(20)(F) of such Act (42 U.S.C. 1395m(a)(20)(F)) is amended–

(1) in clause (i)–

(A) by striking ‘clause (ii)’ and inserting ‘clauses (ii) and (iii)’;

(B) by striking ‘and’ at the end;

(2) by striking the period at the end of clause (ii)(II) and by inserting ‘; and’; and

(3) by adding at the end the following:

‘(iii) the requirement for accreditation described in clause (i) shall not apply for purposes of supplying diabetic testing supplies, canes, and crutches in the case of a pharmacy that is enrolled under section 1866(j) as a supplier of durable medical equipment, prosthetics, orthotics, and supplies.

Any supplier that has submitted an application for accreditation before August 1, 2009, shall be deemed as meeting applicable standards and accreditation requirement under this subparagraph until such time as the independent accreditation organization takes action on the supplier’s application.’

This happens repeatedly throughout HR 3200; in fact, one entire portion (Division A, Title IV) is labeled “AMENDMENTS TO INTERNAL REVENUE CODE OF 1986”. This makes it difficult — beyond the ambiguities of the language itself — to determine just what is being modified and what the potential implications are.

HR 3200 also suffers in places from what a software engineer would call “spaghetti coding”. In other words, a given section within HR 3200 (and there appear to be hundreds of them; numbers go from 100 up through 2531 and appear in numeric order, but there are many gaps along the way) will reference several other sections elsewhere in HR 3200, both above and below. Furthermore, it often requires careful reading, going back pages, to see whether a reference to a given section is to a section within HR 3200 itself or to a section in existing legislation (such as the Internal Revenue Code).
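The cross-referencing problem can be sketched as code. In the hypothetical model below (the section numbers and text are invented, not taken from HR 3200), each section defers to other sections, so understanding any one section forces a reader through an arbitrary chain of forward and backward jumps:

```python
# Hypothetical illustration of "spaghetti" cross-references.
# Each section's text defers to other sections; the numbers and
# wording here are invented, not drawn from HR 3200.

sections = {
    122:  "as defined in section 1401",
    1401: "subject to the limits in section 246",
    246:  "except as provided in section 1702",
    1702: "coverage means the benefits described herein",
}

def resolve(section: int, visited=None) -> list[int]:
    """Follow the chain of references starting at `section`."""
    visited = visited or []
    visited.append(section)
    text = sections[section]
    for other in sections:
        if f"section {other}" in text and other not in visited:
            return resolve(other, visited)
    return visited

# Reading "section 122" drags the reader through four sections,
# jumping both forward (1401, 1702) and backward (246).
assert resolve(122) == [122, 1401, 246, 1702]
```

In software, tooling (IDEs, cross-reference indexes, call graphs) tames this kind of structure; a reader of a printed bill has no such help.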

HR 3200 also comes across as similar to a “kitchen sink” application, that is, a single piece of legislation that attempts to do far too much. The table of contents for HR 3200 gives you a sense of all that it is attempting to do. All these divisions, titles, and subtitles could have been broken up into individual legislation.

Finally, HR 3200 embodies what is commonly known in software engineering as a “big bang” approach to systems development. In other words, HR 3200 attempts a massive and ill-understood (and/or ill-specified) modification to the nation’s health care system (roughly 1/6th of the economy) in one fell swoop. As such, it really represents the worst excesses of the waterfall development lifecycle, with deployment being hard or impossible to reverse.

Next, I’ll talk about some of the well-established maxims and heuristics of complex systems development, and how they apply to legislation in general and to HR 3200 in particular.

Gall

As far as I can tell, John Gall, in his out-of-print book Systemantics (1976), was the first to observe in print that

A complex system that works is found to have invariably evolved from a simple system that worked. (p. 80, 1978 paperback edition).

Immediately after, he observes that:

A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.

My co-blogger (over at ASIP) Bruce Henderson puts this another way:

Start out stupid, and work up from there.

There is large room for differing arguments here as to just where HR 3200 fits in, for several reasons.

First, HR 3200 isn’t “designed from scratch.” As noted above, many sections of HR 3200 are modifying various existing laws and regulations, such as the Internal Revenue Code, the Public Health Service Act, Employee Retirement Income Security Act, the Social Security Act, and the United States Code.

However, leveraging upon and modifying several existing systems is not the same as building a “simple system that works” and evolving it into a complex system that works. I can create a large, complex piece of software that calls upon and even modifies existing systems and libraries — but that doesn’t necessarily mean I’m evolving something from a “small, simple system that works”. This is especially true when I’m pulling together from several disjoint or unrelated systems (such as those listed above).

Second, legislation is more robust than software, for exactly the differences outlined in Part I, namely that legislation is executed by people rather than machines and operating systems. If I create an ill-formed piece of software, there’s a good chance it won’t even compile (or interpret); if it does, then it may run into linking or integration errors; and if it gets past those, it may crash, lock up, or behave bizarrely upon execution.

If, however, I create an ill-formed piece of legislation, it can be (and often is!) put into practice, with various humans either officially or unofficially working around the defects to make it “work”. Of course, that ‘deployment’ of the legislation may end up drifting or even veering sharply from the stated or actual intent of the legislation. (In a way, this is reminiscent of the early PL/I compilers that would, upon encountering a syntax error, make a best guess as to what you might have meant to write and compile that instead.)

Courts can shift this ‘deployment’ in both directions. They may “find” meaning or functionality in the law never contemplated or even explicitly disavowed by those who crafted and voted for the legislation, or they may prohibit some portion of explicit functionality due to conflicts with the Constitution, prior judicial rulings, or simply their own judgment.  As noted in Part I, judges don’t always agree with one another, either, so whether a given piece of legislation (or a subportion thereof) is upheld, modified, or rejected entirely depends upon which courts or individual judges end up reviewing it.

Third, there are serious and compelling arguments as to how well the current government health care programs (such as Medicare and the VA hospital system) work, not to mention the government systems modified and relied upon by HR 3200 (such as the IRS and Social Security). While you may argue with Gall’s maxims above, I know of no serious systems designer who will state that it is possible to build a large, complex system that works from complex systems that work poorly, if at all. The quality of your original and leveraged systems provides an upper bound on the quality of your final system. To believe otherwise is to succumb to wishful thinking.

Maier and Rechtin

In The Art of Systems Architecting by Mark W. Maier and Eberhardt Rechtin (2002), the authors take a cross-discipline approach to systems architecting, including talking specifically about social systems in Chapter 5. The following passage from that chapter is of particular relevance to the overall purpose of HR 3200 (all emphasis in the original):

The first insight, which might be called the four whos, asks four questions that need to be answered as a self-consistent set if the system is to succeed economically; namely, who benefits? who pays? who provides? and, as appropriate, who loses?

The political arguments raging over HR 3200 are exactly over those four questions. In fact, Maier and Rechtin themselves foresaw those arguments, since they go on to use health care as an example:

Example: serious debates over the nature of their public health services are underway in many countries, triggered in large part by the technological advances of the last few decades. These advances have made it possible for humanity to live longer and in better health, but the investments in those gains are considerable. The answer to the four whos are at the crux of the debate. Who benefits — everyone equally at all levels of health? Who pays — regardless of personal health or based on need and ability to pay? Who provides — and determines cost to the user? Who loses — anyone out of work or above some risk level, and who determines who loses?

The problem with HR 3200, and with the arguments put forth to date on its behalf, is that they have not systematically and credibly addressed those four questions. In fact, those arguing in support of HR 3200 and health care reform in general have often given contradictory answers to those four questions, undermining their own credibility, giving ammo to their opposition, and (justifiably) eroding public support for HR 3200.

Along those lines, the authors also note that in architecting social systems, you face not just the constraints of normal system design — risk, performance, schedule, and cost — but two more: facts and perceptions. They go on to say:

Social systems have generated a painful design heuristic: it’s not the facts, it’s the perception that counts. Some real-world examples: . . .

  • One of the reasons that health insurance is so expensive is that health care is perceived by employees as nearly “free” because almost all its costs are paid for either by the employee’s company or the government. The facts are that the costs are either passed on to the consumer, subtracted from wages and salary, taken as a business deduction against taxes, or all of the above. There is no free lunch.

Again, with great relevance to the current debate over HR 3200 and the whole approach of the House over health care reform, the authors state:

Like it or not, the architect must understand that perceptions can be just as real as facts, just as important in defining the system architecture, and just as critical in determining success. As one heuristic states: the phrase, ‘I hate it’, is direction. There have even been times when, in retrospect, perceptions were “truer” than facts which changed with observer, circumstance, technology, and better direction. . . . In the end, it is a matter of achieving a balance of perceived values. The architect’s task is to search out that area of common agreement that can result in a desirable, feasible system.

Maier and Rechtin end Chapter 5 with some heuristics they consider specific to social systems. Several are those already cited above, but here are a few additional ones (my comments are in brackets):

Success is the eye of the beholder [i.e., the US public] (not the architect [i.e., Congress]).

Don’t assume that the original statement of the problem [e.g., “45 million uninsured”] is necessarily the best, or even the right one. (Most customers would agree.)

In social systems, how you do something may be more important than what you do. (A sometimes bitter lesson for technologists [and Congress] to learn.)

It’s easier to change the technical elements of a social system than the human ones (enough said).

Maier and Rechtin have an entire appendix at the end of the book on heuristics for system-level architecting. Most of these are intended for software and hardware architecting; however, several have bearing for HR 3200 and the general effort for health care reform.

Plan to throw one away; you will anyway.

This comes from Fred Brooks’ classic work, The Mythical Man-Month, and appears to be highly relevant to what’s going on right now in Congress, where both conservative Democrats and Republicans are suggesting that the best approach right now would be to start over again.

In architecting a new [software] program all the serious mistakes are made in the first day.

As someone who has been dealing since 1995 with failed or troubled IT projects, I find that this is the maxim I keep coming back to. I think that the Obama Administration and the Democratic leadership in Congress badly miscalculated public support for rushing sweeping (and unexamined) health care reform into law given the profound economic problems facing the country (not to mention the massive Federal deficits).

Given a successful organization or system with valid criteria for success, there are some things it cannot do — or at least not do well. Don’t force it!

As noted above, HR 3200 is “kitchen sink” legislation, trying to accomplish a variety of changes that are not necessarily related or dependent. I suspect that Obama and Congress would have been far more successful with a series of small, focused bills that had clear goals and clear limits. The problem with HR 3200 is that by trying to cover so much ground, it merely increases the overall size of the opposition — people with objections to a specific portion of HR 3200 find themselves uniting (directly or indirectly) with those objecting to other portions of HR 3200. By recasting HR 3200 into smaller, well-defined chunks, the opposition to any given chunk becomes smaller as well, increasing that bill’s chances of passage.

Group elements that are strongly related to each other, separate elements that are unrelated.

The shorthand version of this in software design is “high cohesion within a module, loose coupling between modules”. This is another argument for breaking up health care reform into smaller, well-defined and clearly focused chunks.
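The cohesion/coupling heuristic can be illustrated with a hypothetical sketch (the class and method names below are invented to mirror the legislation analogy; they don't model any real system). A "kitchen sink" unit mixes unrelated concerns, so a change to any one concern risks breaking the others; cohesive units can each be reviewed, amended, or discarded independently:

```python
# Hypothetical sketch of "high cohesion, loose coupling".
# All names are invented for the legislation analogy.

class KitchenSinkReform:
    """Low cohesion: unrelated concerns bundled into one unit."""
    def set_insurance_rules(self): ...
    def amend_tax_code(self): ...
    def expand_medicaid(self): ...

class InsuranceRules:
    """High cohesion: one unit, one concern."""
    def set_insurance_rules(self): ...

class TaxAmendments:
    """High cohesion: one unit, one concern."""
    def amend_tax_code(self): ...

def apply_reforms(reforms: list) -> list[str]:
    """Loose coupling: interact only through a narrow interface,
    never by reaching into another module's internals."""
    return [type(r).__name__ for r in reforms]

# Each cohesive unit can be adopted -- or dropped -- on its own.
assert apply_reforms([InsuranceRules(), TaxAmendments()]) == [
    "InsuranceRules", "TaxAmendments"
]
```

In the legislative analogy, each cohesive unit corresponds to a small, focused bill; dropping one does not force a rewrite of the rest.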

If you don’t understand the existing system, you can’t be sure you’re re-architecting a better one.

And, I might add, if you don’t understand the proposed system, you can’t be sure it’s a better one. It is unclear whether most of the members of Congress who are pushing HR 3200 understand either the current US health care system or HR 3200 itself (and all its implications).

I could include many more maxims here, but you are better off getting Maier and Rechtin’s book and reading it for yourself.

[end of combined posts from 2009]

Afterword

Those of you who have read my posts on Obamacare will recognize many of the quotes and concepts above, because they apply as much to the website development effort as they do to the legislation behind it. It is not surprising, then, that the Obama Administration’s response to profound problems with both the ACA’s implementation and the Healthcare.gov systems is to push ahead, patching, waving hands, hiding actual performance, and/or outright lying about how the systems are doing, rather than taking a step back and figuring out what would actually work best for Americans and America.

All the way back in September, Jim Geraghty of National Review was identifying Obamacare trainwrecks (see here and here). There are many more such trainwrecks to come. What remains to be seen is whether the media will report them as such or merely dismiss them as the new normal. ..bruce w..

 

 



About the Author

Webster is Principal and Founder at Bruce F. Webster & Associates LLC. He works with organizations to help them with troubled or failed information technology (IT) projects. He has also worked in several dozen legal cases as a consultant and as a testifying expert, both in the United States and Japan. He can be reached at 720.895.1405 or at bwebster@bfwa.com, or you can follow him on Twitter as @bfwebster.
