A forum to discuss ideas, approaches, standards, and architecture to establish and support open interoperability among healthcare IT systems.

Wednesday, August 17, 2011

The Road [Not] Taken

"...I took the one less traveled by,
And that has made all the difference."
Robert Frost
I had the good fortune yesterday to participate in a symposium hosted jointly by George Washington University and UT Health entitled The Role and Future of HIT in an Era of Health Care Transformation. The experience catalyzed me to stop resting on my laurels and offer a few comments after what has been too long a hiatus.

In a nutshell, the symposium (http://sphhs.gwumc.edu/abouttheschool/events/healthinformationtechnology) sought to convene a community from across the health care sector to consider the principal recommendations put forward in the PCAST report. That report, released last December, identified key challenges facing US health care and where HIT can and should be focused to achieve national objectives: improving our ability to manage care, improve outcomes, and engage citizens. (Ref: http://www.whitehouse.gov/administration/eop/ostp/pcast/docsreports ).

John Halamka has already done an admirable job summarizing what happened throughout the day, so there is no need to do so again (http://geekdoctor.blogspot.com/2011/08/role-and-future-of-hit-in-era-of-health.html ). What I thought merited attention was the approach taken to reach conclusions, as this particular activity chose the road not often taken.

All too often, solutions, policies, mandates, or otherwise narrowly conceived ideas are developed in a vacuum, "matured", and then unleashed onto a much broader constituency which is expected to understand, to embrace, and ultimately to support them. A few observations about decisions made in this way:

Visibility: We can only address those problems or challenges that we're aware of. Solution development happening among a narrow community naturally has less visibility into a problem space. As ideas are vetted and potential approaches considered, they are done in the context of the problem being solved. A significant challenge that healthcare faces, particularly at a multi-institutional or national scale, is that complexity grows, and ever more considerations surface that were never taken into account.

Buy-In: A number of years ago, Steven Wretling of Kaiser Permanente noted in a keynote session that "Culture eats technology for breakfast." I would assert that it isn't only technology that gets eaten. When ideas are put forward that have not had sufficient vetting (or where there hasn't been an opportunity for the affected community to participate in the vetting), there is invariably pushback. Building into any maturation process the means for the affected communities to engage with the decisions, to weigh in and have their thoughts considered, and to affect the outcome is key to the long-term viability and success of any initiative.

The Other Road

While it is not the easiest path (or some might argue the fastest path, but we'll come back to that), a consensus-based approach where input and pushback are actively sought as part of the vetting and validation process can be a much more effective means to an end. Let's explore that notion:

Defining Consensus. The first misconception about consensus is the perception that consensus means unanimity -- in other words, that 100% of participants agree on an outcome. While this is clearly a happy place when it can be achieved, it is not technically necessary for consensus. A consensus, simply put, means that an overwhelming majority can stand behind a recommendation or decision.

A simple majority may be enough to technically make a group decision, but it is generally not enough for that decision to be embraced. If 51% of a constituency agrees with something, then up to 49% does not, and without clear indicators, incentives, or penalties associated, that 49% has yet to be convinced.

Let's poke at the "simple majority" point above. When we don't have consensus, there is a natural tendency for those who were not believers either to 'wait and see' or to push back. The result is that activities (and ultimately progress) languish even after a "decision" is reached and deployed.

A consensus-based approach involves much more compromise (and arguably a lower target) as the elements of an approach or solution that can meet the needs of a much larger audience are identified. Sometimes that means leaving something out of scope. Sometimes it means compromising on a little purity. The enemy of the perfect is good enough.

Consensus cannot be about perfection. The question is whether enough common ground can be reached so that perhaps nobody is happy, but [almost] everyone can live with the outcome. The key is to ensure that whatever compromises are made do not adversely affect the intent of what was being sought -- is the result still fit-for-purpose?

The difference in this approach, however, lies in what follows a decision point. The value proposition has been thought through. The compromises and tradeoffs have already been made. Done well, the result is not only fit-for-purpose, but also broadly supported.

The PCAST Symposium

I applaud the organizers, hosts, and participants of the GWU/UT symposium for having taken the other road. The PCAST report has been a "disruptive" force affecting HIT, and disruption can be either negative or positive depending upon what follows that event. This event included a broad spectrum of diverse stakeholders representing government, payers, providers, public health, consumer advocates, industry, academia, and others.

The event was about embracing the core message that PCAST identified: that opportunities abound to more effectively leverage HIT to achieve national priorities, and that we must step up and address those challenges. What had been lacking, however, was a stage on which to convene multiple diverse points-of-view and interests onto common ground: to converge the community, and to identify a plan that a diverse industry could support.

While a full consensus cannot be achieved in one day and with a limited audience, what I believe was accomplished was the first step down that path. By bringing together a thoughtful and diverse community to begin collaborating, and by capturing "nuggets" in the form of suggestions that are helpful, useful, and actionable, the symposium may have sown the seeds of common ground from which a much broader consensus can grow.

Reaching that common ground takes longer than does a 51% decision. For that reason, it is often the road less traveled. In this case, however, I believe it will make all the difference.

Sunday, May 04, 2008

Interoperability in the Real World....

Over the past year, I have come to understand and appreciate the value, the importance, and the necessity of tooling in achieving health interoperability objectives. On the one hand, many would suggest that this is obvious, for we cannot build systems and deploy solutions without tools. On the other hand, I would suggest that this is profound. Here's why.

The Real World.
When I speak to folks who are working in the trenches -- the ones who have production systems in use and deployed, and who are struggling with what new parts to buy, how to choose, and how to make those bits work with what they already have -- it becomes clear that it is all about the tooling. We can discuss the value of standards, which most would agree are good things. We identify target technologies that represent the future and hold tremendous promise. We even talk about integration strategies -- the ones that will minimize impacts to the organization while maximizing value and ROI.

I am coming to realize, though, that the above isn't the real world. [Just to be clear, I'm not denigrating any of the above, and will get into the case for these things momentarily.] What I mean is that those of us who need to make IT work in the real world can only do so with pieces that exist. Architecture is important. Strategic direction is important. Standards are important. Having a migration plan is important. Sequencing is important. Ultimately, a whole lot of things are important, but if you cannot touch it, feel it, use it, and deploy it, it isn't real.

Standards in the Real World
Standards that go unused are un-useful. Those of you who know me know that I am passionate about standards and their incredible importance in helping Health IT achieve its promise to healthcare organizations the world over. That said, one of the core tenets behind good and useful standards work is the drive to have those standards implemented.

There has been a lot of really good standards work emerging over the past few years in the Health IT space. One of the challenges that I believe the industry as a whole is facing is adoption. The vendors with whom I speak follow standards very closely, but ultimately they must make hard business decisions in terms of where to invest their product development budget, and standards are not always at the top of the list. Why not? Us!

As consumers, the responsibility is ours to demand of our suppliers those things that we want and care about. If we say standards are important but continue to make purchasing decisions absent any consideration of which and what standards are supported, the fault is our own. Vendors will provide what the marketplace demands. How does this affect the users, the integrators, and the folks in the trenches? Here's how.

When we pick a new, sexy product that has the bells-and-whistles but not the core architecture, the standards, the interoperability "baked in", the onus falls on the poor systems administrator to make the things fit. The result is internal teams that are asked to perform heroic "one-offs" to glue two things together that were never intended to fit. The integration is brittle, inconsistent, and expensive. Matters are even worse when you consider maintenance releases.

So what about the vendors? Why not just make those investments themselves as a competitive advantage? Well, two reasons surface very quickly. First, if purchasers are not basing their decisions upon these factors, then it is pretty reasonable to assume that the competitive advantage is not there. Second, infrastructure is expensive. VERY expensive. To get these bits right takes a huge investment of time, cost, and resources. There are definitely paybacks (such as improved time-to-market and improved ability to fit into a customer environment), but making the business case for those investments can be very problematic.

Moreover, infrastructure has a sustainment cost, particularly as core technologies continue to evolve and as new players enter the market. To many vendors, the business case simply isn't there (or hasn't been effectively made) to invest in the newer standards.

Enter Open Source Software
After a tremendous amount of mentoring and hand-holding over the past year or two (special thanks to Skip McGaughey for getting me straight), I have come to understand that open source software can play an instrumental role in filling this void. How? Open source software provides a platform and a cooperative model that allow this expensive infrastructure to be built via co-investment, where all the players with the need share the burden (and the spoils). The end result is an infrastructure that can support the standards the community cares about, source code that vendors can use (for free!), and most importantly, an engagement model allowing allies and competitors to collaborate on the common bits on which they are not competing.

For example, through open source, HL7 tooling is already available (with more coming soon) that can be picked up and used with no IP [intellectual property] encumbrances. It can be used for free. It can be extended for free. It can be made for-profit without patent violation. Vendors can use it. Hospital systems can use it.

When most folks think "open source", software like Linux comes to mind, or licenses such as the GNU GPL, which are not commercially friendly. The key to success in our industry is to marry the commercial and free worlds, allowing vendors to share investments but also allowing them to make a living. Caveat emptor: the code is there to use, however it adds value.

Why do we care? As consumers, we want these standards. We need our systems to interoperate. What better way to get vendors on board than to contribute to a core that the vendors can then use and contribute to? Contribution doesn't necessarily mean money. It may mean time -- the time of the poor guy or gal who is gluing systems together today -- but allowing that investment to solve multiple challenges instead of just the problem-du-jour.

So, what do we do?
My recommendation is that there are a few steps that organizations should consider as they embark upon major HIT initiatives:

1) Create a roadmap. You can't get to your target objective without having a sense for where that is. While the 3-5 year vision is not attainable today, there needs to be a path to get there. That path must be paved with tools and technologies that exist, particularly for those objectives you hope to achieve in the next 18 months. Either you're using something that is out there or you're building it yourself.

2) Change your purchasing behavior. If there are no ties between your purse-strings and your plan, you have no hope of incenting the marketplace to meet your needs. While no one organization can do this on its own, collaborating within the community to drive change is viable and achievable.

3) Engage in Standards. It is really easy to take a back seat and bash the standards for not being useful, and not meeting your needs. It is far harder to be part of the solution. The hidden secret is that the way to influence standards is to show up, and a few committed people can move mountains, particularly when they are trying to achieve a real, tangible objective.

4) Consider Open Source. Open source software is not the be-all, end-all, but there are organizations (such as Open Health Tools) that have the structure to allow for community engagement and vendor participation. These efforts will create shared health IT assets that are real, that can be used, that can be implemented, and that can be deployed.

Until next time...

- Ken

Thursday, October 18, 2007

Being Conformant...

Over the past several months, I have come to a much deeper understanding around issues of conformance. Simply put, conformance is the ability to make an assertion as to some functional capability which can be rigorously and authoritatively tested and proven. Conformance is a term often bandied about when talking about standards, but I believe that it is subject to widespread variation when it comes to interpretation. Perhaps our recent journeys in the areas of conformance can shed a little bit of insight.

Asserting Conformance

At a base level, it seems desirable (or even advantageous) to use standards to be able to say 'we use standard X', therefore anytime we need to interface we will be relying upon X. This sounds reasonable. Why, then, doesn't it work?

Most standards define qualities, attributes, or conditions that constitute their conformance criteria. The intention behind these criteria is to provide the foundation against which the standard may be tested, ultimately determining the compliance of any given implementation in adequately addressing the standard in question. Product vendors and organizations alike then make conformance assertions, effectively public statements declaring which standards and which versions are being supported.

This means that the standard is real, is being used, and has been implemented. The implementation has considered those qualities expressed in the conformance criteria and can adequately and sufficiently address all of the expressed criteria appropriately. Further, if the assertions are accurate, the implementation can be subjected to formal testing (either internally or by a third-party) and the implementation verified to conform. How then can all of these steps still result in non-interoperable systems, especially when the standards are being specifically developed to solve interoperability challenges?

Defining Conformance

One of the reasons it doesn't work is the very reason IHE (Integrating the Healthcare Enterprise) exists. IHE has identified, rightly so, that many of the standards in place today lack the rigor necessary to allow for strong interoperability and conformance. In other words, it is all too easy for two different products or organizations to both be using the same version of the same standard and yet not be able to interoperate. This "wiggle room" is a manifestation of ambiguity within the standard, and creates inherent challenges for interoperability.

There are many potential shortcomings that get highlighted when specifications are examined closely. In some cases, underspecification is the problem (where multiple ways to do the same thing may exist). In other cases, standards fall short by not enumerating how the standard applies to a particular situation or use case, rendering it ambiguous at best or unusable at worst.

So, through means such as the profiling that IHE does, don't we have the tools to pin down the ambiguous portions, achieve plug-and-play interoperability, and meet our objective of strong conformance? Not exactly.

While IHE does an admirable job of nailing down ambiguity and creating profiles to support designated use cases, its interoperability and ultimately conformance assertions are based upon fitness to support the specified use case. When asserting conformance to an IHE profile, one can have a reasonable expectation of interoperability for the use cases supported by that profile. The real challenge lies where your business need deviates from the IHE use case. Conformance outside of those use cases is no more rigorous than the base standard.

So, does IHE have it wrong? Not at all. IHE is performing a very valuable service within the industry, and is raising the interoperability bar. That said, IHE alone isn't the entire story.

Engineering Conformance

After working with this issue for several months, I have gained a new appreciation for the complexities of conformance. First off, we must consider conformance along a number of axes: technical, functional, and semantic/informational. [Actually, one can even make a strong case for a business context as well, and we'll touch on this gently in the sections below.]

Technical-level conformance is perhaps the easiest to understand. It is usually tied to a physical or software infrastructure, such as a hardware platform or a software development or operating environment. Technical interoperability is not healthcare's biggest challenge. Far bigger issues lie ahead.

HSSP has based functional conformance around the behavior expected and performed in an interoperability setting. Functional conformance is about two interacting things doing what we expect of them. In defining functional conformance, we are expressing formally and rigorously how a system/component/service will behave. This includes its inputs, its outputs, expected behaviors, and so on. Behaviors that are not specified are inherently not supported. The converse, however, is not true, and that is where the power lies.
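To make this concrete, here is a minimal sketch of how a functional conformance contract might be expressed as a Python interface. The service and operation names (RecordLocatorService, locate, retrieve) are hypothetical, not drawn from HSSP's actual specifications; the point is that the declared operations, inputs, outputs, and expected behaviors form the testable surface, and anything not declared is simply outside the conformance claim.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List


@dataclass
class PatientRecord:
    """Hypothetical record payload; a real profile would constrain this."""
    patient_id: str
    content: dict


class RecordLocatorService(ABC):
    """Hypothetical service contract. Only the operations declared here are
    part of the functional conformance claim: inputs, outputs, and expected
    behavior are specified; anything undeclared is unsupported."""

    @abstractmethod
    def locate(self, patient_id: str) -> List[str]:
        """Return identifiers of records held for the given patient.
        Expected behavior: an unknown patient yields an empty list,
        never an exception."""

    @abstractmethod
    def retrieve(self, record_id: str) -> PatientRecord:
        """Return the identified record.
        Expected behavior: an unknown record_id raises KeyError."""
```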

Semantic conformance, or information conformance, specifies the information content, construct, and formalism needed for interoperablity. By specifying information content, we can then make conformance assertions that relate to that information (and also test those assertions). For instance, if we indicate that a valid value range for a Systolic blood pressure reading cannot exceed 500, then we can verify that assertion by sending a value of 600 and expect to receive an error message indicating that a value constraint has been violated. Similar assertions can be made around terminology-based content, inter-concept relationships, and so on. In effect, this approach allows us tremendous flexibility to define and support information assertions.
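The systolic example above is directly testable. Below is a minimal sketch in Python (a hypothetical validator, not any real HL7 tooling) of how such a semantic conformance assertion might be exercised: the profile declares the valid range, and the conformance test submits an out-of-range value and expects a constraint violation.

```python
class ValueConstraintViolation(Exception):
    """Raised when a submitted value falls outside the profile's valid range."""


# Hypothetical semantic profile entry: valid range for a systolic BP reading.
SYSTOLIC_RANGE = (0, 500)


def validate_systolic(value: float) -> float:
    """Accept a systolic reading only if it satisfies the declared constraint."""
    low, high = SYSTOLIC_RANGE
    if not (low <= value <= high):
        raise ValueConstraintViolation(
            f"systolic value {value} outside permitted range {low}-{high}")
    return value


def test_out_of_range_systolic_is_rejected() -> bool:
    """Conformance test: a value of 600 must be rejected with an error."""
    try:
        validate_systolic(600)
    except ValueConstraintViolation:
        return True  # behavior matches the conformance assertion
    return False


assert test_out_of_range_systolic_is_rejected()
```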

Now we have the building blocks, but we're still missing something. For standards to be successful, they must provide for rigor but also be useful. Part of that utility stems from the ability to be flexible, extensible, and adaptable to a variety of needs and conditions, not all of which can be predetermined as the standard is being developed. The notion of conformance profiles is not new, but this is perhaps a bit of a different approach to them.

In order to engineer conformance, we must consider each of the interoperability perspectives independently, providing a mechanism for rigorous specification, supporting multiple use cases, and supporting extensibility within the specification, but without allowing so much ambiguity as to render conformance pointless. Let me assert that the following approach affords us that opportunity: rigorous conformance while supporting extensibility and flexibility. What would constitute such a profile? Consider the following:

A conformance profile is tagged with sufficient metadata so as to be expressible and discoverable. Each profile would have at a minimum an identity, a name, and a version, allowing a vendor or user to assert conformance against it and a consumer to look it up and verify it and its rules. A conformance profile comprises a functional profile and a semantic profile, and is used within an interoperability paradigm.

A functional profile identifies the subset of the functions within the standard that are supported by that profile. In other words, the standard itself must specify behaviors so as to clearly define what is permissible within its scope. Not every function is appropriate in the context of any given profile. Within a particular conformance profile, the functional profile will enumerate exactly which functions (or which portions of those functions) apply. A functional profile allows for significant customization or localization of a standard, but it does not technically extend the standard. In other words, one cannot add new functions not previously supported in a profile. Why? What would happen if we added a function foobar within a profile, and then tried to invoke it on another system? Nobody but us knows about foobar. That doesn't mean a vendor can't do it. It means that those added functions do not fulfill conformance assertions within the standard.

A semantic profile defines the universe of potential information content and the formalism for expression, and enumerates the specific information constructs that are supported within that profile. For example, a semantic profile may indicate that valid values are HL7 Version 3 RIM components, expressed in RIM terminologies. Semantic signifiers (also known as "templates", "archetypes", and by a host of other names) are the enumerated instances that are supported. For example, a Health Summary semantic profile may include semantic signifiers defining a Patient History, Patient Demographics, Vital Signs, Allergies, Medication List, and so on. Each of these signifiers would be formally and rigorously expressed, calling out where terminologies are used and what the valid values and value ranges contain.

An interoperability paradigm effectively allows one to express a context in which a conformance profile applies. For instance, an interoperability paradigm may include technical platform details, business agreements, pre-conditions (such as assumptions about patient consent, authorization, and authentication), and whatever other details are pertinent to the interoperability context. In effect, the interoperability paradigm is a generalization of the IHE use cases that drive their profiling activity.
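Pulling those three pieces together, here is one possible shape for such a profile, sketched as Python data classes. This is purely illustrative -- the field names are my own and the Health Summary signifiers come from the examples above, not from any published schema -- but it shows how a profile could carry discoverable metadata, enumerate its functional subset, list its semantic signifiers, and name the interoperability paradigm in which it applies.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SemanticSignifier:
    """A named, rigorously specified information construct
    (elsewhere called a template or archetype)."""
    name: str                       # e.g. "Vital Signs"
    value_constraints: Dict[str, tuple] = field(default_factory=dict)
    terminology_bindings: Dict[str, str] = field(default_factory=dict)


@dataclass
class ConformanceProfile:
    # Metadata: enough for a vendor to assert against and a consumer to look up.
    identity: str                   # e.g. a URI or OID
    name: str
    version: str
    # Functional profile: the subset of the standard's functions supported.
    # New functions cannot be added here; only the standard's own functions
    # (or portions of them) may be enumerated.
    supported_functions: List[str] = field(default_factory=list)
    # Semantic profile: the formalism plus the enumerated signifiers.
    information_formalism: str = "HL7 V3 RIM"
    signifiers: List[SemanticSignifier] = field(default_factory=list)
    # Interoperability paradigm: the context in which this profile applies,
    # e.g. platform details, business agreements, consent pre-conditions.
    interoperability_paradigm: Dict[str, str] = field(default_factory=dict)


# Hypothetical "Health Summary" profile built from the examples in the text.
health_summary = ConformanceProfile(
    identity="urn:example:profiles:health-summary",
    name="Health Summary",
    version="1.0",
    supported_functions=["locate", "retrieve"],
    signifiers=[
        SemanticSignifier("Patient Demographics"),
        SemanticSignifier("Vital Signs",
                          value_constraints={"systolic_bp": (0, 500)}),
        SemanticSignifier("Allergies"),
        SemanticSignifier("Medication List"),
    ],
    interoperability_paradigm={"pre_condition": "patient consent on file"},
)
```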

What have we learned?

The analysis that led to the above concepts has yielded a number of interesting and counterintuitive discoveries. This journey began when we discussed what conformance meant for a generic "Retrieve, Locate, Update" service. What we found, very quickly, was that making conformance assertions about something so generic was effectively meaningless, and did not provide value when asserting, testing, or proving interoperability. The means of expression was simply too primitive. As the thinking evolved and the understanding with it, we came to the realization that the above building blocks provided tremendous flexibility and specificity. Within the context of any given profile, things could be strongly, formally, and rigorously defined. Underpinning the specification, however, was a core set of concepts that were more generally applicable. By incorporating a profiling mechanism as part of the base specification and including the dimensions above, new communities and use cases could be easily served within the core specification suite without compromising rigor.

Roles still exist and are desperately needed to identify profiles, to formalize them, and ultimately to test them and implementations' conformance to them. Between professional societies, standards groups, regulatory bodies, IHE, and other organizations, we believe these tools can be applied to address a significant number of challenges not yet thought of. This approach is not a silver bullet, but it provides a depth desperately needed to understand and address many of healthcare's challenges.

Until next time...

- Ken

Thursday, January 18, 2007

Creating a Culture of Success

The Healthcare Services Specification Project (HSSP -- http://hssp.wikispaces.com ) celebrated its unofficial one-year anniversary last week. One year ago, the HL7 side of the HSSP standards collaboration was formally chartered. We have learned a bunch of things during the course of the year, and made our fair share of mistakes.

Ultimately, if HSSP is to achieve its goal of creating viable healthcare SOA standards that realize marketplace acceptance, the group creating them must be successful. The energy around this activity has been growing throughout the year, and more activities are in the "channel" than we ever imagined would be: presently six active HL7 functional model specifications under development, and two OMG-issued RFPs to industry soliciting technical specifications based upon already-balloted HL7 work.

The following are some of the cultural lessons-learned that I have observed within the group during the past year. I do not believe that we're doing everything right by any stretch of the imagination. That said, I think we are doing a lot of things right.

1) Don't be afraid to fail. The adage "fail early and often" couldn't be truer. We've made a bunch of mistakes along the way. That said, HSSP as a group has been pretty fearless. Ultimately, people are very forgiving. We screwed up getting our communications channels right (mixing two mail lists was a horrific failure). We started a little too ambitiously. We didn't have sufficient materials out there for folks to learn and understand what we were up to. The practical guidance for developers trying to solve business problems was lacking.

But we learned. We learned and improved. Problems were corrected. We persevered, and we listened. That listening has made us a better group, and we've done a good job of documenting what we do, improving as we go, and not making the same mistakes twice.

2) Keep the pressure on. Deadlines themselves are good things, not bad things. When we started, we tried to keep "uncomfortably aggressive" timelines: nothing so ridiculous that it wasn't plausible, but nothing that would be "comfortable" either. The result is that HSSP continued to push and get things done. Every deadline slipped. That said, I don't believe any have slipped more than one calendar quarter. Not bad for a project with zero budget and zero dedicated resources.

3) Document your process, and seek continuous improvement. HSSP follows a formally documented process. That process started out with a lot of holes, and still has plenty, though many have been filled in. Inherent in the process, the group, and the culture is an ingrained attitude of doing things better, capturing those lessons, documenting them, and then following our process.

Without a documented process, activities cannot be repeated. Perhaps most importantly, having the process documented means that newcomers can understand what we are doing. A colleague and HL7 attendee whom I very much appreciate and respect said of HSSP, "This is [one of] the only groups where you can come into the room and understand what they are doing." That is the byproduct of having process rigor and operating in the daylight.

4) Let subgroups work autonomously. HSSP created a model where each subgroup must conform to the overall methodology and processes of the project, but works autonomously. This autonomy allows each effort to establish its own dates, develop its work, meet and discuss, and ultimately custom-tailor to its needs. That said, the overall process ensures sufficient public vetting, quality assurance, and confidence that the work is consistent with the scope of HSSP activities and aligned with parallel work streams.

The upshot is that an increasing number of new work threads are emerging with new communities taking on the responsibilities. HSSP has been able to scale-out and actively collaborate with many HL7 committees, and is even entertaining discussions with newly-interested external standards bodies.

Perhaps most important, this autonomy has created diverse leadership opportunities, where anyone willing to do the heavy lifting can participate and realize business value and recognition for doing so.

5) Keep a relentless focus on business value. Many standards activities are about creating standards, and lose sight of the fact that standards must solve some problem to be useful. HSSP welcomes any point of view, but ultimately makes decisions based upon business value, resource contribution, and interest. By aligning standards development with people's day jobs, the notion of "volunteers" largely goes away: working on the standard is working on the day job.

HSSP since its inception has been constantly striving to connect these two worlds, and has been enjoying slow but steady growth as a result. The business world and the standards world do not need to be in conflict. Recognizing that business needs and interests are strong motivators, there are many work threads that are synergistic to both. Striving to remain in that space is hard, but rewarding.

6) Maintain a high-trust environment. I met Jim Demetriades (Biomedical Engineer and Healthcare Architect, Veterans Health Administration) many years ago and learned a lot from him. One of the most important lessons was the impact of a low-trust vs. a high-trust environment.

In a low-trust environment, we must take on activities ourselves, because if we don't, the original intent of our work, how it fits, its success, and its progress can be stymied by the world at large. Collaborating is largely risk-mitigation. Ultimately, ownership is of paramount importance, for it provides the enabler to achieve your objectives.

A high-trust environment is very different. It is predicated on the notion that others are focused on their own areas of interest, which may or may not intersect with yours. In a high-trust environment, you work in the sunlight and expect criticism and feedback that is constructive. Some is, and some isn't, but you learn from each data point along the way.

HSSP has created a culture that is, in my opinion, a high-trust environment. There can be (and are) many very heated, juicy arguments in a high-trust environment. That said, the arguments are always technically focused and about improving the work product, and they almost always end with the opponents heading off together to lunch or to a watering hole. When an environment is built on mutual respect, and each of us expects and demands this of the others, we can all be successful together and not at each other's expense.

7) Have fun. I think it was particularly fitting that HSSP concluded its business activities last week by passing a guiding principle that stated, very basically...

...No group would have more fun doing meaningful work than HSSP...

Hard work can be fun, and fun does not have to be irrelevant.



Until next time....

- Ken